CN115762108A - Remote control method, remote control device and controlled device - Google Patents

Remote control method, remote control device and controlled device

Info

Publication number
CN115762108A
CN115762108A (application CN202210248607.6A)
Authority
CN
China
Prior art keywords
remote control
screen
target position
information
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210248607.6A
Other languages
Chinese (zh)
Inventor
牛洋
任哲坡
汤琼
丁洪霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to PCT/CN2022/113924 (published as WO2023030067A1)
Publication of CN115762108A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08C: TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00: Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02: Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45: Cameras or camera modules comprising electronic image sensors; control thereof, for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Selective Calling Equipment (AREA)

Abstract

Embodiments of the present invention provide a remote control method, a remote control device, and a controlled device. In the technical solution provided in the embodiments, the controlled device includes a screen, and the method includes: receiving first information from a remote control device; determining, according to the first information, the target position on the screen at which the remote control device is pointing; and executing a first operation on the screen according to the target position. With the embodiments of the present invention, the controlled device can be operated remotely without the user mechanically and repeatedly pressing the five-way navigation keys, which improves the user experience.

Description

Remote control method, remote control device and controlled device
[ technical field ]
The present invention relates to the field of remote control technologies, and in particular, to a remote control method, a remote control device, and a controlled device.
[ background of the invention ]
Current remote control methods for a controlled device fall mainly into two categories: one uses a dedicated remote controller, and the other uses a mobile phone as the remote controller of the controlled device.
Existing remote controllers, whether infrared or Bluetooth, rely on five-way navigation keys. A user can quickly get used to operating the controlled device with such a remote controller, but compared with similar interactive products such as mobile phones and tablets, the operating experience is poor. With the trend toward large screens, more and more content is displayed on the screen, and a user who wants to quickly locate a desired item with only the basic five-way navigation keys usually has to press the keys mechanically and repeatedly. Using a mobile phone as a large-screen remote controller requires the phone to have an infrared function and to have the corresponding software installed. In addition, when a mobile phone is used as the remote controller of the controlled device, the distance and the infrared emission angle are limited, and the underlying problem of completing the interaction by switching the five-way navigation keys back and forth is not solved, so the experience during use remains poor.
Therefore, current remote control methods require the user to mechanically and repeatedly press the five-way navigation keys to operate a large screen remotely, which results in a poor user experience.
[ summary of the invention ]
In view of this, embodiments of the present invention provide a remote control method, a remote control apparatus, and a controlled device that enable a user to operate the controlled device remotely without mechanically and repeatedly pressing the five-way navigation keys, thereby improving the user experience.
In a first aspect, an embodiment of the present invention provides a remote control method, which is applied to a controlled device, where the controlled device includes a screen, and the method includes:
receiving first information of a remote control device;
determining a target position of the remote control equipment facing the screen according to the first information;
and executing a first operation on the screen according to the target position.
In one possible implementation, the first information includes position coordinates of the target position where the remote control device is facing the screen;
determining the target position of the remote control equipment facing the screen according to the first information, wherein the determining comprises the following steps:
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
In one possible implementation, the position coordinates of the target position are obtained from an image containing the outline of the screen taken by a binocular camera of the remote control device.
In one possible implementation, the first information includes an image including an outline of the screen captured by a binocular camera of the remote control apparatus;
the determining the target position of the remote control device facing the screen according to the first information includes: and determining the target position corresponding to the position coordinates on the screen according to the image.
In one possible implementation, the determining, according to the image, a target position on the screen corresponding to the position coordinate includes:
obtaining the position coordinate of the target position of the remote control equipment facing the screen according to the image;
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
In one possible implementation, the first operation includes: displaying a pointer at the target position; or highlighting or shading the control corresponding to the target position.
On the other hand, an embodiment of the present invention provides a remote control method, which is applied to a remote control device, and the method includes:
acquiring first information, where the first information is used for determining a target position, on a screen of a controlled device, at which the remote control device is pointing;
and sending the first information to the controlled equipment so that the controlled equipment determines the target position according to the first information and executes a first operation on the screen according to the target position.
In one possible implementation manner, the remote control device includes a binocular camera, and the first information includes position coordinates of the target position where the remote control device faces the screen;
acquiring first information, including:
when the remote control equipment points to the screen, shooting an image containing the outline of the screen through the binocular camera;
and obtaining, according to the image, the position coordinates of the target position on the screen at which the remote control device is pointing.
In one possible implementation, the remote control device includes a binocular camera, and the first information includes an image including an outline of the screen captured by the binocular camera;
acquiring first information, including:
and when the remote control equipment points to the screen, shooting the image through the binocular camera.
In one possible implementation, the first operation includes: displaying a pointer at the target position; or highlighting or shading the control corresponding to the target position.
In another aspect, an embodiment of the present invention provides a remote control apparatus, applied to a controlled device, where the controlled device includes a screen, and the apparatus includes:
the receiving module is used for receiving first information of the remote control equipment;
the determining module is used for determining the target position of the remote control equipment facing the screen according to the first information;
and the execution module is used for executing a first operation on the screen according to the target position.
In one possible implementation, the first information includes position coordinates of the target position where the remote control device is facing the screen;
the determining module is specifically configured to determine the target position corresponding to the position coordinate on the screen according to the position coordinate.
In one possible implementation, the position coordinates of the target position are obtained from an image containing the outline of the screen taken by a binocular camera of the remote control device.
In one possible implementation, the first information includes an image including an outline of the screen captured by a binocular camera of the remote control apparatus;
the determining module is specifically configured to determine the target position corresponding to the position coordinate on the screen according to the image.
In a possible implementation manner, the determining module is specifically configured to: obtain, according to the image, the position coordinates of the target position on the screen at which the remote control device is pointing, and determine, according to the position coordinates, the target position corresponding to the position coordinates on the screen.
In one possible implementation, the first operation includes: displaying a pointer at the target position; or highlighting or shading the control corresponding to the target position.
In another aspect, an embodiment of the present invention provides a remote control apparatus, which is applied to a remote control device, and the apparatus includes:
the remote control device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring first information which is used for determining the target position of the screen of the remote control device facing the controlled device;
and the sending module is used for sending the first information to the controlled equipment so that the controlled equipment determines the target position according to the first information and executes a first operation on the screen according to the target position.
In one possible implementation manner, the remote control device includes a binocular camera, and the first information includes position coordinates of the target position where the remote control device faces the screen;
the acquisition module is specifically configured to: when the remote control device points to the screen, capture, through the binocular camera, an image containing the outline of the screen, and obtain, according to the image, the position coordinates of the target position on the screen at which the remote control device is pointing.
In one possible implementation, the remote control device includes a binocular camera, and the first information includes an image including an outline of the screen captured by the binocular camera;
the acquisition module is specifically used for shooting the image through the binocular camera when the remote control equipment points to the screen.
In one possible implementation, the first operation includes: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
In another aspect, an embodiment of the present invention provides a controlled device, including a screen, a processor, and a memory, where the memory is used to store a computer program, and the computer program includes program instructions, and when the processor executes the program instructions, the controlled device is caused to perform the following steps:
receiving first information of a remote control device;
determining the target position of the remote control equipment facing the screen according to the first information;
and executing a first operation on the screen according to the target position.
In one possible implementation, when the processor executes the program instructions, the controlled device is caused to perform the following steps:
the first information comprises the position coordinates of the target position on the screen at which the remote control device is pointing;
determining the target position of the remote control equipment facing the screen according to the first information, wherein the determining comprises the following steps:
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
In one possible implementation, the position coordinates of the target position are obtained from an image containing the outline of the screen taken by a binocular camera of the remote control device.
In one possible implementation, when the processor executes the program instructions, the controlled device is caused to perform the following steps:
the first information includes an image including an outline of the screen photographed by a binocular camera of the remote control apparatus;
the determining the target position of the remote control device facing the screen according to the first information includes: and determining the target position corresponding to the position coordinates on the screen according to the image.
In one possible implementation, when the processor executes the program instructions, the controlled device is caused to perform the following steps:
the determining a target position on the screen corresponding to the position coordinates according to the image includes:
obtaining the position coordinate of the target position of the remote control equipment facing the screen according to the image;
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
In one possible implementation, the first operation includes: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
In another aspect, an embodiment of the present invention provides a remote control device, including a processor and a memory, where the memory is used to store a computer program, and the computer program includes program instructions, and when the processor executes the program instructions, the remote control device is caused to perform the following steps:
acquiring first information, wherein the first information is used for determining a target position of the remote control equipment facing a screen of controlled equipment;
and sending the first information to the controlled equipment so that the controlled equipment determines the target position according to the first information and executes a first operation on the screen according to the target position.
In one possible implementation manner, the remote control device includes a binocular camera, and the first information includes position coordinates of the target position where the remote control device faces the screen; when the processor executes the program instructions, causing the remote control device to perform the steps of:
acquiring first information, including:
when the remote control equipment points to the screen, shooting an image containing the outline of the screen through the binocular camera;
and obtaining, according to the image, the position coordinates of the target position on the screen at which the remote control device is pointing.
In one possible implementation, the remote control device includes a binocular camera, and the first information includes an image including an outline of the screen captured by the binocular camera; when the processor executes the program instructions, the remote control device is caused to perform the steps of:
acquiring first information, including:
and when the remote control equipment points to the screen, shooting the image through the binocular camera.
In one possible implementation, the first operation includes: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
In another aspect, the present invention provides a computer-readable storage medium that stores a computer program, where the computer program includes program instructions, and when the program instructions are executed by a computer, the computer is caused to perform the method described above.
In the technical solutions of the remote control method, the remote control device, and the controlled device provided in the embodiments of the present invention, the controlled device includes a screen, and the method includes: receiving first information from a remote control device; determining, according to the first information, the target position on the screen at which the remote control device is pointing; and executing a first operation on the screen according to the target position. With the embodiments of the present invention, the controlled device can be operated remotely without the user mechanically and repeatedly pressing the five-way navigation keys, which improves the user experience.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is an architecture diagram of a remote control system according to an embodiment of the present invention;
fig. 2 is an architecture diagram of a remote control system according to another embodiment of the present invention;
FIG. 3 is a schematic view of a pointing screen of the remote control device;
FIG. 4 is a schematic view of the outline of the screen itself and the outline information of the screen;
FIG. 5 is a schematic diagram of the workflow of the algorithm module;
fig. 6 is a flowchart of a remote control method according to an embodiment of the present invention;
FIG. 7 is a detailed flowchart of determining, in FIG. 6, the target position according to the first information;
FIG. 8 is a flowchart of a remote control method according to another embodiment of the present invention;
FIG. 9 is a detailed flowchart of obtaining, in FIG. 8, the first information used for determining the target position, on the screen of the controlled device, at which the remote control device is pointing;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
[ detailed description ]
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between related objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
A controlled device is typically provided with an open operating system, a chip, and an open application platform, so bidirectional human-machine interaction is possible and users can freely choose to watch any program they like in their spare time. Moreover, with the development of artificial intelligence technology, the controlled device can recommend programs that match the user's viewing habits and preferences, which greatly improves its entertainment value, and the controlled device has gradually become the most important interaction center for living-room entertainment. The way users interact with the controlled device, however, is determined by its remote control method.
Current remote control methods for a controlled device fall mainly into two categories: one uses a dedicated remote controller, and the other uses a mobile phone as the remote controller of the controlled device.
Existing remote controllers, whether infrared or Bluetooth, rely on five-way navigation keys. A user can quickly get used to operating the controlled device with such a remote controller, but compared with similar interactive products such as mobile phones and tablets, the operating experience is poor. With the trend toward large screens, more and more content is displayed on the screen, and a user who wants to quickly locate a desired item with only the basic five-way navigation keys usually has to press the keys mechanically and repeatedly. Using a mobile phone as a large-screen remote controller requires the phone to have an infrared function and to have the corresponding software installed. In addition, when a mobile phone is used as the remote controller of the controlled device, the distance and the infrared emission angle are limited, and the underlying problem of completing the interaction by switching the five-way navigation keys back and forth is not solved, so the experience during use remains poor.
Therefore, in existing remote control methods, the user has to mechanically and repeatedly press the five-way navigation keys to operate a large screen remotely, which results in a poor user experience.
In order to solve the above technical problem, embodiments of the present invention provide a remote control method, a remote control device, and a controlled device.
Referring to fig. 1, fig. 1 is an architecture diagram of a remote control system according to an embodiment of the present invention. The remote control system includes two electronic devices, a controlled device and a remote control device, which communicate with each other wirelessly. As shown in fig. 1, the remote control system includes a remote control device 100 and a controlled device 200. The remote control device 100 and the controlled device 200 may communicate by wireless transmission, for example, Bluetooth, infrared, a mobile network, or WLAN. The remote control device 100 includes a binocular camera and a sending module. The controlled device 200 includes a receiving module, an algorithm module, and a screen.
In fig. 1, the binocular camera of the remote control device 100 is used to capture an image of the screen of the controlled device 200, the image containing the outline of the screen, and to transmit the image to the sending module; the sending module is configured to send the first information to the controlled device 200 by wireless transmission, where the first information includes the image, captured by the binocular camera of the remote control device 100, that contains the outline of the screen. The receiving module of the controlled device 200 is configured to receive the first information and send it to the algorithm module. The algorithm module determines, based on the first information, the position coordinates of the target position on the screen at which the remote control device 100 is pointing; specifically, the algorithm module obtains the position coordinates from the image and transmits them to the screen. The screen is used for determining, according to the position coordinates, the target position corresponding to the position coordinates on the screen, and executing a first operation on the screen according to the target position.
Referring to fig. 2, fig. 2 is an architecture diagram of a remote control system according to another embodiment of the present invention. The remote control system includes two electronic devices, a controlled device and a remote control device, which communicate with each other wirelessly. As shown in fig. 2, the remote control system includes a remote control device 100 and a controlled device 200. The remote control device 100 and the controlled device 200 may communicate by wireless transmission, for example, Bluetooth, infrared, a mobile network, or WLAN. The remote control device 100 includes a binocular camera, an algorithm module, and a sending module. The controlled device 200 includes a receiving module and a screen.
In fig. 2, the binocular camera of the remote control device 100 is used to capture an image of the screen of the controlled device 200, the image containing the outline of the screen, and to transmit the image to the algorithm module. The algorithm module obtains, according to the image, the position coordinates of the target position on the screen at which the remote control device 100 is pointing, and sends the first information to the sending module, where the first information includes those position coordinates. The sending module is configured to send the first information to the controlled device 200 by wireless transmission. The receiving module of the controlled device 200 is configured to receive the first information and send it to the screen. The screen is used for determining, according to the first information, the target position at which the remote control device is pointing, and executing a first operation on the screen according to the target position; specifically, the screen determines, according to the position coordinates, the target position corresponding to the position coordinates on the screen, and executes the first operation accordingly.
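The patent does not fix a wire format for the first information. Purely as an illustration of how the two architectures split the work (in fig. 1 the remote sends the stereo images, in fig. 2 it sends pre-computed coordinates), the Python sketch below assumes a small JSON message carried over the wireless link; every field name is hypothetical.

```python
import base64
import json

def build_first_info_fig1(left_jpeg: bytes, right_jpeg: bytes) -> bytes:
    """Fig. 1 variant: the remote sends the two images containing the screen
    outline and the controlled device runs the algorithm module itself."""
    msg = {
        "type": "first_info",
        "payload": "stereo_images",  # hypothetical field names and values
        "left": base64.b64encode(left_jpeg).decode(),
        "right": base64.b64encode(right_jpeg).decode(),
    }
    return json.dumps(msg).encode()

def build_first_info_fig2(u: float, v: float) -> bytes:
    """Fig. 2 variant: the remote runs the algorithm module and sends only the
    position coordinates of the target position (here normalized to 0..1)."""
    msg = {"type": "first_info", "payload": "coords", "u": u, "v": v}
    return json.dumps(msg).encode()
```

Either message can then be carried over whatever wireless transport the devices share (Bluetooth, WLAN, and so on); the choice of transport does not affect the rest of the pipeline.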
In fig. 1 and fig. 2, the remote control device 100 includes a binocular camera. Specifically, as shown in fig. 3, the binocular camera is located at the front end of the remote control device 100 and can recognize the outline of the screen. Shooting the screen with the binocular camera yields two images containing the outline of the screen; as shown in fig. 4, the image on the left containing the outline of the screen is taken by the left-eye camera, and the image on the right containing the outline of the screen is taken by the right-eye camera.
For example, the binocular camera includes a binocular depth camera.
It should be noted that, as shown in fig. 3, when the remote control device 100 points at the screen, the line connecting the center point of the remote control device 100 and the target position on the screen is parallel to the two longer sides of the remote control device 100.
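The patent states that the binocular camera can recognize the outline of the screen but does not say how. One common approach, sketched below with OpenCV as an assumed dependency, is to look for the largest bright quadrilateral contour in each camera image; the function name, the Otsu thresholding choice, and the four-corner assumption are illustrative only.

```python
from typing import Optional

import cv2
import numpy as np

def find_screen_quad(gray: np.ndarray) -> Optional[np.ndarray]:
    """Return the 4 corner points of the largest quadrilateral contour in a
    grayscale image, assumed here to be the lit screen of the controlled device."""
    # The lit screen is usually much brighter than its surroundings.
    _, thresh = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
        area = cv2.contourArea(approx)
        if len(approx) == 4 and area > best_area:
            best, best_area = approx.reshape(4, 2), area
    return best  # None if no screen-like quadrilateral was found
```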
As shown in fig. 5, the algorithm module is specifically configured to: calibrate the binocular camera offline; remove the distortion introduced by the binocular camera by rectifying the two images of the screen; compute the matching points between the two rectified images through binocular matching to obtain a disparity map; and, from the disparity map, compute via the 3D coordinates the position coordinates of the point on the screen at which the remote control device 100 is pointing.
The purpose of calibrating the binocular camera offline is to align the two cameras by obtaining the intrinsic and extrinsic parameters of each. First, the left-eye camera is calibrated to obtain its intrinsic and extrinsic parameters; second, the right-eye camera is calibrated to obtain its intrinsic and extrinsic parameters; finally, the binocular camera is calibrated as a pair to obtain the translation and rotation between the left-eye camera and the right-eye camera. The intrinsic parameters include the focal length, the image center, the distortion coefficients, and the like; the extrinsic parameters include a rotation matrix and a translation matrix.
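As a concrete illustration of the offline calibration step, the sketch below follows the order described above (left-eye camera, right-eye camera, then the stereo pair) using OpenCV's chessboard-based calibration; the chessboard pattern, square size, and the use of OpenCV are assumptions, since the patent only states that intrinsic and extrinsic parameters are obtained.

```python
import cv2
import numpy as np

def calibrate_stereo(left_imgs, right_imgs, board=(9, 6), square=0.025):
    """Offline calibration from grayscale chessboard views: per-camera
    intrinsics, then the rotation R and translation T between the two cameras."""
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square
    obj_pts, pts_l, pts_r, size = [], [], [], None
    for img_l, img_r in zip(left_imgs, right_imgs):
        size = img_l.shape[::-1]  # (width, height) of a grayscale image
        ok_l, c_l = cv2.findChessboardCorners(img_l, board)
        ok_r, c_r = cv2.findChessboardCorners(img_r, board)
        if ok_l and ok_r:
            obj_pts.append(objp)
            pts_l.append(c_l)
            pts_r.append(c_r)
    # Intrinsics (focal length, image centre, distortion coefficients) per camera.
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, pts_l, size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, pts_r, size, None, None)
    # Extrinsics: rotation and translation of the right camera w.r.t. the left.
    _, K1, d1, K2, d2, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, pts_l, pts_r, K1, d1, K2, d2, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R, T
```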
Rectifying the two images of the screen ensures that corresponding points in the two images differ only in the X direction, which improves the accuracy of the disparity calculation. Specifically, rectification of the two images includes distortion correction and conversion to the standard (row-aligned) form.
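A minimal rectification sketch under the same OpenCV assumption: after undistortion and remapping, corresponding points in the two images lie on the same row, so they differ only in the X direction.

```python
import cv2

def rectify_pair(img_l, img_r, K1, d1, K2, d2, R, T):
    """Undistort and rectify both images into the standard row-aligned form."""
    size = (img_l.shape[1], img_l.shape[0])  # (width, height)
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, d1, K2, d2, size, R, T)
    map1l, map2l = cv2.initUndistortRectifyMap(K1, d1, R1, P1, size, cv2.CV_32FC1)
    map1r, map2r = cv2.initUndistortRectifyMap(K2, d2, R2, P2, size, cv2.CV_32FC1)
    rect_l = cv2.remap(img_l, map1l, map2l, cv2.INTER_LINEAR)
    rect_r = cv2.remap(img_r, map1r, map2r, cv2.INTER_LINEAR)
    return rect_l, rect_r, Q  # Q later reprojects disparities to 3D points
```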
Binocular matching is the core of binocular depth estimation; its main purpose is to compute the pixel-wise matching relationship between the two images. Specifically, binocular matching includes five steps: matching cost computation, cost aggregation, disparity computation, disparity optimization, and disparity refinement.
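The patent names the five classic matching stages but not a particular algorithm; the sketch below uses OpenCV's semi-global block matcher as one plausible choice, with purely illustrative parameter values.

```python
import cv2

def compute_disparity(rect_l, rect_r):
    """Semi-global block matching on a rectified grayscale pair; returns
    disparities in pixels as float32."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,   # search range, must be a multiple of 16
        blockSize=5,
        P1=8 * 5 * 5,         # smoothness penalty for small disparity changes
        P2=32 * 5 * 5,        # smoothness penalty for large disparity changes
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )
    # OpenCV returns fixed-point disparities scaled by 16.
    return matcher.compute(rect_l, rect_r).astype("float32") / 16.0
```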
Finally, the three-dimensional information of the screen in the image is reconstructed from the matching information in the disparity map according to the triangulation principle, and the position coordinates of the point on the screen at which the remote control device 100 is pointing are computed from the 3D coordinates.
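How the 3D coordinates yield a position on the screen is not spelled out here. One plausible reading, sketched below, reprojects the disparity map to 3D, takes the 3D positions of the four screen corners, and intersects the remote's pointing axis (assumed to coincide with the left camera's optical axis, consistent with the geometry of fig. 3) with the screen plane, returning the hit point as normalized screen coordinates. Everything beyond the use of the disparity map and triangulation is an assumption.

```python
import cv2
import numpy as np

def pointing_target(disparity, Q, corners_px):
    """Return normalized (u, v) screen coordinates hit by the pointing axis.

    corners_px: the 4 screen corners in the rectified left image, assumed to be
    ordered top-left, top-right, bottom-right, bottom-left (e.g. a reordered
    result of find_screen_quad). Disparity must be valid at those pixels.
    """
    pts3d = cv2.reprojectImageTo3D(disparity, Q)  # HxWx3, left-camera frame
    corners = np.array([pts3d[int(y), int(x)] for x, y in corners_px])
    tl, tr, br, bl = corners
    # Plane of the screen through its corners; the pointing ray is the +Z axis
    # of the left camera, starting at the camera origin (0, 0, 0).
    normal = np.cross(tr - tl, bl - tl)
    axis = np.array([0.0, 0.0, 1.0])
    t = np.dot(normal, tl) / np.dot(normal, axis)
    hit = t * axis  # 3D intersection of the pointing ray with the screen plane
    # Express the hit point in the screen's own 2D basis, 0..1 along each edge.
    u = np.dot(hit - tl, tr - tl) / np.dot(tr - tl, tr - tl)
    v = np.dot(hit - tl, bl - tl) / np.dot(bl - tl, bl - tl)
    return float(u), float(v)
```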
Wherein the first operation comprises: displaying a pointer at a target position; or, the control corresponding to the target position is highlighted or shaded.
It should be noted that when the user moves the remote control device 100, the image containing the screen outline acquired by the remote control device 100 changes, so the position coordinates of the point on the screen at which the remote control device 100 is pointing need to be recalculated. The controlled device 200 then refreshes the position of the pointer according to the new position coordinates, so the pointer follows the pointing direction of the remote control device; alternatively, the controlled device 200 refreshes which control is highlighted or shaded according to the new position coordinates.
Based on the architecture diagrams provided in fig. 1 and fig. 2, an embodiment of the present invention provides a remote control method applied to a controlled device 200, where the controlled device 200 includes a screen. Fig. 6 is a flowchart of a remote control method according to an embodiment of the present invention. As shown in fig. 6, the method includes:
Step 102, receiving first information from the remote control device.
The first information includes the position coordinates of the target position on the screen at which the remote control device is pointing when it is aimed at the screen; alternatively, the first information includes an image, captured by a binocular camera of the remote control device, that contains the outline of the screen.
In the embodiment of the present invention, as shown in fig. 1, the first information includes an image including an outline of the screen captured by a binocular camera of the remote control apparatus, and the receiving module of the controlled apparatus 200 receives the first information of the remote control apparatus 100 and then transmits the first information to the algorithm module.
In the embodiment of the present invention, as shown in fig. 2, the first information includes the position coordinates of the target position where the remote control device 100 is facing the screen, and the receiving module of the controlled device 200 receives the first information sent by the remote control device 100 and then sends the first information to the screen.
The remote control device includes a binocular camera, and the position coordinates of the target position are obtained from an image, captured by the binocular camera, that contains the outline of the screen. Specifically, the binocular camera is located at the front end of the remote control device and can recognize the outline of the screen; shooting the screen with the binocular camera yields two images containing the outline of the screen.
For example, the binocular camera includes a binocular depth camera.
Step 104, determining, according to the first information, the target position on the screen at which the remote control device is pointing.
As an alternative, the first information includes the position coordinates of the target position on the screen at which the remote control device is pointing; in this case, step 104 includes: determining, according to the position coordinates, the target position corresponding to the position coordinates on the screen.
As shown in fig. 2, the receiving module of the controlled device 200 transmits the first information to the screen, and the screen determines a target position corresponding to the position coordinates according to the position coordinates in the first information.
As another alternative, the first information includes an image, captured by a binocular camera of the remote control device, that contains the outline of the screen; in this case, step 104 includes: determining, according to the image, the target position corresponding to the position coordinates on the screen.
Specifically, as shown in fig. 7, step 104 includes:
Step 1042, obtaining, according to the image, the position coordinates of the target position on the screen at which the remote control device is pointing.
As shown in fig. 1, the algorithm module of the controlled device 200 obtains the position coordinates of the target position where the remote control device 100 is facing the screen from the image, and then transmits the position coordinates to the screen.
Specifically, the algorithm module calibrates the binocular camera offline; removes the distortion introduced by the binocular camera by rectifying the two images of the screen; computes the matching points between the two rectified images through binocular matching to obtain a disparity map; and, from the disparity map, computes via the 3D coordinates the position coordinates of the point on the screen at which the remote control device 100 is pointing.
Step 1044, determining, according to the position coordinates, the target position corresponding to the position coordinates on the screen.
As shown in fig. 1, the algorithm module of the controlled device 200 transmits the position coordinates to the screen, and the screen determines, according to the position coordinates in the first information, the target position corresponding to the position coordinates on the screen.
Step 106, executing a first operation on the screen according to the target position.
As shown in fig. 1 and fig. 2, the screen of the controlled device 200 performs the first operation at the target position on the screen corresponding to the position coordinates.
Wherein the first operation comprises: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
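A sketch of how the controlled device might carry out the first operation, assuming the target position arrives as normalized (u, v) coordinates and the user interface exposes a flat list of controls with pixel bounding boxes; all class and function names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    x: int
    y: int
    w: int
    h: int  # bounding box of the control in screen pixels

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def perform_first_operation(u, v, screen_w, screen_h, controls, use_pointer=True):
    """Either report where to draw the pointer or pick the control to highlight."""
    px, py = int(u * screen_w), int(v * screen_h)
    if use_pointer:
        return ("draw_pointer", px, py)       # display a pointer at the target
    for ctrl in controls:
        if ctrl.contains(px, py):
            return ("highlight", ctrl.name)   # highlight or shade that control
    return ("none", px, py)                   # target does not hit any control
```

For example, perform_first_operation(0.5, 0.5, 1920, 1080, controls, use_pointer=False) would highlight whichever control covers the centre of a 1080p screen, while the default mode simply reports the pointer position (960, 540).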
It should be noted that when the user moves the remote control device, the image acquired by the remote control device changes, so the position coordinates need to be recalculated. The controlled device refreshes the pointer position according to the new position coordinates, so the pointer follows the pointing direction of the remote control device; alternatively, the controlled device refreshes which control is highlighted or shaded according to the new position coordinates.
In the technical scheme of the remote control method provided by this embodiment of the present invention, first information from a remote control device is received; the target position on the screen at which the remote control device is pointing is determined according to the first information; and a first operation is executed on the screen according to the target position. With this embodiment of the present invention, the controlled device can be operated remotely without the user mechanically and repeatedly pressing the five-way navigation keys, which improves the user experience.
Based on the architecture diagrams provided in fig. 1 and fig. 2, an embodiment of the present invention provides a remote control method applied to a remote control device 100. Fig. 8 is a flowchart of a remote control method according to another embodiment of the present invention. As shown in fig. 8, the method includes:
Step 202, obtaining first information, where the first information is used for determining the target position, on the screen of the controlled device, at which the remote control device is pointing.
As an alternative, the remote control device includes a binocular camera, and the first information includes the position coordinates of the target position on the screen at which the remote control device is pointing; as shown in fig. 9, step 202 includes:
step 2022, when the remote control device points to the screen, shooting an image containing the outline of the screen through the binocular camera.
The binocular camera is located at the front end of the remote control device and can identify the outline of the screen, and the screen is shot through the binocular camera to obtain two images containing the outline of the screen.
For example, the binocular camera includes a binocular depth camera.
In the embodiment of the present invention, as shown in fig. 2, the remote control device 100 takes an image including the outline of the screen by a binocular camera and transmits the image to the algorithm module.
Step 2024, obtaining, according to the image, the position coordinates of the target position on the screen at which the remote control device is pointing.
In the embodiment of the present invention, as shown in fig. 2, the remote control device 100 captures an image containing the outline of the screen with the binocular camera and transmits the image to the algorithm module. The algorithm module determines, according to the image, the position coordinates of the target position on the screen at which the remote control device is pointing and sends the position coordinates to the sending module. Specifically, the algorithm module calibrates the binocular camera offline; removes the distortion introduced by the binocular camera by rectifying the two images of the screen; computes the matching points between the two rectified images through binocular matching to obtain a disparity map; and, from the disparity map, computes via the 3D coordinates the position coordinates of the point on the screen at which the remote control device 100 is pointing. The sending module generates the first information according to the position coordinates.
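Putting the remote-side steps of fig. 2 together, the sketch below captures one frame from each camera, reuses the helper functions sketched earlier (find_screen_quad, rectify_pair, compute_disparity, pointing_target, build_first_info_fig2), and hands the resulting coordinates to the sending module; the camera indices and the send callable are assumptions.

```python
import cv2

def acquire_and_send_first_info(calib, send):
    """Fig. 2 variant: the remote computes the coordinates itself.

    calib: (K1, d1, K2, d2, R, T) as returned by calibrate_stereo().
    send:  callable that forwards the first information to the controlled
           device over the wireless link (transport not specified here).
    """
    cap_l, cap_r = cv2.VideoCapture(0), cv2.VideoCapture(1)  # assumed indices
    ok_l, img_l = cap_l.read()
    ok_r, img_r = cap_r.read()
    cap_l.release()
    cap_r.release()
    if not (ok_l and ok_r):
        return  # no frames: nothing to send
    gray_l = cv2.cvtColor(img_l, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(img_r, cv2.COLOR_BGR2GRAY)
    rect_l, rect_r, Q = rectify_pair(gray_l, gray_r, *calib)
    corners = find_screen_quad(rect_l)  # assumed already in TL, TR, BR, BL order
    if corners is None:
        return  # the screen is not in view, so there is no target position
    disparity = compute_disparity(rect_l, rect_r)
    u, v = pointing_target(disparity, Q, corners)
    send(build_first_info_fig2(u, v))  # the first information leaves the remote
```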
As another alternative, the remote control device includes a binocular camera, the first information includes two images including outlines of the screen taken by the binocular camera, and step 202 includes: when the remote control device points at the screen, images are shot through the binocular camera.
In the embodiment of the present invention, as shown in fig. 1, the remote control device 100 captures an image containing the outline of the screen with the binocular camera and transmits the image to the sending module, and the sending module generates the first information according to the image.
Step 204, sending the first information to the controlled device, so that the controlled device determines the target position according to the first information and executes a first operation on the screen according to the target position.
As shown in fig. 1, the first information includes the image, captured by the binocular camera of the remote control device 100, that contains the outline of the screen. The sending module of the remote control device 100 sends the first information to the receiving module of the controlled device 200. The receiving module receives the first information and sends it to the algorithm module. The algorithm module determines, according to the first information, the position coordinates of the target position on the screen at which the remote control device 100 is pointing, and transmits the position coordinates to the screen. Specifically, the algorithm module calibrates the binocular camera offline; removes the distortion introduced by the binocular camera by rectifying the two images of the screen; computes the matching points between the two rectified images through binocular matching to obtain a disparity map; and, from the disparity map, computes via the 3D coordinates the position coordinates of the point on the screen at which the remote control device 100 is pointing. The screen determines, according to the position coordinates, the target position corresponding to the position coordinates on the screen, and executes a first operation on the screen according to the target position.
As shown in fig. 2, the first information includes the position coordinates of the target position on the screen at which the remote control device 100 is pointing. The sending module of the remote control device 100 sends the first information to the receiving module of the controlled device 200. The receiving module receives the first information and sends it to the screen. The screen determines, according to the position coordinates, the target position corresponding to the position coordinates on the screen, and executes a first operation on the screen according to the target position.
Wherein the first operation comprises: displaying a pointer at the target location; or the control corresponding to the target position is highlighted or shaded.
It should be noted that when the user moves the remote control device, the image of the screen acquired by the remote control device changes, so the position coordinates need to be recalculated. The controlled device refreshes the position of the pointer according to the new position coordinates, so the pointer follows the pointing direction of the remote control device; alternatively, the controlled device refreshes which control is highlighted or shaded according to the new position coordinates.
In the technical scheme of the remote control method provided by this embodiment of the present invention, the remote control device sends first information to the controlled device, so that the controlled device determines, according to the first information, the target position on the screen at which the remote control device is pointing and executes a first operation on the screen according to the target position. With this embodiment of the present invention, the controlled device can be operated remotely without the user mechanically and repeatedly pressing the five-way navigation keys, which improves the user experience.
The remote control method provided by the embodiments of the present invention has been described in detail above with reference to fig. 1 to 9; the apparatus embodiments of the present invention are described in detail below with reference to fig. 10. It should be understood that the electronic device in the embodiments of the present invention may execute the various methods in the foregoing embodiments; for the specific working processes of the products described below, reference may be made to the corresponding processes in the foregoing method embodiments.
The embodiment of the invention provides electronic equipment, which can be terminal equipment or circuit equipment arranged in the terminal equipment. The electronic device may be adapted to perform the functions/steps of the above-described method embodiments.
Fig. 10 is a schematic structural diagram of an electronic device 300 according to an embodiment of the present invention. The electronic device 300 may include a processor 310, an external memory interface 320, an internal memory 321, a Universal Serial Bus (USB) interface 330, a charging management module 340, a power management module 341, a battery 342, an antenna 1, an antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone interface 370D, a sensor module 380, a button 390, a motor 391, an indicator 392, a camera 393, a display screen 394, a Subscriber Identification Module (SIM) card interface 395, and the like. The sensor module 380 may include a pressure sensor 380A, a gyroscope sensor 380B, an air pressure sensor 380C, a magnetic sensor 380D, an acceleration sensor 380E, a distance sensor 380F, a proximity light sensor 380G, a fingerprint sensor 380H, a temperature sensor 380J, a touch sensor 380K, an ambient light sensor 380L, a bone conduction sensor 380M, and the like.
It is to be understood that the illustrated structure of the embodiment of the invention is not intended to limit the electronic device 300. In other embodiments of the invention, the electronic device 300 may include more or fewer components than illustrated, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 310 may include one or more processing units, such as: the processor 310 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 310 for storing instructions and data. In some embodiments, the memory in the processor 310 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 310. If the processor 310 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 310, thereby increasing the efficiency of the system.
In some embodiments, processor 310 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 310 may include multiple sets of I2C buses. Processor 310 may be coupled to touch sensor 380K, a charger, a flash, a camera 393, etc., via different I2C bus interfaces. For example: the processor 310 may be coupled to the touch sensor 380K through an I2C interface, so that the processor 310 and the touch sensor 380K communicate through an I2C bus interface to implement a touch function of the electronic device 300.
The I2S interface may be used for audio communication. In some embodiments, the processor 310 may include multiple sets of I2S buses. The processor 310 may be coupled to the audio module 370 through an I2S bus, enabling communication between the processor 310 and the audio module 370. In some embodiments, the audio module 370 may transmit the audio signal to the wireless communication module 360 through an I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 370 and the wireless communication module 360 may be coupled by a PCM bus interface. In some embodiments, the audio module 370 may also transmit audio signals to the wireless communication module 360 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 310 with the wireless communication module 360. For example: the processor 310 communicates with the bluetooth module in the wireless communication module 360 through the UART interface to implement the bluetooth function. In some embodiments, the audio module 370 may transmit the audio signal to the wireless communication module 360 through a UART interface, so as to implement the function of playing music through a bluetooth headset.
The MIPI interface may be used to connect processor 310 with peripheral devices such as display 394, camera 393, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 310 and camera 393 communicate over a CSI interface to implement the capture functionality of electronic device 300. The processor 310 and the display screen 394 communicate via the DSI interface to implement the display functions of the electronic device 300.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 310 with the camera 393, the display 394, the wireless communication module 360, the audio module 370, the sensor module 380, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
The USB interface 330 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 330 may be used to connect a charger to charge the electronic device 300, and may also be used to transmit data between the electronic device 300 and peripheral devices. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 300. In other embodiments of the present invention, the electronic device 300 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 340 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 340 may receive charging input from a wired charger via the USB interface 330. In some wireless charging embodiments, the charging management module 340 may receive a wireless charging input through a wireless charging coil of the electronic device 300. The charging management module 340 may also supply power to the electronic device through the power management module 341 while charging the battery 342.
The power management module 341 is configured to connect the battery 342, the charging management module 340 and the processor 310. The power management module 341 receives input from the battery 342 and/or the charge management module 340 and provides power to the processor 310, the internal memory 321, the display 394, the camera 393, and the wireless communication module 360. The power management module 341 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery state of health (leakage, impedance). In other embodiments, the power management module 341 may also be disposed in the processor 310. In other embodiments, the power management module 341 and the charging management module 340 may be disposed in the same device.
The wireless communication function of the electronic device 300 may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 300 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 350 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 300. The mobile communication module 350 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 350 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the filtered electromagnetic wave to the modem processor for demodulation. The mobile communication module 350 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be provided in the processor 310. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be disposed in the same device as at least some of the modules of the processor 310.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 370A, the receiver 370B, etc.) or displays images or video through the display 394. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 310, and may be disposed in the same device as the mobile communication module 350 or other functional modules.
The wireless communication module 360 may provide solutions for wireless communication applied to the electronic device 300, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
The wireless communication module 360 may be one or more devices integrating at least one communication processing module. The wireless communication module 360 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 310.
The wireless communication module 360 may also receive a signal to be transmitted from the processor 310, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 300 is coupled to the mobile communication module 350 and antenna 2 is coupled to the wireless communication module 360, so that the electronic device 300 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 300 implements display functions via the GPU, the display 394, and the application processor, among other things. The GPU is an image processing microprocessor coupled to a display 394 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 310 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 394 is used to display images, videos, and the like. The display screen 394 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 300 may include 1 or N display screens 394, N being a positive integer greater than 1.
The electronic device 300 may implement a shooting function through the ISP, the camera 393, the video codec, the GPU, the display 394, the application processor, and the like.
The ISP is used to process data fed back by the camera 393. For example, when a photo is taken, the shutter is opened and light is transmitted to the camera's photosensitive element through the lens; the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP may also perform algorithmic optimization on image noise, brightness, and skin tone, and may optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be located in the camera 393.
The camera 393 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 300 may include 1 or N cameras 393, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 300 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
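As a rough illustration of this kind of frequency-bin analysis (the patent does not name a specific algorithm, so the NumPy sketch below is only an assumption), a discrete Fourier transform exposes the energy carried by each frequency bin of a sampled signal:

```python
import numpy as np

def frequency_bin_energy(samples, sample_rate):
    """Return (bin_center_frequency_hz, energy) pairs for a real-valued signal."""
    spectrum = np.fft.rfft(samples)                        # one-sided FFT of the real signal
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    energy = np.abs(spectrum) ** 2                         # energy per frequency bin
    return list(zip(freqs, energy))

# A 1 kHz tone sampled at 16 kHz concentrates its energy in the 1 kHz bin.
t = np.arange(0, 0.1, 1 / 16000)
tone = np.sin(2 * np.pi * 1000 * t)
peak_bin = max(frequency_bin_energy(tone, 16000), key=lambda fe: fe[1])[0]
print(round(peak_bin))  # 1000
```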
Video codecs are used to compress or decompress digital video. The electronic device 300 may support one or more video codecs. In this way, the electronic device 300 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information rapidly and can also learn continuously by itself. The NPU enables intelligent-cognition applications of the electronic device 300, such as image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 320 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 300. The external memory card communicates with the processor 310 through the external memory interface 320 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 321 may be used to store computer-executable program code, which includes instructions. The internal memory 321 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, phone book, etc.) created during use of the electronic device 300, and the like. In addition, the internal memory 321 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like. The processor 310 executes various functional applications of the electronic device 300 and data processing by executing instructions stored in the internal memory 321 and/or instructions stored in a memory provided in the processor.
The electronic device 300 may implement audio functions through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the earphone interface 370D, and the application processor. Such as music playing, recording, etc.
The audio module 370 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 370 may also be used to encode and decode audio signals. In some embodiments, the audio module 370 may be disposed in the processor 310, or some functional modules of the audio module 370 may be disposed in the processor 310.
The speaker 370A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device 300 can play music or carry a hands-free call through the speaker 370A.
The receiver 370B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic device 300 receives a call or voice information, it is possible to receive voice by placing the receiver 370B close to the human ear.
The microphone 370C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 370C by speaking close to it. The electronic device 300 may be provided with at least one microphone 370C. In other embodiments, the electronic device 300 may be provided with two microphones 370C, which, in addition to collecting sound signals, can implement a noise reduction function. In other embodiments, the electronic device 300 may further be provided with three, four, or more microphones 370C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The earphone interface 370D is used to connect a wired earphone. The earphone interface 370D may be the USB interface 330, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 380A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 380A may be disposed on the display screen 394.
The pressure sensor 380A can be of many types, such as a resistive pressure sensor, an inductive pressure sensor, or a capacitive pressure sensor. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 380A, the capacitance between the electrodes changes, and the electronic device 300 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 394, the electronic device 300 detects the intensity of the touch operation through the pressure sensor 380A and may also calculate the touched position from its detection signal. In some embodiments, touch operations that are applied to the same touch position but have different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view the SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new SMS message is executed.
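A minimal sketch of this threshold-based dispatch, assuming a normalized pressure scale and an invented first pressure threshold (neither value comes from the patent):

```python
FIRST_PRESSURE_THRESHOLD = 0.5   # hypothetical normalized threshold

def handle_sms_icon_touch(touch_intensity):
    """Dispatch a touch on the SMS application icon by its detected intensity."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"          # light press: open the message list
    return "compose_new_sms"       # firm press: create a new message

assert handle_sms_icon_touch(0.2) == "view_sms"
assert handle_sms_icon_touch(0.9) == "compose_new_sms"
```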
The gyro sensor 380B may be used to determine the motion posture of the electronic device 300. In some embodiments, the angular velocity of the electronic device 300 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 380B. The gyro sensor 380B may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyro sensor 380B detects the shake angle of the electronic device 300, calculates the distance that the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 300 through a reverse movement, thereby achieving anti-shake. The gyro sensor 380B may also be used in navigation and somatosensory gaming scenarios.
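Purely for illustration, a common way to turn a detected shake angle into a lens compensation distance is the small-angle relation displacement = f * tan(theta); the patent does not disclose the actual anti-shake computation, so the sketch below is an assumption:

```python
import math

def lens_compensation_mm(shake_angle_deg, focal_length_mm):
    """Image displacement to cancel for a given angular shake (small-angle model)."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))

# A 0.5 degree shake with a 4 mm lens needs roughly 0.035 mm of reverse movement.
print(round(lens_compensation_mm(0.5, 4.0), 3))
```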
The air pressure sensor 380C is used to measure air pressure. In some embodiments, electronic device 300 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 380C.
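A worked example of this pressure-to-altitude conversion using the standard-atmosphere approximation h = 44330 * (1 - (P/P0)^(1/5.255)); the formula actually used by the device is not stated, so this is only illustrative:

```python
def altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Standard-atmosphere estimate: h = 44330 * (1 - (P / P0) ** (1 / 5.255))."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# A reading of 1000 hPa corresponds to roughly 111 m above sea level.
print(round(altitude_m(1000.0)))
```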
The magnetic sensor 380D includes a Hall sensor. The electronic device 300 may detect the opening and closing of a flip holster using the magnetic sensor 380D. In some embodiments, when the electronic device 300 is a flip phone, the electronic device 300 may detect the opening and closing of the flip according to the magnetic sensor 380D. Features such as automatic unlocking upon opening the flip can then be configured according to the detected open or closed state of the holster or of the flip.
The acceleration sensor 380E may detect the magnitude of acceleration of the electronic device 300 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 300 is stationary. It may also be used to recognize the posture of the electronic device, in applications such as landscape/portrait switching and pedometers.
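A minimal sketch of the landscape/portrait decision, assuming that the gravity components along the device's x and y axes are available (the real switching logic is not given in the patent):

```python
def screen_orientation(accel_x, accel_y):
    """Choose portrait or landscape from the gravity components (m/s^2) along x and y."""
    return "portrait" if abs(accel_y) >= abs(accel_x) else "landscape"

assert screen_orientation(0.1, 9.7) == "portrait"    # device held upright
assert screen_orientation(9.7, 0.1) == "landscape"   # device lying on its side
```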
The distance sensor 380F is used to measure distance. The electronic device 300 may measure distance by infrared or laser. In some shooting scenarios, the electronic device 300 may use the distance sensor 380F to measure distance for fast focusing.
The proximity light sensor 380G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 300 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 300; when insufficient reflected light is detected, the electronic device 300 may determine that there is no object nearby. The electronic device 300 can use the proximity light sensor 380G to detect that the user is holding the electronic device 300 close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 380G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
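The reflected-light decision can be sketched as a simple threshold test; the threshold value and function names below are assumptions rather than values from the patent:

```python
REFLECTION_THRESHOLD = 0.6   # hypothetical calibrated level

def object_nearby(reflected_ir_level):
    """True when enough reflected infrared light is detected by the photodiode."""
    return reflected_ir_level >= REFLECTION_THRESHOLD

def should_turn_off_screen(in_call, reflected_ir_level):
    # The screen is switched off only while the user is in a call and an object
    # (typically the ear) is detected close to the device.
    return in_call and object_nearby(reflected_ir_level)

assert should_turn_off_screen(True, 0.8) is True
assert should_turn_off_screen(False, 0.8) is False
```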
The ambient light sensor 380L is used to sense the ambient light level. The electronic device 300 may adaptively adjust the brightness of the display 394 based on the perceived ambient light level. The ambient light sensor 380L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 380L may also cooperate with the proximity light sensor 380G to detect whether the electronic device 300 is in a pocket to prevent inadvertent contact.
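As a hedged sketch of this adaptive-brightness behavior, a simple linear mapping from sensed illuminance to a backlight level could look as follows; the calibration points are invented for illustration and the device's actual brightness curve is not disclosed:

```python
def display_brightness(ambient_lux, min_level=10, max_level=255):
    """Linear ramp from ambient illuminance to a backlight level (invented calibration)."""
    dark_lux, bright_lux = 10.0, 1000.0
    if ambient_lux <= dark_lux:
        return min_level
    if ambient_lux >= bright_lux:
        return max_level
    ratio = (ambient_lux - dark_lux) / (bright_lux - dark_lux)
    return int(min_level + ratio * (max_level - min_level))

print(display_brightness(5.0), display_brightness(500.0), display_brightness(2000.0))
```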
The fingerprint sensor 380H is used to capture a fingerprint. The electronic device 300 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The temperature sensor 380J is used to detect temperature. In some embodiments, the electronic device 300 implements a temperature processing strategy using the temperature detected by the temperature sensor 380J. For example, when the temperature reported by the temperature sensor 380J exceeds a threshold, the electronic device 300 reduces the performance of a processor located near the temperature sensor 380J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 300 heats the battery 342 to avoid an abnormal shutdown caused by the low temperature. In other embodiments, when the temperature is below a further threshold, the electronic device 300 boosts the output voltage of the battery 342 to avoid an abnormal shutdown caused by the low temperature.
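A minimal sketch of the three-threshold thermal strategy described above; the threshold values and action names are illustrative assumptions:

```python
THROTTLE_ABOVE_C = 45.0        # illustrative thresholds, not values from the patent
HEAT_BATTERY_BELOW_C = 0.0
BOOST_VOLTAGE_BELOW_C = -10.0

def thermal_actions(temperature_c):
    """Actions taken for a given reading of temperature sensor 380J."""
    actions = []
    if temperature_c > THROTTLE_ABOVE_C:
        actions.append("reduce_nearby_processor_performance")
    if temperature_c < HEAT_BATTERY_BELOW_C:
        actions.append("heat_battery")
    if temperature_c < BOOST_VOLTAGE_BELOW_C:
        actions.append("boost_battery_output_voltage")
    return actions

assert thermal_actions(50.0) == ["reduce_nearby_processor_performance"]
assert thermal_actions(-12.0) == ["heat_battery", "boost_battery_output_voltage"]
```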
The touch sensor 380K is also referred to as a "touch device". The touch sensor 380K may be disposed on the display screen 394, and together they form a touchscreen. The touch sensor 380K is used to detect a touch operation applied on or near it, and can pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 394. In other embodiments, the touch sensor 380K may be disposed on a surface of the electronic device 300 at a location different from that of the display screen 394.
The bone conduction sensor 380M can acquire a vibration signal. In some embodiments, the bone conduction sensor 380M can acquire the vibration signal of the bone block that vibrates when a person speaks. The bone conduction sensor 380M may also contact the human pulse to receive a blood-pressure pulsation signal. In some embodiments, the bone conduction sensor 380M may also be disposed in a headset to form a bone conduction headset. The audio module 370 may parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 380M, of the bone block vibrated by the vocal part, so as to implement a voice function. The application processor may parse out heart rate information based on the blood-pressure pulsation signal acquired by the bone conduction sensor 380M, so as to implement a heart rate detection function.
The keys 390 include a power-on key, a volume key, and the like. The keys 390 may be mechanical keys or touch keys. The electronic device 300 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 300.
The motor 391 may generate a vibration cue. The motor 391 may be used for incoming-call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 394 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may further support customization.
Indicator 392 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 395 is used to connect a SIM card. A SIM card can be brought into contact with or separated from the electronic device 300 by being inserted into or pulled out of the SIM card interface 395. The electronic device 300 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 395 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 395 at the same time; the cards may be of the same type or of different types. The SIM card interface 395 may also be compatible with different types of SIM cards and with an external memory card. The electronic device 300 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 300 employs an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 300 and cannot be separated from it.
Embodiments of the present invention provide a computer-readable storage medium, which stores instructions that, when executed on a terminal device, cause the terminal device to perform the functions/steps in the above method embodiments.
Embodiments of the present invention also provide a computer program product comprising instructions that, when the computer program product runs on a computer or on at least one processor, cause the computer to perform the functions/steps in the above method embodiments.
In the embodiments of the present invention, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may mean that A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "At least one of the following" and similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, and c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be singular or plural.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided by the present invention, any function, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present invention, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (30)

1. A remote control method applied to a controlled device including a screen, the method comprising:
receiving first information of a remote control device;
determining the target position of the remote control equipment facing the screen according to the first information;
and executing a first operation on the screen according to the target position.
2. The method of claim 1, wherein the first information includes position coordinates of the target location where the remote control device is facing the screen;
determining the target position of the remote control equipment facing the screen according to the first information, wherein the determining comprises the following steps:
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
3. The method of claim 1, wherein the position coordinates of the target location are derived from an image containing an outline of the screen taken by a binocular camera of the remote control device.
4. The method according to claim 3, wherein the first information includes an image containing an outline of the screen taken by a binocular camera of the remote control apparatus;
the determining the target position of the remote control device facing the screen according to the first information includes: and determining the target position corresponding to the position coordinates on the screen according to the image.
5. The method of claim 3, wherein said determining from the image a target location on the screen corresponding to the location coordinates comprises:
obtaining the position coordinates of the target position of the remote control equipment facing the screen according to the image;
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
6. The method of claim 2 or 4, wherein the first operation comprises: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
7. A remote control method, applied to a remote control device, the method comprising:
acquiring first information, wherein the first information is used for determining a target position of the remote control equipment facing a screen of controlled equipment;
and sending the first information to the controlled equipment so that the controlled equipment determines the target position according to the first information and executes a first operation on the screen according to the target position.
8. The method of claim 7, wherein the remote control device comprises a binocular camera, and the first information comprises position coordinates of the target location where the remote control device is facing the screen;
acquiring first information, including:
when the remote control equipment points to the screen, shooting an image containing the outline of the screen through the binocular camera;
and obtaining, according to the image, the position coordinates of the target position where the remote control device is facing the screen.
9. The method of claim 7, wherein the remote control device includes a binocular camera, and the first information includes an image taken through the binocular camera that includes an outline of the screen;
acquiring first information, including:
and when the remote control equipment points to the screen, shooting the image through the binocular camera.
10. The method of claim 7, wherein the first operation comprises: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
11. A remote control apparatus for use with a controlled device, the controlled device including a screen, the apparatus comprising:
the receiving module is used for receiving first information of the remote control equipment;
the determining module is used for determining the target position of the remote control equipment facing the screen according to the first information;
and the execution module is used for executing a first operation on the screen according to the target position.
12. The apparatus of claim 11, wherein the first information comprises position coordinates of the target location where the remote control device is facing the screen;
the determining module is specifically configured to determine the target position corresponding to the position coordinate on the screen according to the position coordinate.
13. The apparatus of claim 11, wherein the position coordinates of the target location are derived from an image containing an outline of the screen taken by a binocular camera of the remote control device.
14. The apparatus according to claim 13, wherein the first information includes an image containing an outline of the screen taken by a binocular camera of the remote control device;
the determining module is specifically configured to determine the target position corresponding to the position coordinate on the screen according to the image.
15. The apparatus of claim 13, wherein the determining module is specifically configured to: obtain, according to the image, the position coordinates of the target position where the remote control device is facing the screen, and determine, according to the position coordinates, the target position corresponding to the position coordinates on the screen.
16. The apparatus of claim 12 or 14, wherein the first operation comprises: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
17. A remote control apparatus, applied to a remote control device, the apparatus comprising:
the remote control device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring first information which is used for determining the target position on the screen of the controlled device that the remote control device is facing;
and the sending module is used for sending the first information to the controlled equipment so that the controlled equipment determines the target position according to the first information and executes a first operation on the screen according to the target position.
18. The apparatus of claim 17, wherein the remote control device comprises a binocular camera, and the first information comprises position coordinates of the target location where the remote control device is facing the screen;
the acquisition module is specifically used for shooting an image containing the outline of the screen through the binocular camera when the remote control device points to the screen, and obtaining, according to the image, the position coordinates of the target position where the remote control device is facing the screen.
19. The apparatus of claim 17, wherein the remote control device comprises a binocular camera, and the first information comprises an image taken through the binocular camera containing an outline of the screen;
the acquisition module is specifically used for shooting the image through the binocular camera when the remote control equipment points to the screen.
20. The apparatus of claim 17, wherein the first operation comprises: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
21. A controlled device comprising a screen, a processor and a memory, wherein the memory is configured to store a computer program comprising program instructions that, when executed by the processor, cause the controlled device to perform the steps of:
receiving first information of a remote control device;
determining the target position of the remote control equipment facing the screen according to the first information;
and executing a first operation on the screen according to the target position.
22. The apparatus as claimed in claim 21, wherein the program instructions, when executed by the processor, cause the controlled apparatus to perform the steps of:
the first information comprises position coordinates of the target position where the remote control device is facing the screen;
determining the target position of the remote control equipment facing the screen according to the first information, wherein the determining comprises the following steps:
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
23. The apparatus of claim 21, wherein the position coordinates of the target location are derived from an image containing an outline of the screen taken by a binocular camera of the remote control apparatus.
24. The device of claim 23, wherein the program instructions, when executed by the processor, cause the controlled device to perform the steps of:
the first information includes an image including an outline of the screen photographed by a binocular camera of the remote control apparatus;
the determining the target position of the remote control device facing the screen according to the first information includes: and determining the target position corresponding to the position coordinates on the screen according to the image.
25. The apparatus as claimed in claim 23, wherein the program instructions, when executed by the processor, cause the controlled apparatus to perform the steps of:
the determining a target position on the screen corresponding to the position coordinates according to the image includes:
obtaining the position coordinate of the target position of the remote control equipment facing the screen according to the image;
and determining the target position corresponding to the position coordinate on the screen according to the position coordinate.
26. The apparatus of claim 22 or 24, wherein the first operation comprises: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
27. A remote control device comprising a processor and a memory, wherein the memory is configured to store a computer program comprising program instructions that, when executed by the processor, cause the remote control device to perform the steps of:
acquiring first information, wherein the first information is used for determining a target position on a screen of a controlled device that the remote control device is facing;
and sending the first information to the controlled equipment so that the controlled equipment determines the target position according to the first information and executes a first operation on the screen according to the target position.
28. The apparatus of claim 27, wherein the remote control device comprises a binocular camera, and the first information comprises position coordinates of the target location where the remote control device is facing the screen; when the processor executes the program instructions, causing the remote control device to perform the steps of:
acquiring first information, including:
when the remote control equipment points to the screen, shooting an image containing the outline of the screen through the binocular camera;
and obtaining, according to the image, the position coordinates of the target position where the remote control device is facing the screen.
29. The apparatus of claim 27, wherein the remote control device includes a binocular camera, and the first information includes an image including an outline of the screen taken by the binocular camera; when the processor executes the program instructions, causing the remote control device to perform the steps of:
acquiring first information, including:
and when the remote control equipment points to the screen, shooting the image through the binocular camera.
30. The apparatus of claim 27, wherein the first operation comprises: displaying a pointer at the target location; or highlighting or shading the control corresponding to the target position.
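For readers who want a concrete picture of how position coordinates might be obtained from an image containing the outline of the screen, the sketch below maps the pixel that the remote control device is aimed at (assumed here to be the optical center of the captured image) onto screen coordinates through a planar homography fitted to the four detected screen corners. This is only an illustrative assumption: the claimed method relies on a binocular camera and does not specify this particular computation, and all names and values below are invented for the example.

```python
import numpy as np

def homography(src_pts, dst_pts):
    """Direct linear transform from four image corners to four screen corners."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)          # null-space vector gives the 3x3 homography

def target_on_screen(corners_px, screen_w, screen_h, aim_px):
    """Map the aimed-at pixel (e.g. the image center) to screen coordinates."""
    screen_corners = [(0, 0), (screen_w, 0), (screen_w, screen_h), (0, screen_h)]
    h = homography(corners_px, screen_corners)
    p = h @ np.array([aim_px[0], aim_px[1], 1.0])
    return p[0] / p[2], p[1] / p[2]

# Toy example: detected screen corners (TL, TR, BR, BL) in a 1280x720 photo,
# mapped to a 1920x1080 screen, with the camera's optical center at (640, 360).
corners = [(200, 120), (1080, 140), (1060, 620), (220, 600)]
print(target_on_screen(corners, 1920, 1080, (640, 360)))
```

A real implementation would also exploit the binocular camera's depth information and correct for lens distortion; the homography above covers only the planar geometry.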
CN202210248607.6A 2021-08-30 2022-03-14 Remote control method, remote control device and controlled device Pending CN115762108A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/113924 WO2023030067A1 (en) 2021-08-30 2022-08-22 Remote control method, remote control device and controlled device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202122060808 2021-08-30
CN2021220608087 2021-08-30

Publications (1)

Publication Number Publication Date
CN115762108A true CN115762108A (en) 2023-03-07

Family

ID=85349034

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210248607.6A Pending CN115762108A (en) 2021-08-30 2022-03-14 Remote control method, remote control device and controlled device

Country Status (2)

Country Link
CN (1) CN115762108A (en)
WO (1) WO2023030067A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202976435U * 2012-11-16 2013-06-05 Qingdao Haier Electronics Co., Ltd. Remote controller and television remote control system
CN104064022A * 2014-07-01 2014-09-24 Beijing Xinhua Chuntian Education Technology Co., Ltd. Remote control method and system
KR101595958B1 (en) * 2014-08-27 2016-02-18 LG Electronics Inc. Image display device and operation method of the image display device
CN104270664B * 2014-10-29 2017-09-05 Shanghai Liantong Network Communication Technology Co., Ltd. Light pen remote control, the system and method for realizing intelligent operating platform input control
US9866789B2 (en) * 2015-02-25 2018-01-09 Echostar Technologies L.L.C. Automatic program formatting for TV displays
JP6849870B1 (en) * 2020-07-09 2021-03-31 龍秀 河原 Management server and treatment screen provision method

Also Published As

Publication number Publication date
WO2023030067A1 (en) 2023-03-09

Similar Documents

Publication Publication Date Title
CN110458902B (en) 3D illumination estimation method and electronic equipment
CN113810601B (en) Terminal image processing method and device and terminal equipment
WO2020259542A1 (en) Control method for display apparatus, and related device
CN111182140B (en) Motor control method and device, computer readable medium and terminal equipment
CN111741284A (en) Image processing apparatus and method
CN114365482A (en) Large aperture blurring method based on Dual Camera + TOF
CN113596319A (en) Picture-in-picture based image processing method, apparatus, storage medium, and program product
CN114257920B (en) Audio playing method and system and electronic equipment
CN114339429A (en) Audio and video playing control method, electronic equipment and storage medium
CN113965693B (en) Video shooting method, device and storage medium
CN114500901A (en) Double-scene video recording method and device and electronic equipment
CN113518189B (en) Shooting method, shooting system, electronic equipment and storage medium
CN112272191B (en) Data transfer method and related device
CN112188094B (en) Image processing method and device, computer readable medium and terminal equipment
CN113467735A (en) Image adjusting method, electronic device and storage medium
CN115134640A (en) Synchronous playing method and device
CN112037157A (en) Data processing method and device, computer readable medium and electronic equipment
CN113781548A (en) Multi-device pose measurement method, electronic device and system
CN113364970A (en) Imaging method of non-line-of-sight object and electronic equipment
CN113923351B (en) Method, device and storage medium for exiting multi-channel video shooting
CN113596320B (en) Video shooting variable speed recording method, device and storage medium
CN115412678A (en) Exposure processing method and device and electronic equipment
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
WO2023030067A1 (en) Remote control method, remote control device and controlled device
CN114812381A (en) Electronic equipment positioning method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination