CN111698545A - Remote controller, method and device for identifying operation action, terminal and storage medium


Info

Publication number
CN111698545A
CN111698545A
Authority
CN
China
Prior art keywords
remote controller
index
carrier
image
index points
Prior art date
Legal status
Pending
Application number
CN202010586254.1A
Other languages
Chinese (zh)
Inventor
方迟
马苏
Current Assignee
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN202010586254.1A
Publication of CN111698545A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor

Abstract

The embodiments of the disclosure provide a remote controller, a method and a device for identifying operation actions, a terminal, and a storage medium. The method for identifying an operation action comprises: acquiring an image of an operation carrier bearing calibration points; identifying and calculating characteristic parameters of the calibration points in the image; obtaining physical parameters of the operation carrier according to a pre-constructed correspondence between the characteristic parameters of the calibration points and the physical parameters of the operation carrier; and determining the user's operation on the operation carrier according to those physical parameters. The method and device achieve remote control through the motion of the operation carrier itself, without physical keys.

Description

Remote controller, method and device for identifying operation action, terminal and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, and in particular relates to a remote controller, a method and a device for identifying an operation action, a terminal and a storage medium.
Background
At present, far-field large-screen devices such as televisions are still operated with key-based infrared remote controllers: the user repeatedly presses the up, down, left and right control keys to move a pointer step by step into a preset area, and finally presses a confirmation key to select a function. This operation is extremely cumbersome.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In order to solve the above problems, the present disclosure provides a method, an apparatus, a terminal and a storage medium for identifying an operation action.
An embodiment of the present disclosure provides a method for identifying an operation action, including: acquiring an image of an operation carrier with a calibration point; identifying and calculating characteristic parameters of the calibration points in the image; acquiring physical parameters of the operation carrier according to a pre-constructed corresponding relation between the characteristic parameters of the calibration point and the physical parameters of the operation carrier; and determining the operation of the user on the operation carrier according to the physical parameters of the operation carrier.
An embodiment of the present disclosure also provides an apparatus for recognizing an operation of a remote controller, the apparatus including: the image acquisition module is configured to acquire an image of the operation carrier with the calibration point; the parameter acquisition module is configured to identify and calculate the characteristic parameters of the calibration point in the image, and acquire the physical parameters of the operation carrier according to the pre-constructed corresponding relationship between the characteristic parameters of the calibration point and the physical parameters of the operation carrier; and the operation determining module is configured to determine the operation of the user on the operation carrier according to the physical parameters of the operation carrier.
An embodiment of the present disclosure also provides a remote controller, including: a remote controller main body; more than two index points, wherein at least two index points are positioned on the same side of the remote controller main body; wherein the at least two index points are spaced apart from each other by a predetermined distance.
According to another embodiment of the present disclosure, there is provided a terminal including: at least one memory and at least one processor; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the method for identifying the operation action.
According to still another embodiment of the present disclosure, there is provided a computer storage medium storing program code for performing the above-described method of identifying an operational action.
According to the method and device of the present disclosure, an image of the operation carrier bearing calibration points is acquired, the parameters of the calibration points in the image are identified and calculated, the physical parameters of the operation carrier are obtained from the pre-constructed correspondence between the characteristic parameters of the calibration points and the physical parameters of the operation carrier, and the user's operation on the operation carrier is then determined from those physical parameters. Remote control is thus achieved through the movement of the operation carrier itself, without physical keys.
Drawings
In order to more clearly illustrate the solution in the embodiments of the present disclosure, the drawings needed to be used in the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present disclosure, and that other drawings can be obtained by those skilled in the art without inventive effort.
Fig. 1 shows a flow diagram of a method of identifying an operational action of an embodiment of the present disclosure.
Fig. 2 shows a schematic plan view of a manipulation carrier of the present disclosure.
Fig. 3 shows a schematic diagram of the operation carrier of an embodiment of the present disclosure.
Fig. 4 shows a schematic diagram of an apparatus for recognizing an operation action of an embodiment of the present disclosure.
Fig. 5 shows a schematic plan view of a remote control of an embodiment of the present disclosure.
Fig. 6 illustrates a perspective view of a remote controller of an embodiment of the present disclosure.
Fig. 7 illustrates a perspective view of a remote controller of an embodiment of the present disclosure.
FIG. 8 illustrates a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The present disclosure contemplates eliminating the physical keys of an operation carrier (e.g., a remote controller), acquiring manipulation information by recognizing the motion of the operation carrier itself, and responding to that information accordingly. Modern intelligent large-screen devices (e.g., televisions) typically have a camera. The camera captures a picture, the characteristic parameters of the operation carrier's calibration points are identified from the picture, the physical parameters of the operation carrier are obtained from them, and the user's operation on the carrier can then be determined.
As shown in fig. 1, an embodiment of the present disclosure provides a method for recognizing an operation action, including step S101 of acquiring an image of an operation carrier with calibration points. In some embodiments, the image of the operation carrier may be obtained through a camera of the television; the camera may be built into the television (for example, an under-screen camera) or may be an external camera fixed to the television. In some embodiments, the operation carrier may comprise a remote controller; it should be understood that this is merely exemplary, and other suitable operation carriers may be used.
The method of the present disclosure further includes a step S102 of identifying and calculating characteristic parameters of the index points in the image. In some embodiments, this computation may be performed locally on the device that acquired the image, or the image may be uploaded to the cloud for recognition and computation.
The method of the present disclosure further includes step S103, obtaining physical parameters of the operation carrier according to a correspondence between the pre-constructed characteristic parameters of the calibration point and the physical parameters of the operation carrier.
The method of the present disclosure further includes step S104, determining the operation of the user on the operation carrier according to the physical parameters of the operation carrier. Because the correspondence between the parameters of the index points and the physical parameters of the operation carrier is constructed in advance, the parameters of the index points can be extracted from each real-time image of the operation carrier, yielding its real-time physical parameters. The resulting motion track of the operation carrier carries its control information, and the television responds to that information accordingly.
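Steps S101-S104 can be sketched as a minimal pipeline. The function names, the two-point spacing D_CM, the focal length, and the pinhole-style distance model below are illustrative assumptions, not the patent's implementation:

```python
import math

D_CM = 2.0         # assumed physical spacing between the two index points
FOCAL_PX = 1000.0  # assumed camera focal length, in pixels

def features(p1, p2):
    """S102: characteristic parameters of two index points found in the
    image: their pixel distance and the angle of the line between them."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

def physical_params(pixel_dist, angle_deg):
    """S103: map the pixel distance back to a camera-to-remote distance
    via a simple pinhole model (fronto-parallel case only)."""
    return FOCAL_PX * D_CM / pixel_dist, angle_deg

def classify(prev_mid, cur_mid, threshold_px=50):
    """S104: a sufficient rightward move of the index-point midpoint is
    read as a 'slide page right' operation."""
    return "slide_right" if cur_mid[0] - prev_mid[0] > threshold_px else "none"
```

For example, two index points imaged 100 px apart would, under these assumed constants, place the remote about 20 cm from the camera.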
In summary, by acquiring an image of the operation carrier bearing calibration points, identifying and calculating the parameters of the calibration points in the image, obtaining the physical parameters of the operation carrier from the pre-constructed correspondence, and determining the user's operation on the operation carrier from those physical parameters, remote control is achieved through the movement of the operation carrier itself, without physical keys.
In some embodiments, the index points of the operation carrier include at least two index points separated by a predetermined distance. This facilitates distinguishing the two index points and thus better identifying the position, angle, etc. of the operation carrier.
In some embodiments, the characteristic parameters of the index points include the pixel sizes of the index points in the image, the angle of the line between the index points, and/or the positions of the index points in the image. From the pixel size of an index point, the distance between the index point and the device acquiring the image can be determined. From the angle of the line between the index points, the position and orientation of the operation carrier can be determined. From the position of an index point in the image, its corresponding spatial position can be obtained.
In some embodiments, the method of identifying a manipulation action further comprises constructing a correspondence between the characteristic parameters of the index point and the physical parameters of the manipulation carrier, wherein constructing a correspondence between the characteristic parameters of the index point and the physical parameters of the manipulation carrier comprises calculating the physical parameters of the manipulation carrier using geometric relationships based on the characteristic parameters of the index point in the image. In the present disclosure, a television is explained as an example, but the present disclosure is not limited thereto. In some embodiments, for each specific spatial position and angle, the size, shape, and angle of the image of the calibration point may be calculated according to the geometric perspective relationship, the physical shape, size, and distance of the calibration point, and the parameters of the camera itself.
In some embodiments, constructing the correspondence between the characteristic parameters of the index points and the physical parameters of the operation carrier includes: acquiring images of the operation carrier at predetermined positions and angles, and building a mapping between the characteristic parameters of the index points in those images and the physical parameters of the operation carrier. In some embodiments, a mapping table may be created from images of the operation carrier at predetermined positions and angles, so that a large number of predetermined measurements are available; the physical parameters of the operation carrier at the remaining positions and angles can then be calculated by interpolation and the like.
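A pre-measured mapping table with linear interpolation between the predetermined measurements might look as follows; the table values and the pixel-distance feature are made up for illustration:

```python
import bisect

# Hypothetical calibration table built from images taken at predetermined
# distances: (pixel separation of index points, physical distance in cm),
# sorted by the pixel value.
TABLE = [(50.0, 40.0), (100.0, 20.0), (200.0, 10.0)]

def distance_from_table(pixel_dist):
    """Look up the camera-to-carrier distance, interpolating linearly
    between the two nearest calibration measurements."""
    xs = [px for px, _ in TABLE]
    i = bisect.bisect_left(xs, pixel_dist)
    if i == 0:
        return TABLE[0][1]      # closer than the nearest measurement
    if i == len(xs):
        return TABLE[-1][1]     # farther than the farthest measurement
    (x0, y0), (x1, y1) = TABLE[i - 1], TABLE[i]
    t = (pixel_dist - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```

A finer table (more predetermined positions and angles) shrinks the interpolation error at the cost of a longer calibration pass.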
In some embodiments, the physical parameters of the operation carrier include its coordinates in a preset spatial coordinate system. In some embodiments, a three-dimensional spatial coordinate system may be constructed based on the positions of the camera and the screen; the screen is a finite plane in this coordinate system, or an infinitely extending plane may be used to represent the plane of the screen. The position of the remote controller is then a coordinate point in three-dimensional space, its angle is a three-dimensional vector, and the screen is represented by a plane in that space. The intersection of a ray cast from the operation carrier (for example, the remote controller) along its actual angle with the plane of the screen, namely the operation focus of the remote controller, can therefore be calculated directly.
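The ray-plane intersection described here is standard geometry. A sketch, with the coordinate conventions assumed rather than taken from the patent:

```python
def operation_focus(origin, direction, plane_point, plane_normal):
    """Intersect the ray from the remote (at `origin`, pointing along
    `direction`) with the screen plane; returns the focus point or None."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # the remote is pointing parallel to the screen
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = dot(diff, plane_normal) / denom
    if t < 0:
        return None  # the screen plane is behind the remote
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Whether the returned point lies on the finite screen is then a simple bounds check against the screen rectangle in the same coordinate system.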
In some embodiments, when the image of the operation carrier at the predetermined position and angle is obtained in advance to establish the corresponding relationship, the corresponding relationship between the image of the index point and the operation focus may be directly established, instead of the corresponding relationship with the physical parameter of the operation carrier. This is essentially finding the operating focus on the screen corresponding to the calibration point.
The following describes a simple structure of the operation carrier and the television, and the construction of the association relationship, taking a remote controller and a television as an example. As shown in figs. 2 and 3, the remote controller includes a remote controller body 10 and at least two index points 12, 13. In some embodiments, the remote controller body may be a rectangular housing, but the disclosure is not so limited; it may be any suitable shape that is convenient for a user to hold or manipulate. In some embodiments, the index points 12, 13 are located on the same side of the remote controller body 10 to facilitate capturing an image of the index points; if they were not on the same side, an electronic device such as a television might not recognize all the index points well. In some embodiments, the at least two index points 12, 13 are separated by a predetermined distance D, which facilitates distinguishing the two index points and thus better identifying the position and angle of the remote controller.
As shown in fig. 3, taking a television 20 as an example, the television 20 of the present disclosure has a camera 21. It should be understood that while fig. 3 shows camera 21 built into television 20, camera 21 may also be an external or off-screen camera fixed to television 20. First, the remote controller is calibrated when it is used for the first time. The distance D between the index points 12, 13 of the remote controller is generally known or can be measured. During calibration, the remote controller is placed at a preset distance and angle, the camera 21 acquires an image of the remote controller, and the parameters of the index points in that image are associated with the distance D between the index points and with the distance and angle between the television and the remote controller, forming the association relationship.
When the remote controller is used after calibration, the camera 21 is used for collecting a real-time image of the remote controller to obtain real-time parameters of the calibration point, and the real-time position, distance and angle of the remote controller can be known through the established association relation, so that the motion track of the remote controller can be identified, and a corresponding response is made. For example, when it is recognized that the remote controller moves rightward by a preset distance, a response of sliding the page rightward is made; and when the fact that the remote controller moves forwards for the preset distance is recognized, a response of enlarging the current page is made, and the like.
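The two responses mentioned, sliding the page on a rightward move and zooming on a forward move, could be mapped from a recognized motion track as below; the thresholds, axis conventions, and response names are assumptions for illustration:

```python
def respond(track, threshold_cm=5.0):
    """Map a motion track of the remote, a list of (x, y, z) positions
    in cm with z the distance from the screen, to a UI response."""
    if len(track) < 2:
        return "no_op"
    dx = track[-1][0] - track[0][0]   # rightward displacement
    dz = track[0][2] - track[-1][2]   # displacement toward the screen
    if dx > threshold_cm:
        return "slide_page_right"
    if dz > threshold_cm:
        return "zoom_in_page"
    return "no_op"
```

A real controller would run this over a sliding window of recent positions rather than the whole track, so that successive gestures are recognized independently.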
It will be appreciated that the establishment of the association may involve the computation of the geometric relationship, which may be done locally on the television or uploaded to a server. The pixel width of the distance D between the calibration points in the acquired image can reflect the distance between the remote controller and the television, and the larger the distance is, the smaller the pixel width of the distance D is. In addition, the inclination or angle of the connecting line between the calibration points can reflect the angle of the remote controller, and the position of the calibration point of the remote controller in the view field of the camera 21 can reflect the position of the remote controller. In addition, the establishment of the association relationship may actually be the establishment of a mapping relationship between parameters of the calibration point of the remote controller and physical parameters of the remote controller. And then acquiring the physical parameters of the remote controller after acquiring the image of the remote controller based on the mapping relation.
According to the method and the device, the relation between the calibration point of the remote controller and the position, distance and angle of the remote controller relative to the screen is established, then the real-time physical position, distance and angle of the remote controller corresponding to the collected data of the calibration point can be known by acquiring the real-time image of the remote controller, the motion track of the remote controller can be further identified, the control information of the remote controller can be acquired, and corresponding response can be made. In some embodiments, as described above, the focal point of the remote control operation can be obtained by calculating the coordinates of the remote control in the preset spatial coordinate system.
The embodiment of the present disclosure also provides an apparatus 400 for recognizing an operation action, which includes an image acquisition module 401, a parameter acquisition module 402, and an operation determination module 403. The image acquisition module 401 is configured to acquire an image of the operation carrier with the index point. The parameter obtaining module 402 is configured to identify and calculate feature parameters of the index points in the image, and obtain physical parameters of the operation carrier according to a correspondence between the pre-constructed feature parameters of the index points and the physical parameters of the operation carrier. The operation determination module 403 is configured to determine the operation of the operation carrier by the user according to the physical parameter of the operation carrier. It will be appreciated that what has been described with respect to the above method is equally applicable to the apparatus and will not be discussed in detail here for the sake of simplicity.
The structure of the remote controller will be further described below. As shown in fig. 2, the remote controller of the present disclosure includes a remote controller body 10 and two or more index points 12, 13, wherein at least two index points are located on the same side of the remote controller body. If the index points 12, 13 were not on the same side, an electronic device such as a television might not recognize them well. In some embodiments, the remote controller body may be a rectangular housing, but the disclosure is not so limited; it may be any suitable shape that is convenient for a user to hold or manipulate. It should be understood that while at least two index points are located on the same side of the remote controller body, other index points may be located on other sides. In some embodiments, at least two of the index points are spaced apart from each other by a predetermined distance D. This facilitates distinguishing the two index points, so as to better identify the position and angle of the remote controller.
In some embodiments, the color of the index point is different from the color of the side of the remote control body on which the index point is located. This is to better identify the index point, and if the color is the same, after the television acquires the image of the remote controller, the index point may not be effectively identified, and thus information such as the angle of the remote controller may not be effectively acquired.
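Identifying index points by their color contrast against the remote body can be as simple as clustering matching pixels and taking centroids. A naive sketch on a grayscale pixel grid; the 4-connectivity flood fill and the tolerance parameter are assumptions, not the patent's method:

```python
from collections import deque

def find_index_points(image, target, tol=0):
    """Find index points in a grayscale image (list of pixel rows) as
    connected clusters of pixels within `tol` of `target`; returns the
    centroid of each cluster in scan order."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if seen[y][x] or abs(image[y][x] - target) > tol:
                continue
            # Flood-fill one cluster of matching pixels.
            queue, pts = deque([(x, y)]), []
            seen[y][x] = True
            while queue:
                cx, cy = queue.popleft()
                pts.append((cx, cy))
                for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                               (cx, cy + 1), (cx, cy - 1)):
                    if (0 <= nx < w and 0 <= ny < h and not seen[ny][nx]
                            and abs(image[ny][nx] - target) <= tol):
                        seen[ny][nx] = True
                        queue.append((nx, ny))
            centroids.append((sum(p[0] for p in pts) / len(pts),
                              sum(p[1] for p in pts) / len(pts)))
    return centroids
```

The cluster's pixel extent also gives the "pixel size" feature mentioned above, and the two centroids give the line whose length and angle feed the correspondence.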
In some embodiments, the index point includes at least one of paper adhered to the remote controller body, fluorescent material or light reflecting material coated to the remote controller body, as long as the color of the index point is distinguished from the color of the remote controller body.
In some embodiments, the remote controller further comprises a light emitting device 11 in the remote controller body 10, and at least two of the calibration points are calibration holes for passing light emitted from the light emitting device. Since the light emitted from the light emitting device 11 can be emitted from the calibration hole of the present disclosure, the position of the calibration point can be easily identified from the image obtained by the camera 21. In some embodiments, light-emitting device 11 comprises a device capable of generating at least two different wavelengths of light. Therefore, the wavelengths of the light emitted from the calibration holes are different, and the wavelength difference can be identified through the television sensor, so that the calibration holes are distinguished, and the angle of the remote controller is better identified. In some embodiments, the remote control further comprises a transparent cover covering the at least two calibration holes. As shown in fig. 6, a transparent cover 31 is shown. The transparent cover 31 seals the calibration hole to prevent dust and the like from entering the remote controller body 10. In addition, since the transparent cover 31 is transparent (e.g., glass), the emergence of light is not affected. In some embodiments, the shape of the transparent cover 31 matches the shape of the corresponding calibrated holes. For example, the transparent cover 31 is flush with the side surface of the remote controller body 10 where the calibration hole is located. As shown in fig. 7, in some embodiments, the remote control further comprises a movable mask 41. A movable mask 41 is arranged between the light emitting device 11 and the transparent cover 31, the movable mask 41 being used to control the diameter of the light passing through the at least two calibrated holes. 
In some embodiments, the shape of each aperture in the movable mask 41 is the same as that of the corresponding calibration hole, so that the effective diameter of each calibration hole can be scaled under control.
In some embodiments, the predetermined distance D is between 0.5 cm and 3 cm. If the predetermined distance D is less than 0.5 cm, the calibration holes are hard to distinguish, which affects the judgment of their positions and ultimately the judgment of the position, angle and distance of the remote controller. If the distance is more than 3 cm, the volume of the remote controller may increase, hindering its miniaturization.
In some embodiments, at least two index points have different shapes. For example, one index point is circular and the other is square. By making the shapes of the index points different, the index points of the remote controller can be better distinguished in the collected image of the remote controller, so that the angle and the like of the remote controller can be better identified. In some embodiments, at least one of the at least two index points has an asymmetric irregular shape. Compared with the adoption of a symmetrical shape, the angle of the remote controller can be better recognized by adopting an asymmetrical irregular shape, and the confusion of the rotation angle of the remote controller caused by the symmetry can be avoided. In some embodiments, the at least two index points are asymmetric with respect to a center of the side on which the at least two index points are located. By adopting the arrangement, the angle of the remote controller can be better identified similar to the situation that the calibration point adopts an asymmetric irregular shape, and the confusion of the rotation angle of the remote controller caused by symmetry can be avoided, especially when the calibration point has the same shape.
According to the present disclosure, at least two calibration points are provided on the remote controller, and the camera of a large-screen electronic device such as a television can identify the position, distance and angle of the remote controller by capturing pictures of its movement, thereby acquiring its operation information. The remote controller of the present disclosure thus enables key-free, motion-based remote control and improves the user's remote control experience.
In addition, the present disclosure also provides a terminal, including: at least one memory and at least one processor; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the method for identifying the operation action.
In addition, the present disclosure also provides a computer storage medium storing program code for executing the above-described method of recognizing an operation action.
Referring now to FIG. 8, shown is a schematic diagram of an electronic device 800 suitable for use in implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 8, an electronic device 800 may include a processing means (e.g., central processing unit, graphics processor, etc.) 801 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage means 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the electronic apparatus 800 are also stored. The processing apparatus 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
Generally, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, and the like; output devices 807 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; storage devices 808 including, for example, a magnetic tape, a hard disk, and the like; and a communication device 809. The communication device 809 may allow the electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 8 illustrates an electronic device 800 having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication device 809, or installed from the storage device 808, or installed from the ROM 802. When executed by the processing device 801, the computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the client and the server may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and a peer-to-peer network (e.g., an ad hoc peer-to-peer network), as well as any currently known or future-developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire at least two internet protocol addresses; send a node evaluation request including the at least two internet protocol addresses to a node evaluation device, wherein the node evaluation device selects one internet protocol address from the at least two internet protocol addresses and returns it; and receive the internet protocol address returned by the node evaluation device; wherein the acquired internet protocol address indicates an edge node in a content distribution network.
Alternatively, the computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: receiving a node evaluation request comprising at least two internet protocol addresses; selecting an internet protocol address from the at least two internet protocol addresses; returning the selected internet protocol address; wherein the received internet protocol address indicates an edge node in the content distribution network.
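The edge-node selection flow described in the two paragraphs above can be sketched as follows. The selection criterion (lowest measured latency) and all names here are illustrative assumptions, since the text does not specify how the node evaluation device chooses among the candidate addresses:

```python
# Illustrative sketch only: the disclosure does not state the selection
# policy, so lowest measured latency is assumed as one plausible choice.
def select_edge_node(candidates):
    """Node evaluation device side: pick one address from at least two.

    candidates: list of (ip_address, latency_ms) pairs, each address
    indicating an edge node in the content distribution network.
    """
    return min(candidates, key=lambda c: c[1])[0]

# Client side: send at least two addresses, receive the selected one back.
chosen = select_edge_node([("203.0.113.5", 42.0), ("198.51.100.9", 17.5)])
print(chosen)  # 198.51.100.9
```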
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a method of recognizing an operation action, the method including: acquiring an image of an operation carrier with a calibration point; identifying and calculating characteristic parameters of the calibration points in the image; acquiring physical parameters of the operation carrier according to a pre-constructed corresponding relation between the characteristic parameters of the calibration point and the physical parameters of the operation carrier; and determining the operation of the user on the operation carrier according to the physical parameters of the operation carrier.
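As a rough illustration of these four steps (not part of the patent text), the sketch below assumes a pinhole camera model; the focal length and the real-world spacing of the two calibration points are hypothetical values:

```python
import math

# Hypothetical constants for illustration (not specified in the patent):
FOCAL_LENGTH_PX = 1000.0   # assumed camera focal length, in pixels
REAL_SPACING_CM = 2.0      # assumed real spacing of the two calibration points

def characteristic_parameters(p1, p2):
    """Step 2: pixel-space features of the two detected calibration points."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    spacing_px = math.hypot(dx, dy)               # pixel distance between points
    angle_deg = math.degrees(math.atan2(dy, dx))  # angle of the connecting line
    midpoint = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)  # position in image
    return spacing_px, angle_deg, midpoint

def physical_parameters(spacing_px):
    """Step 3: pinhole-model depth estimate from the known real spacing."""
    return FOCAL_LENGTH_PX * REAL_SPACING_CM / spacing_px  # distance, in cm

# Step 1 (acquiring the image and locating the points) is assumed done;
# step 4 would compare successive frames to classify the user's motion.
p1, p2 = (300.0, 240.0), (400.0, 240.0)
spacing_px, angle_deg, mid = characteristic_parameters(p1, p2)
depth_cm = physical_parameters(spacing_px)
print(spacing_px, angle_deg, depth_cm)  # 100.0 0.0 20.0
```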
According to one or more embodiments of the present disclosure, the index points include at least two index points, and the at least two index points are separated by a predetermined distance.
According to one or more embodiments of the present disclosure, the characteristic parameters of the index points include pixel sizes of the index points in the image, angles of connecting lines between the index points, and/or positions of the index points in the image.
According to one or more embodiments of the present disclosure, the method further includes constructing a correspondence between the characteristic parameter of the index point and the physical parameter of the operation carrier, where constructing the correspondence between the characteristic parameter of the index point and the physical parameter of the operation carrier includes calculating the physical parameter of the operation carrier using a geometric relationship based on the characteristic parameter of the index point in the image.
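One concrete geometric relationship of this kind (an illustrative assumption, not taken from the text): if the line between the two calibration points is rotated by an angle out of the image plane, its projected length shrinks by the cosine of that angle, so the rotation can be recovered from the ratio of the observed pixel spacing to the frontal pixel spacing:

```python
import math

def tilt_from_foreshortening(observed_px, frontal_px):
    """Recover the out-of-plane rotation angle (degrees) of the carrier
    from the foreshortened spacing of its two calibration points.

    observed_px: pixel spacing measured in the current image
    frontal_px:  pixel spacing when the carrier faces the camera head-on
    """
    ratio = max(-1.0, min(1.0, observed_px / frontal_px))  # clamp for safety
    return math.degrees(math.acos(ratio))

print(round(tilt_from_foreshortening(50.0, 100.0), 6))  # 60.0
```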
According to one or more embodiments of the present disclosure, the method further includes constructing a corresponding relationship between the characteristic parameter of the index point and the physical parameter of the operation carrier, where constructing the corresponding relationship between the characteristic parameter of the index point and the physical parameter of the operation carrier includes: acquiring an image of the operation carrier at a preset position and at a preset angle, and constructing a mapping relation between the characteristic parameters of the calibration point and the physical parameters of the operation carrier based on the characteristic parameters of the calibration point in the image.
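A minimal sketch of this calibration-based alternative, with assumed sample values: images captured at preset distances yield (pixel spacing, distance) pairs, and unseen spacings are answered by interpolating between the nearest samples:

```python
# Assumed calibration samples (pixel spacing, distance in cm); in practice
# these would come from images taken at preset positions and angles.
CALIBRATION = [(200.0, 10.0), (100.0, 20.0), (50.0, 40.0)]

def distance_from_table(spacing_px):
    """Map an observed pixel spacing to a distance via the calibration table,
    linearly interpolating between the two nearest samples."""
    pts = sorted(CALIBRATION)  # ascending pixel spacing
    if spacing_px <= pts[0][0]:
        return pts[0][1]       # smaller spacing than any sample: farthest
    if spacing_px >= pts[-1][0]:
        return pts[-1][1]      # larger spacing than any sample: nearest
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= spacing_px <= x1:
            t = (spacing_px - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

print(distance_from_table(150.0))  # 15.0
```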
According to one or more embodiments of the present disclosure, the physical parameter of the operation carrier includes a coordinate of the operation carrier in a preset spatial coordinate system.
According to one or more embodiments of the present disclosure, there is provided an apparatus for recognizing an operation action, the apparatus including: the image acquisition module is configured to acquire an image of the operation carrier with the calibration point; the parameter acquisition module is configured to identify and calculate the characteristic parameters of the calibration point in the image, and acquire the physical parameters of the operation carrier according to the pre-constructed corresponding relationship between the characteristic parameters of the calibration point and the physical parameters of the operation carrier; and the operation determining module is configured to determine the operation of the user on the operation carrier according to the physical parameters of the operation carrier.
According to one or more embodiments of the present disclosure, there is also provided a remote controller, including: a remote controller main body; and two or more index points, at least two of which are located on the same side of the remote controller main body; wherein the at least two index points are spaced apart from each other by a predetermined distance.
According to one or more embodiments of the present disclosure, the color of the index point is different from the color of the side surface of the remote controller main body where the index point is located.
According to one or more embodiments of the present disclosure, the index point includes at least one of: paper adhered to the remote controller main body, or a fluorescent material or reflective material applied to the remote controller main body.
According to one or more embodiments of the present disclosure, the remote controller further includes a light emitting device located in the remote controller main body, the at least two calibration points are calibration holes, and the calibration holes are used for allowing light emitted by the light emitting device to pass through.
According to one or more embodiments of the present disclosure, the predetermined distance is between 0.5 cm and 3 cm.
According to one or more embodiments of the present disclosure, the at least two index points have different shapes; and/or at least one of the at least two index points has an asymmetric irregular shape; and/or the at least two index points are asymmetrical with respect to the center of the side on which the at least two index points are located.
According to one or more embodiments of the present disclosure, there is provided a terminal including: at least one memory and at least one processor; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the method for identifying the operation action.
According to one or more embodiments of the present disclosure, a computer storage medium is provided, which stores program code for performing the above-described method of identifying an operational action.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by interchanging the above features with features having similar functions disclosed in (but not limited to) the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (15)

1. A method of recognizing an operation action, the method comprising:
acquiring an image of an operation carrier with a calibration point;
identifying and calculating characteristic parameters of the calibration points in the image;
acquiring physical parameters of the operation carrier according to a pre-constructed corresponding relation between the characteristic parameters of the calibration point and the physical parameters of the operation carrier;
and determining the operation of the user on the operation carrier according to the physical parameters of the operation carrier.
2. The method of claim 1, wherein the index points comprise at least two index points, and the at least two index points are separated by a predetermined distance.
3. The method according to claim 1, wherein the characteristic parameters of the index points comprise pixel sizes of the index points in the image, angles of lines between the index points, and/or positions of the index points in the image.
4. The method according to claim 1, further comprising constructing a correspondence between the characteristic parameters of the index point and the physical parameters of the operation carrier, wherein constructing the correspondence between the characteristic parameters of the index point and the physical parameters of the operation carrier comprises calculating the physical parameters of the operation carrier using geometric relationships based on the characteristic parameters of the index point in the image.
5. The method according to claim 1, further comprising constructing a correspondence between the characteristic parameters of the index point and the physical parameters of the operation carrier, wherein constructing the correspondence between the characteristic parameters of the index point and the physical parameters of the operation carrier comprises: acquiring an image of the operation carrier at a preset position and at a preset angle, and constructing a mapping relation between the characteristic parameters of the calibration point and the physical parameters of the operation carrier based on the characteristic parameters of the calibration point in the image.
6. The method of claim 1, wherein the physical parameters of the operation carrier comprise coordinates of the operation carrier in a preset spatial coordinate system.
7. An apparatus for recognizing an operation action, the apparatus comprising:
the image acquisition module is configured to acquire an image of the operation carrier with the calibration point;
the parameter acquisition module is configured to identify and calculate the characteristic parameters of the calibration point in the image, and acquire the physical parameters of the operation carrier according to the pre-constructed corresponding relationship between the characteristic parameters of the calibration point and the physical parameters of the operation carrier;
and the operation determining module is configured to determine the operation of the user on the operation carrier according to the physical parameters of the operation carrier.
8. A remote control, comprising:
a remote controller main body;
two or more index points, wherein at least two of the index points are located on the same side of the remote controller main body;
wherein the at least two index points are spaced apart from each other by a predetermined distance.
9. The remote controller according to claim 8, wherein the color of the index point is different from the color of the side surface of the remote controller main body where the index point is located.
10. The remote controller of claim 9, wherein the index point comprises at least one of paper affixed to the remote controller body, fluorescent material or light reflective material applied to the remote controller body.
11. The remote controller according to claim 8, further comprising a light emitting device in the remote controller main body, wherein the at least two index points are calibration holes for passing light emitted by the light emitting device.
12. The remote control of claim 8, wherein the predetermined distance is between 0.5 cm and 3 cm.
13. The remote control of claim 8,
the at least two index points have different shapes; and/or
At least one of the at least two index points has an asymmetric irregular shape; and/or
The at least two index points are asymmetric with respect to a center of a side on which the at least two index points are located.
14. A terminal, characterized in that the terminal comprises:
at least one memory and at least one processor;
wherein the memory is configured to store program code and the processor is configured to invoke the program code stored by the memory to perform the method of any of claims 1 to 6.
15. A computer storage medium characterized in that the computer storage medium stores program code for executing the method of any one of claims 1 to 6.
CN202010586254.1A 2020-06-24 2020-06-24 Remote controller, method and device for identifying operation action, terminal and storage medium Pending CN111698545A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010586254.1A CN111698545A (en) 2020-06-24 2020-06-24 Remote controller, method and device for identifying operation action, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN111698545A true CN111698545A (en) 2020-09-22

Family

ID=72483739

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010586254.1A Pending CN111698545A (en) 2020-06-24 2020-06-24 Remote controller, method and device for identifying operation action, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111698545A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001020441A1 (en) * 1999-09-10 2001-03-22 Technomics Co., Ltd. Remote control method and apparatus for remote control mouse
CN101504728A (en) * 2008-10-10 2009-08-12 深圳先进技术研究院 Remote control system and method of electronic equipment
CN101673139A (en) * 2008-09-10 2010-03-17 Tcl集团股份有限公司 Remote controller and input system and method thereof
CN201570011U (en) * 2010-01-13 2010-09-01 北京视博数字电视科技有限公司 Terminal control device and terminal
CN102662501A (en) * 2012-03-19 2012-09-12 Tcl集团股份有限公司 Cursor positioning system and method, remotely controlled device and remote controller
US8773512B1 (en) * 2011-06-30 2014-07-08 Aquifi, Inc. Portable remote control device enabling three-dimensional user interaction with at least one appliance

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112565866A (en) * 2020-11-30 2021-03-26 深圳创维-Rgb电子有限公司 Focus control method, system, device and storage medium
CN112565866B (en) * 2020-11-30 2023-12-05 深圳创维-Rgb电子有限公司 Focus control method, system, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200922