CN108924614B - Control method, control system and electronic equipment - Google Patents

Control method, control system and electronic equipment Download PDF

Info

Publication number
CN108924614B
Authority
CN
China
Prior art keywords
operation identifier
information
electronic device
position information
obtaining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810558894.4A
Other languages
Chinese (zh)
Other versions
CN108924614A (en
Inventor
李国锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201810558894.4A priority Critical patent/CN108924614B/en
Publication of CN108924614A publication Critical patent/CN108924614A/en
Application granted granted Critical
Publication of CN108924614B publication Critical patent/CN108924614B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home

Abstract

The application provides a control method, a control system and an electronic device. The control method is applied to a first electronic device and includes: obtaining and outputting content information from a second electronic device; processing the content information to determine a plurality of objects in the content information and the position information of each object, the objects including a focus object and at least one candidate object; obtaining position information of an operation identifier, where the operation identifier is not included in the content information; if the position information of the operation identifier corresponds to a first object among the candidate objects, generating a control instruction based on the positional relationship between the focus object and the first object; and sending the control instruction to the second electronic device, so that the second electronic device can respond to the control instruction and set the first object as the focus object. The user can thereby control the content information of the second electronic device through the operation identifier, which improves the user experience.

Description

Control method, control system and electronic equipment
Technical Field
The application relates to a control method, a control system and an electronic device.
Background
With the continuous improvement of people's living standards, televisions, set-top boxes, DVD players and the like have become electronic devices frequently used in daily life. Traditionally, the functions of a set-top box, DVD player or other device connected to a television are accessed through that device's own remote controller. As a result, when operating the television together with a connected set-top box, DVD player or similar device, the user must switch between the remote controllers of the different devices, that is, cross-device operation is required, which is not convenient.
Disclosure of Invention
The application provides a control method, a control system and electronic equipment which can reduce user operations and are convenient for users to use.
In order to solve the above problem, the present application provides a control method applied to a first electronic device, the method including:
obtaining and outputting content information from the second electronic device;
processing the content information, and determining a plurality of objects in the content information and position information of each object; wherein the objects include a focus object and at least one candidate object;
obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information;
if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object;
and sending the control instruction to the second electronic equipment, so that the second electronic equipment can respond to the control instruction and set the first object as a focus object.
In some embodiments of the present disclosure, the content information is interface information, and the processing the content information includes:
and carrying out image analysis on the interface information.
In some embodiments of the present disclosure, the performing image analysis on the interface information includes:
and carrying out image analysis on the interface information by adopting a contour recognition algorithm or an edge detection algorithm.
In some embodiments of the present disclosure, determining the focal object comprises:
and determining the focus object through a display mode and/or a display effect.
In some embodiments of the disclosure, the obtaining the location information of the operation identifier includes:
obtaining a determination instruction, and obtaining the position information of the operation identifier based on the determination instruction;
wherein the position information of the operation identifier can be changed according to the obtained input operation.
In some embodiments of the present disclosure, determining location information for each object includes:
determining the region range included by each object and determining the interface layout including the plurality of objects;
if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object, including:
if the position information of the operation identifier is within the area range of the candidate object, determining, based on the position of the focus object and the position of the first object in the interface layout, the direction and/or quantity information for moving from the position of the focus object to the position of the first object;
and generating the control instruction based on the direction and/or quantity information.
In some embodiments of the present disclosure, before obtaining the location information of the operation identifier, the method includes:
obtaining position information of the operation identifier;
and determining to enter a state capable of sending the control instruction to the second electronic equipment at least based on the position information of the operation identifier and the position information of the focus object which meet the condition.
The present application also provides a control system, comprising:
a second electronic device for providing content information;
a first electronic device for obtaining and outputting content information from a second electronic device; processing the content information, and determining a plurality of objects in the content information and position information of each object; wherein the objects include a focus object and at least one candidate object; obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information; if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object; and sending the control instruction to the second electronic equipment, so that the second electronic equipment can respond to the control instruction and set the first object as a focus object.
In some embodiments of the present disclosure, the control system further comprises:
and a third electronic device for causing the first electronic device to obtain the location information of the operation identifier.
The present application further provides an electronic device, including:
a display unit for obtaining and outputting content information from the second electronic device;
a processing unit, configured to process the content information, determine a plurality of objects in the content information, and position information of each object, where the objects include a focus object and at least one candidate object; obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information; if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object;
a sending unit, configured to send the control instruction to the second electronic device, so that the second electronic device can set the first object as a focus object in response to the control instruction.
With the control method, control system and electronic device of the application, the first electronic device can obtain and output the content information of the second electronic device and, after processing the content information, obtain the position information of the operation identifier. When the position information of the operation identifier corresponds to a first object among the candidate objects, a control instruction is generated based on the positional relationship between the focus object and the first object and is sent to the second electronic device, so that the second electronic device can respond to the control instruction and set the first object as the focus object. The user can thus control the content information of the second electronic device through the operation identifier, which improves the user experience.
Drawings
FIG. 1 is a flow chart of a control method in an embodiment of the present application;
FIG. 2 is a diagram illustrating a display of the position of a focus object according to an embodiment of the present disclosure;
FIG. 3 is another illustration of the position of a focus object in the embodiment of the present application;
FIG. 4 is an example of an operation identifier in a region corresponding to a focus object in the embodiment of the present application;
FIG. 5 is an example of an operation identifier in an area range corresponding to a first object in the embodiment of the present application;
FIG. 6 is a block diagram of a control system in an embodiment of the present application;
fig. 7 is a block diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the present application is described in detail below with reference to the accompanying drawings and the detailed description.
Various aspects and features of the present application are described herein with reference to the drawings.
These and other characteristics of the present application will become apparent from the following description of preferred forms of embodiment, given as non-limiting examples, with reference to the attached drawings.
It should also be understood that, although the present application is described with reference to specific examples, a person skilled in the art will be able to realize many other equivalent forms of the application having the characteristics set forth in the claims, all of which therefore fall within the scope of protection defined thereby.
The above and other aspects, features and advantages of the present application will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the application, which may be embodied in various forms. Well-known and/or repeated functions and structures are not described in detail so as not to obscure the present application with unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to employ the present application in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the application.
The application provides a control method, which is applied to first electronic equipment and comprises the following steps:
obtaining and outputting content information from the second electronic device;
processing the content information, and determining a plurality of objects in the content information and position information of each object; wherein the objects include a focus object and at least one candidate object;
obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information;
if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object;
and sending the control instruction to the second electronic equipment, so that the second electronic equipment can respond to the control instruction and set the first object as a focus object.
According to this control method, the first electronic device can obtain and output the content information of the second electronic device and, after processing the content information, obtain the position information of the operation identifier. When the position information of the operation identifier corresponds to a first object among the candidate objects, a control instruction is generated based on the positional relationship between the focus object and the first object and is sent to the second electronic device, so that the second electronic device can respond to the control instruction and set the first object as the focus object. The user can thus control the content information of the second electronic device through the operation identifier, which improves the user experience.
In order to better understand the technical solution, a specific flow of the control method will be described below with reference to the drawings of the specification and specific embodiments.
As shown in fig. 1, fig. 1 is a flowchart of a control method in an embodiment of the present application, where the control method is applied to a first electronic device, where the first electronic device may be an electronic device capable of displaying an interface, such as a television, a projector, and the like, and the method includes the following steps:
step 101: content information from the second electronic device is obtained and output.
By way of example, the second electronic device may be a set-top box, DVD player, mobile phone, PC or any other device capable of providing content information. The first electronic device outputs the content of the second electronic device, and this output content is isolated from the content of the first electronic device itself; that is, in this scenario the first electronic device acts only as a display output device for the second electronic device rather than as a peer device. In addition, the first electronic device obtains the content information through a multimedia interface or a data interface, so no additional application program is needed: once the second electronic device is connected to the first electronic device, the first electronic device can display its content information.
In order to explain the above technical solutions in more detail, the following description will take the first electronic device as a television and the second electronic device as a set-top box as an example. After the set-top box is connected with the television, when the signal source of the television is set as the set-top box, the content information of the set-top box is displayed on a display interface of the television, so that a user can watch the content information through corresponding operation.
Step 102: processing the content information, and determining a plurality of objects in the content information and position information of each object; wherein the objects include a focus object and at least one candidate object.
In some embodiments of the present application, the content information is interface information, that is, interface information provided by the second electronic device and displayed on the display interface of the first electronic device. Accordingly, processing the content information includes performing image analysis on the interface information, and further determining a plurality of objects in the interface information and the position information of each object, where the objects include a focus object and at least one candidate object, as shown in fig. 2 and fig. 3.
In some embodiments of the present application, the position of the focus object in the interface information may be set in advance or by default: for example, it may be the object at the leftmost side of the first row of the interface information, as shown in fig. 3, or the object at the center of the interface information, as shown in fig. 2. The position may be set according to the actual situation and is not limited here.
Specifically, the image analysis of the interface information includes performing image analysis on the interface information using a contour recognition algorithm or an edge detection algorithm. Both are techniques for recognizing the contour or edges of a target. The contour recognition algorithm first recognizes the contour lines in the interface information and then treats each contour region with common characteristics as an object, where a contour with common characteristics is a closed shape that is uniformly distributed; in this way the focus object and at least one candidate object in the interface information are identified. The edge detection algorithm detects points with obvious brightness changes in the image: the image may first be converted to grayscale to remove color, and regions with the same gray level are then identified. Significant changes in brightness usually reflect important changes in image attributes, including one or more of a discontinuity in depth, a discontinuity in surface orientation, a change in material properties, or a change in scene illumination. Of course, in practice, content recognition may also be performed on the whole interface, including image recognition and text recognition; for example, if the name of a movie is recognized in the interface information, that name corresponds to an object, and if a poster of a TV series is recognized, that image likewise corresponds to an object, and so on.
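As an illustration of this kind of image analysis, the following Python sketch (a minimal example only, not the implementation of this application) uses OpenCV edge detection and contour extraction to recover rectangular object regions from a captured interface frame. The frame source, the Canny thresholds, the minimum-area filter and the row-grouping constant are assumptions chosen for illustration, and OpenCV 4 is assumed.

# Minimal sketch (assumptions noted above): recover candidate object regions
# from an interface frame using edge detection and contour extraction.
import cv2

def find_object_regions(frame_bgr, min_area=2000):
    """Return bounding boxes (x, y, w, h) of closed regions large enough to be UI objects."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)       # drop color, keep brightness
    edges = cv2.Canny(gray, 50, 150)                          # points with obvious brightness change
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,  # OpenCV 4 return signature
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        if w * h >= min_area:                                 # ignore noise and text fragments
            boxes.append((x, y, w, h))
    # Sort roughly top-to-bottom, then left-to-right, approximating the interface layout.
    boxes.sort(key=lambda b: (b[1] // 50, b[0]))
    return boxes

For example, boxes = find_object_regions(cv2.imread("interface.png")) would yield one bounding box per menu option or poster tile in a frame like those of fig. 2 to 5 (the file name is illustrative).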
In order to explain the above technical solutions in more detail, the following description is given by taking the first electronic device as a television and the second electronic device as a set-top box as an example.
After the set-top box is connected to the television and its content information appears on the television's display interface, the content information is processed. The content information presented by the set-top box on the television's display interface is interface information, so processing the content information means performing image analysis on the set-top box's interface information to identify a plurality of objects and the position information of each object, the objects including a focus object and at least one candidate object. Specifically, the objects may be menu options, title options and the like provided by the set-top box. When television programs or other videos are watched through the set-top box, the first menu icon at the upper left corner of the interface information provided to the television is generally highlighted, that is, it serves as the focus object; of course, the menu icon at the center of the interface information may also be the highlighted (focus) object. To identify the menu options, title options and the like on the television's display interface, the edges and contours of these elements in the interface information are recognized, or points with obvious brightness changes are detected so that the edges and contours are determined from the obvious changes in highlight.
In some embodiments of the present application, determining the focus object includes determining the focus object through its display mode and/or display effect. The focus object is the object displayed with high brightness or highlighted contrast in the interface information, so when image recognition is performed on the interface information the focus object can be determined from its display mode and/or display effect. Specifically, the focus object is generally displayed in a highlighted state, for example with an obvious contrast against the other objects: the focus object may be displayed in black while the other objects are displayed in white or a lighter color, so that the focus object stands out and is easy for the user to identify.
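One possible way to apply this display-effect criterion is sketched below. It is only a heuristic under the assumption that the highlighted object simply differs most in mean brightness from the other objects; the actual focus style of a given set-top box may differ.

# Heuristic sketch: pick as focus object the region whose mean brightness deviates
# most from the median of all regions, i.e. the one that visually stands out.
import cv2
import numpy as np

def pick_focus_box(frame_bgr, boxes):
    """boxes is a non-empty list of (x, y, w, h) regions, e.g. from find_object_regions()."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    brightness = [gray[y:y + h, x:x + w].mean() for (x, y, w, h) in boxes]
    median = float(np.median(brightness))
    # The highlighted region stands out by contrast, whether brighter or darker than the rest.
    focus_index = max(range(len(boxes)), key=lambda i: abs(brightness[i] - median))
    return focus_index, boxes[focus_index]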
In some embodiments of the present disclosure, the first electronic device may already be in a state in which it can send control instructions to the second electronic device, or it may first need to enter such a state.
Specifically, after image recognition is performed on the interface information provided by the second electronic device, the focus object is determined, which also determines the area range occupied by the focus object. When the operation identifier is to be used to control the interface information, it may be moved into the area range corresponding to the focus object; referring to fig. 4, when the operation identifier is located within this area range, the operation identifier is determined to have entered a state in which the control instruction can be sent to the second electronic device, and the first obtaining of the position information of the operation identifier is completed. Of course, when the position information of the operation identifier is obtained for the first time, a condition may also be set to limit activation of the operation identifier: for example, after the operation identifier enters the area range corresponding to the focus object, it must remain there for a preset duration (for example 1 second or 5 seconds), and only after this duration is met is the operation identifier activated, confirming that activation is actually intended and preventing erroneous operations. In addition, to control the interface information with the operation identifier more quickly, activation of the operation identifier may also be triggered by a specific instruction (a first specific instruction), which may be issued by operating a key on the electronic device that controls the operation identifier (such as the third electronic device). After the operation identifier is activated, that is, after it is determined to have entered a state in which the control instruction can be sent to the second electronic device, the subsequent steps are executed.
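The dwell-based activation condition just described can be sketched as a simple timer. This is only an illustration; the pointer source, the dwell duration and the hit-test are assumptions.

# Sketch of dwell-based activation: the operation identifier must stay inside the
# focus object's area range for a preset duration before control instructions may be sent.
import time

def point_in_box(point, box):
    x, y = point
    bx, by, bw, bh = box
    return bx <= x < bx + bw and by <= y < by + bh

class DwellActivator:
    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds
        self._entered_at = None

    def update(self, pointer_pos, focus_box):
        """Feed the latest operation-identifier position; returns True once activation occurs."""
        if point_in_box(pointer_pos, focus_box):
            if self._entered_at is None:
                self._entered_at = time.monotonic()
            return time.monotonic() - self._entered_at >= self.dwell_seconds
        self._entered_at = None   # the identifier left the focus area: reset the timer
        return False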
Step 103: obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information.
It can be seen that, before the position information of the operation identifier is obtained in step 103, the method further includes: obtaining position information of the operation identifier; and determining, at least based on the position information of the operation identifier and the position information of the focus object satisfying the condition, to enter a state in which the control instruction can be sent to the second electronic device.
The position information of the operation identifier is then obtained in step 103, after the operation identifier has been determined to have entered the state in which the control instruction can be sent to the second electronic device; that is, this is the second time the position information of the operation identifier is obtained. The operation identifier may be provided by a third electronic device connected to the first electronic device and is used to perform control operations on the first electronic device, or, as here, to control the second electronic device that provides the content information to the first electronic device.
In some embodiments of the disclosure, the obtaining the location information of the operation identifier includes: obtaining a determination instruction (the determination instruction here is a second determination instruction, and the trigger operation of the second determination instruction and the trigger operation of the first determination instruction may be the same or different), and obtaining the location information of the operation identifier based on the determination instruction; wherein the position information of the operation identifier can be changed according to the obtained input operation.
The determination instruction is used to confirm that the user selects the candidate object corresponding to the operation identifier: after the user completes the determination operation, the position information of the operation identifier is obtained at the moment the determination instruction is received, and the candidate object corresponding to that position is determined to be selected. Specifically, the operation identifier may be controlled by a third electronic device (for example, an air mouse or a remote controller) connected to the first electronic device; the third electronic device moves the operation identifier so that each object on the display interface of the first electronic device can be selected through the operation identifier.
Step 104: and if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object.
Wherein the control instruction may be an instruction to confirm selection of the first object.
In some embodiments of the present disclosure, determining the position information of each object includes: determining the area range occupied by each object and determining the interface layout containing the plurality of objects, where the interface layout describes how the objects are distributed on the display interface, as shown in fig. 2 to 5. If the position information of the operation identifier corresponds to a first object among the candidate objects, generating the control instruction based on the positional relationship between the focus object and the first object includes: if the position information of the operation identifier is within the area range of a candidate object, determining, based on the position of the focus object and the position of the first object in the interface layout, the direction and/or quantity information for moving from the position of the focus object to the position of the first object; and generating the control instruction based on the direction and/or quantity information.
In order to explain the above technical solutions in more detail, the following description is given by taking the first electronic device as a television and the second electronic device as a set-top box as an example.
After the set-top box is connected to the television, the content information of the set-top box appears on the display interface of the television and is processed. Suppose a layout of 12 objects in 3 rows and 4 columns is presented on the display interface, as shown in fig. 2 to 5, and the object at the upper left corner is the focus object. If the first object corresponding to the operation identifier is the last object on the right of the 2nd row, then the focus can be moved three object positions to the right and then one object position down to select the first object, as shown in fig. 5; alternatively, the focus can be moved down by one object position and then three object positions to the right, and other movement paths are also possible. The control instruction is then generated based on the chosen movement, that is, the moving direction and/or the number of objects moved.
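The direction and quantity information in this example can be computed directly from the grid layout. The sketch below is illustrative only and assumes a row-major, zero-based indexing of the recovered objects.

# Sketch: compute the directional moves needed to shift focus from one grid cell to another.

def moves_between(focus_index, target_index, columns):
    """Return a list of (direction, count) steps, e.g. [('RIGHT', 3), ('DOWN', 1)]."""
    focus_row, focus_col = divmod(focus_index, columns)
    target_row, target_col = divmod(target_index, columns)
    steps = []
    if target_col != focus_col:
        steps.append(("RIGHT" if target_col > focus_col else "LEFT", abs(target_col - focus_col)))
    if target_row != focus_row:
        steps.append(("DOWN" if target_row > focus_row else "UP", abs(target_row - focus_row)))
    return steps

# 3 x 4 layout, focus at the top-left corner (index 0), first object at the end of
# the 2nd row (index 7): three moves right, then one move down.
print(moves_between(0, 7, columns=4))   # [('RIGHT', 3), ('DOWN', 1)]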
Step 105: and sending the control instruction to the second electronic equipment, so that the second electronic equipment can respond to the control instruction and set the first object as a focus object.
The generated control instruction is sent by the first electronic device to the second electronic device. The first electronic device may convert the control instruction into a form that the second electronic device can readily recognize, which improves the processing efficiency of the second electronic device and avoids the situation where the second electronic device cannot recognize the instruction. Further, after the first object is set as the focus object, transmission of the control instruction stops.
For example, when the first electronic device is a television and the second electronic device is a set-top box, the television may convert the control instruction into up, down, left and right key values and send them to the set-top box as CEC (Consumer Electronics Control) commands, so that the set-top box can execute the converted commands quickly upon receipt. Generating the control instruction in this way also reduces the user's operations and avoids the cumbersome prior-art procedure for cross-device operation, in which the user must first operate the remote control device of the first device and then the remote control device of the second device: for example, when a television is connected to a set-top box, the user first uses the television's remote controller to select the set-top box as the television's video source, and then uses the set-top box's remote controller to control the set-top box, such as changing channels or selecting a video program. With the embodiments disclosed in the present application, the user can achieve cross-device control, that is, control of both the first electronic device and the second electronic device, using only the remote controller of the first electronic device.
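Continuing the sketch, the computed steps could be replayed to the set-top box as directional key presses. The transport function below is a hypothetical placeholder (the actual CEC transmission depends on the television platform); the numeric values are the standard CEC "User Control Pressed" UI command codes for these keys.

# Sketch: replay the computed moves as remote-control key presses over CEC, then confirm.
# send_cec_key() is a hypothetical stand-in for the platform's CEC transmit call.

CEC_KEY_CODES = {
    "SELECT": 0x00,
    "UP": 0x01,
    "DOWN": 0x02,
    "LEFT": 0x03,
    "RIGHT": 0x04,
}

def send_cec_key(code):
    # Placeholder: forward the key code to the set-top box over the CEC bus.
    print("CEC User Control Pressed: 0x%02x" % code)

def execute_moves(steps, press_select=True):
    """steps is the output of moves_between(), e.g. [('RIGHT', 3), ('DOWN', 1)]."""
    for direction, count in steps:
        for _ in range(count):
            send_cec_key(CEC_KEY_CODES[direction])
    if press_select:
        send_cec_key(CEC_KEY_CODES["SELECT"])   # confirm selection of the first object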
As shown in fig. 6, the present application also provides a control system including:
a second electronic device 1 for providing content information;
a first electronic device 2 for obtaining and outputting content information from the second electronic device 1; processing the content information, and determining a plurality of objects in the content information and position information of each object; wherein the objects include a focus object and at least one candidate object; obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information; if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object; sending the control instruction to the second electronic device 1, so that the second electronic device 1 can set the first object as a focus object in response to the control instruction.
The first electronic device 2 outputs the content of the second electronic device 1, and this output content is isolated from the content of the first electronic device 2 itself; that is, in this scenario the first electronic device 2 acts only as a display output device for the second electronic device 1 rather than as a peer device. In addition, the first electronic device 2 obtains the content information through a multimedia interface or a data interface, so no additional application program is needed: once the second electronic device 1 is connected to the first electronic device 2, the first electronic device 2 can display its content information.
And the first electronic device 2 may include an electronic device capable of displaying an interface, such as a television, a projector, or the like; the second electronic device 1 may comprise a set-top box, DVD player, mobile phone, PC or the like capable of providing content information.
In some embodiments of the present disclosure, the control system further comprises:
a third electronic device 3 for making the first electronic device 2 obtain the position information of the operation identifier.
The third electronic device 3 may be a remote control of the first electronic device 2, such as an air-mouse remote controller or the like.
In some embodiments of the present disclosure, wherein the content information is interface information, and the processing the content information includes: and carrying out image analysis on the interface information.
In some embodiments of the present disclosure, the performing image analysis on the interface information includes: and carrying out image analysis on the interface information by adopting a contour recognition algorithm or an edge detection algorithm.
In some embodiments of the present disclosure, determining the focal object comprises: and determining the focus object through a display mode and/or a display effect.
In some embodiments of the disclosure, the obtaining the location information of the operation identifier includes: obtaining a determination instruction, and obtaining the position information of the operation identifier based on the determination instruction; wherein the position information of the operation identifier can be changed according to the obtained input operation.
In some embodiments of the present disclosure, determining the position information of each object includes: determining the area range occupied by each object and determining the interface layout containing the plurality of objects; and, if the position information of the operation identifier corresponds to a first object among the candidate objects, generating the control instruction based on the positional relationship between the focus object and the first object includes: if the position information of the operation identifier is within the area range of a candidate object, determining, based on the position of the focus object and the position of the first object in the interface layout, the direction and/or quantity information for moving from the position of the focus object to the position of the first object; and generating the control instruction based on the direction and/or quantity information.
In some embodiments of the present disclosure, before obtaining the location information of the operation identifier, the method includes: obtaining position information of the operation identifier; determining to enter a state in which the control instruction can be transmitted to the second electronic device 1, based on at least the position information of the operation identifier and the position information of the focus object that satisfy the condition.
With this control system, the first electronic device 2 can obtain and output the content information of the second electronic device 1 and, after processing the content information, obtain the position information of the operation identifier. When the position information of the operation identifier corresponds to a first object among the candidate objects, a control instruction is generated based on the positional relationship between the focus object and the first object and is sent to the second electronic device 1, so that the second electronic device 1 can respond to the control instruction and set the first object as the focus object. The user can thus control the content information of the second electronic device 1 through the operation identifier, which improves the user experience.
As shown in fig. 7, the present application also provides an electronic device including:
a display unit 4 for obtaining and outputting content information from the second electronic device;
a processing unit 5, configured to process the content information, determine a plurality of objects in the content information, and position information of each object, where the objects include a focus object and at least one candidate object; obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information; if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object;
a sending unit 6, configured to send the control instruction to the second electronic device, so that the second electronic device can set the first object as a focus object in response to the control instruction.
The display unit 4 of the first electronic device outputs the content of the second electronic device, and this output content is isolated from the content of the first electronic device itself; that is, in this scenario the first electronic device acts only as a display output device for the second electronic device rather than as a peer device. In addition, the first electronic device obtains the content information through a multimedia interface or a data interface, so no additional application program is needed: once the second electronic device is connected to the first electronic device, the first electronic device can display its content information.
The electronic device may include a television, a projector, and other devices capable of displaying an interface; the second electronic device may comprise a set-top box, DVD player, mobile phone, PC, etc. capable of providing content information.
In some embodiments of the present disclosure, wherein the content information is interface information, and the processing unit 5 is configured to process the content information, including: and carrying out image analysis on the interface information.
In some embodiments of the present disclosure, the processing unit 5 is configured to perform image analysis on the interface information, and includes: and carrying out image analysis on the interface information by adopting a contour recognition algorithm or an edge detection algorithm.
In some embodiments of the present disclosure, the processing unit 5 is configured to determine the focus object, and includes: and determining the focus object through a display mode and/or a display effect.
In some embodiments of the present disclosure, the processing unit 5 is configured to obtain the location information of the operation identifier, including: obtaining a determination instruction, and obtaining the position information of the operation identifier based on the determination instruction; wherein the position information of the operation identifier can be changed according to the obtained input operation.
In some embodiments of the present disclosure, the processing unit 5 is configured to determine the position information of each object, including: determining the area range occupied by each object and determining the interface layout containing the plurality of objects; and, if the position information of the operation identifier corresponds to a first object among the candidate objects, generating the control instruction based on the positional relationship between the focus object and the first object includes: if the position information of the operation identifier is within the area range of a candidate object, determining, based on the position of the focus object and the position of the first object in the interface layout, the direction and/or quantity information for moving from the position of the focus object to the position of the first object; and generating the control instruction based on the direction and/or quantity information.
In some embodiments of the present disclosure, the processing unit 5, before obtaining the location information of the operation identifier, is further configured to: obtaining position information of the operation identifier; and determining to enter a state capable of sending the control instruction to the second electronic equipment at least based on the position information of the operation identifier and the position information of the focus object which meet the condition.
With this electronic device, the content information of a second electronic device can be obtained and output and, after the content information is processed, the position information of the operation identifier can be obtained. When the position information of the operation identifier corresponds to a first object among the candidate objects, a control instruction is generated based on the positional relationship between the focus object and the first object and is sent to the second electronic device, so that the second electronic device can respond to the control instruction and set the first object as the focus object. The user can thus control the content information of the second electronic device through the operation identifier, which improves the user experience.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing module of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing module of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above embodiments are only exemplary embodiments of the present application, and are not intended to limit the present application, and the protection scope of the present application is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present application and such modifications and equivalents should also be considered to be within the scope of the present application.

Claims (9)

1. A control method is applied to a first electronic device, and comprises the following steps:
obtaining and outputting content information from the second electronic device;
processing the content information, and determining a plurality of objects in the content information and position information of each object; wherein the objects include a focus object and at least one candidate object;
obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information;
if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object;
sending the control instruction to the second electronic device, so that the second electronic device can respond to the control instruction and set the first object as a focus object;
before obtaining the position information of the operation identifier, the method further comprises the following steps: when the operation identifier is determined to have moved into the area range corresponding to the focus object, determining that the operation identifier is activated; wherein,
before obtaining the position information of the operation identifier, the method comprises the following steps:
obtaining position information of the operation identifier;
and determining to enter a state capable of sending the control instruction to the second electronic equipment at least based on the position information of the operation identifier and the position information of the focus object which meet the condition.
2. The control method according to claim 1, wherein the content information is interface information, and the processing the content information includes:
and carrying out image analysis on the interface information.
3. The control method of claim 2, the image analyzing the interface information, comprising:
and carrying out image analysis on the interface information by adopting a contour recognition algorithm or an edge detection algorithm.
4. The control method of claim 2, determining the focal object, comprising:
and determining the focus object through a display mode and/or a display effect.
5. The control method according to claim 1, wherein the obtaining of the location information of the operation identifier includes:
obtaining a determination instruction, and obtaining the position information of the operation identifier based on the determination instruction;
wherein the position information of the operation identifier can be changed according to the obtained input operation.
6. The control method of claim 1, determining the location information of each object, comprising:
determining the region range included by each object and determining the interface layout including the plurality of objects;
if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object, including:
if the position information of the operation identifier is within the area range of the candidate object, determining, based on the position of the focus object and the position of the first object in the interface layout, the direction and/or quantity information for moving from the position of the focus object to the position of the first object;
and generating the control instruction based on the direction and/or quantity information.
7. A control system, comprising:
a second electronic device for providing content information;
a first electronic device for obtaining and outputting content information from a second electronic device; processing the content information, and determining a plurality of objects in the content information and position information of each object; wherein the objects include a focus object and at least one candidate object; obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information; if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object; sending the control instruction to the second electronic device, so that the second electronic device can respond to the control instruction and set the first object as a focus object;
before obtaining the position information of the operation identifier, the method further comprises the following steps: when the operation identifier is determined to move into the area range corresponding to the focus object, determining that the operation identifier is activated;
before obtaining the position information of the operation identifier, the method comprises the following steps:
obtaining position information of the operation identifier;
and determining to enter a state capable of sending the control instruction to the second electronic equipment at least based on the position information of the operation identifier and the position information of the focus object which meet the condition.
8. The control system of claim 7, further comprising:
and a third electronic device for causing the first electronic device to obtain the location information of the operation identifier.
9. An electronic device, comprising:
a display unit for obtaining and outputting content information from the second electronic device;
a processing unit, configured to process the content information, determine a plurality of objects in the content information, and position information of each object, where the objects include a focus object and at least one candidate object; obtaining position information of the operation identifier; wherein the operation identifier is not included in the content information; if the position information of the operation identifier corresponds to a first object in the candidate objects, generating a control instruction based on the position of the focus object and the position relation of the first object;
a transmitting unit configured to transmit the control instruction to the second electronic device so that the second electronic device can set the first object as a focus object in response to the control instruction;
before obtaining the position information of the operation identifier, the method further comprises the following steps: when the operation identifier is determined to move into the area range corresponding to the focus object, determining that the operation identifier is activated;
before obtaining the position information of the operation identifier, the method comprises the following steps:
obtaining position information of the operation identifier;
and determining to enter a state capable of sending the control instruction to the second electronic equipment at least based on the position information of the operation identifier and the position information of the focus object which meet the condition.
CN201810558894.4A 2018-06-01 2018-06-01 Control method, control system and electronic equipment Active CN108924614B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810558894.4A CN108924614B (en) 2018-06-01 2018-06-01 Control method, control system and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810558894.4A CN108924614B (en) 2018-06-01 2018-06-01 Control method, control system and electronic equipment

Publications (2)

Publication Number Publication Date
CN108924614A CN108924614A (en) 2018-11-30
CN108924614B true CN108924614B (en) 2021-07-16

Family

ID=64418122

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810558894.4A Active CN108924614B (en) 2018-06-01 2018-06-01 Control method, control system and electronic equipment

Country Status (1)

Country Link
CN (1) CN108924614B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114510186A (en) * 2020-10-28 2022-05-17 华为技术有限公司 Cross-device control method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103777788A (en) * 2012-10-22 2014-05-07 联想(北京)有限公司 Control method and electronic devices
CN103809856A (en) * 2014-02-24 2014-05-21 联想(北京)有限公司 Information processing method and first electronic device
CN105912220A (en) * 2015-12-15 2016-08-31 乐视致新电子科技(天津)有限公司 Method and device for displaying icon
CN107133291A (en) * 2017-04-24 2017-09-05 武汉斗鱼网络科技有限公司 A kind of method and apparatus of content of pages displaying

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9811737B2 (en) * 2013-07-03 2017-11-07 Ortiz And Associates Consulting, Llc Methods and systems enabling access by portable wireless handheld devices to data associated with programming rendering on flat panel displays
CN105338386B (en) * 2015-10-22 2019-03-26 深圳创想未来机器人有限公司 Video equipment control device and method based on image procossing and speech processes

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103777788A (en) * 2012-10-22 2014-05-07 联想(北京)有限公司 Control method and electronic devices
CN103809856A (en) * 2014-02-24 2014-05-21 联想(北京)有限公司 Information processing method and first electronic device
CN105912220A (en) * 2015-12-15 2016-08-31 乐视致新电子科技(天津)有限公司 Method and device for displaying icon
CN107133291A (en) * 2017-04-24 2017-09-05 武汉斗鱼网络科技有限公司 A kind of method and apparatus of content of pages displaying

Also Published As

Publication number Publication date
CN108924614A (en) 2018-11-30

Similar Documents

Publication Publication Date Title
RU2557457C2 (en) Control function gestures
US20200007944A1 (en) Method and apparatus for displaying interactive attributes during multimedia playback
US20160232871A1 (en) Display method and display device
US9961394B2 (en) Display apparatus, controlling method thereof, and display system
CN104378688A (en) Mode switching method and device
WO2020006004A1 (en) Video subtitle display method and apparatus
TWI779207B (en) Modifying playback of replacement content based on control messages
KR20180070297A (en) Display apparatus and control method thereof
KR20160104493A (en) roadcasting receiving apparatus and control method thereof
CN104822078A (en) Shielding method and apparatus for video caption
CN112913331B (en) Determining light effects based on video and audio information according to video and audio weights
TWI728387B (en) Modifying playback of replacement content responsive to detection of remote control signals that control a device providing video to the playback device
US10609305B2 (en) Electronic apparatus and operating method thereof
KR102208893B1 (en) Display apparatus and channel map manage method thereof
CN108924614B (en) Control method, control system and electronic equipment
JP5279482B2 (en) Image processing apparatus, method, and program
US20180173399A1 (en) Display device for adjusting transparency of indicated object and display method for the same
CN110958473A (en) Remote control method, television and storage medium in low-brightness environment
US10613622B2 (en) Method and device for controlling virtual reality helmets
KR102414783B1 (en) Electronic apparatus and controlling method thereof
CN112367487B (en) Video recording method and electronic equipment
EP3319329A1 (en) Electronic device and electronic device operation method
US9621837B1 (en) Methods and devices for switching between different TV program accompanying sounds
US9813755B2 (en) Display apparatus and controlling methods thereof
CN106970759A (en) Method, device and terminal that a kind of VR contents quickly start

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant