CN112527170A - Equipment visualization control method and device and computer readable storage medium - Google Patents

Equipment visualization control method and device and computer readable storage medium Download PDF

Info

Publication number
CN112527170A
CN112527170A (application CN202011464002.8A)
Authority
CN
China
Prior art keywords: equipment, scene, instruction, control, page
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011464002.8A
Other languages
Chinese (zh)
Inventor
喻建成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chengda Weiye Electronics Co ltd
Original Assignee
Shenzhen Chengda Weiye Electronics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chengda Weiye Electronics Co ltd filed Critical Shenzhen Chengda Weiye Electronics Co ltd
Priority to CN202011464002.8A
Publication of CN112527170A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2642Domotique, domestic, home control, automation, smart house
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention discloses an equipment visualization control method and device and a computer-readable storage medium. The method is applied to a user terminal and comprises the following steps: acquiring a scene picture, and acquiring an equipment page corresponding to the scene picture, wherein the scene picture is a picture of the scene where the equipment is located, captured by a camera, and the equipment page is a control page corresponding to the equipment; displaying the scene picture and the equipment page in a split-screen mode on the user terminal, so that a user can trigger a control instruction through the equipment page while watching the scene picture; when the control instruction is detected, determining the equipment to be controlled according to the control instruction; and sending the control instruction to the equipment to be controlled, so that the equipment to be controlled executes the control instruction. The invention realizes visual remote control of equipment and improves the accuracy of remote equipment control.

Description

Equipment visualization control method and device and computer readable storage medium
Technical Field
The invention relates to the technical field of Internet of things, in particular to a visual control method and device for equipment and a computer readable storage medium.
Background
With the rapid development of science and technology, smart devices based on Internet of Things technology are increasingly widely used. For example, smart homes are becoming more popular as Internet of Things technology develops. Compared with an ordinary home, a smart home offers comprehensive information interaction, so a user can control it remotely through an APP (Application).
However, when remotely controlling a smart device, the user cannot see the current state of the device or its surrounding environment; such blind operation may damage the device, and the user cannot control it accurately. For example, when a user remotely turns off a television, the television may fail to turn off because of network delay or similar reasons, while the user assumes it has been turned off successfully. The user therefore cannot control the smart device accurately, which greatly degrades the user experience. How to improve the accuracy of remote equipment control is thus a problem that urgently needs to be solved.
Disclosure of Invention
The invention mainly aims to provide a visual control method and device for equipment and a computer readable storage medium, aiming at realizing visual remote control of the equipment so as to improve the accuracy of the remote control of the equipment.
In order to achieve the above object, the present invention provides a device visualization control method, which is applied to a user terminal, and the device visualization control method includes the following steps:
acquiring a scene picture, and acquiring an equipment page corresponding to the scene picture, wherein the scene picture is a picture acquired by a camera for a scene where equipment is located, and the equipment page is a control page corresponding to the equipment;
the scene picture and the equipment page are displayed in a split screen mode on the user terminal, so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page;
when the control instruction is detected, determining equipment to be controlled according to the control instruction;
and sending the control instruction to the equipment to be controlled so as to enable the equipment to be controlled to execute the control instruction.
Optionally, the step of acquiring the scene picture includes:
when the user terminal displays a scene selection page, if a scene selection triggering instruction is detected, determining a scene to be viewed based on the scene selection instruction, wherein the scene selection page comprises a plurality of scenes, so that a user triggers the scene selection instruction based on the scene selection page;
determining a camera for collecting the scene to be checked, and sending a camera shooting instruction to the camera so that the camera collects a scene picture corresponding to the scene to be checked, and sending the scene picture to the user terminal;
and receiving the scene picture sent by the camera.
Optionally, the step of acquiring the device page corresponding to the scene picture includes:
determining a controllable device belonging to the scene picture;
acquiring equipment information of the controllable equipment, wherein the equipment information comprises an equipment name, an equipment state and an equipment icon option;
and constructing an equipment page corresponding to the scene picture according to the equipment information.
Optionally, the device page includes a device menu option, and when the control instruction is detected, before the step of determining the device to be controlled according to the control instruction, the method further includes:
when a menu access instruction triggered by the equipment menu option is detected, equipment to be controlled corresponding to the equipment menu option is determined;
and replacing the device page with a control menu page corresponding to the device to be controlled, wherein the control menu page can trigger a plurality of control instructions, so that a user can trigger one or more of the control instructions based on the scene picture and the control menu page.
Optionally, the device page includes a device icon option, where the device icon option is used for a user to trigger a device switch instruction, and when the control instruction is detected, before the step of determining the device to be controlled according to the control instruction, the method further includes:
when the device switch instruction triggered by the device icon option is detected, determining the device to be controlled corresponding to the device icon option, and detecting whether the device to be controlled is started;
if the equipment to be controlled is started, triggering a closing control instruction to close the equipment to be controlled;
and if the equipment to be controlled is closed, triggering a starting control instruction to start the equipment to be controlled.
Optionally, the step of sending the control instruction to the device to be controlled includes:
acquiring a secret key for encrypting the control instruction;
encrypting the control command through the secret key to generate a command ciphertext;
and sending the instruction ciphertext to a control server, and forwarding the instruction ciphertext to the equipment to be controlled by the control server so that the equipment to be controlled decrypts the instruction ciphertext to obtain the control instruction.
Optionally, the apparatus visualization control method further includes:
when the scene picture is displayed, if a scene change instruction is detected, determining a changed scene to be checked according to the scene change instruction;
determining a camera for collecting the scene to be checked, and sending a camera shooting instruction to the camera so that the camera collects a current scene picture corresponding to the scene to be checked, and sending the current scene picture to the user terminal;
and receiving the current scene picture sent by the camera, and replacing the scene picture with the current scene picture.
Optionally, the apparatus visualization control method further includes:
detecting whether the scene picture completely comprises the picture of the device to be controlled;
if the scene picture is detected not to completely include the picture of the equipment to be controlled, determining a camera rotation instruction, and determining a camera corresponding to the equipment to be controlled;
and sending the camera rotation instruction to the camera so that the camera executes the camera rotation instruction.
In addition, to achieve the above object, the present invention further provides an apparatus visualization control apparatus disposed at a user terminal, the apparatus visualization control apparatus including:
the picture acquisition module is used for acquiring a scene picture and acquiring a device page corresponding to the scene picture, wherein the scene picture is a picture of the scene where the device is located, captured by a camera, and the device page is a control page corresponding to the device;
the picture display module is used for displaying the scene picture and the equipment page on the user terminal in a split screen mode so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page;
the device determining module is used for determining a device to be controlled according to the control instruction when the control instruction is detected;
and the instruction sending module is used for sending the control instruction to the equipment to be controlled so as to enable the equipment to be controlled to execute the control instruction.
In addition, to achieve the above object, the present invention also provides an electronic device, including: a memory, a processor, and a device visualization control program stored in the memory and executable on the processor, wherein the device visualization control program, when executed by the processor, implements the steps of the device visualization control method described above.
Further, to achieve the above object, the present invention also provides a computer readable storage medium having stored thereon a device visualization control program which, when executed by a processor, implements the steps of the device visualization control method as described above.
The invention provides a device visualization control method, a device, electronic equipment and a computer readable storage medium, wherein the device visualization control method is applied to a user terminal, and the device visualization control method acquires a scene picture and an equipment page corresponding to the scene picture, wherein the scene picture is a picture acquired by a camera for a scene where the equipment is located, and the equipment page is a control page corresponding to the equipment; the method comprises the steps that a scene picture and an equipment page are displayed in a split screen mode at a user terminal, so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page; when a control instruction is detected, determining equipment to be controlled according to the control instruction; and sending the control instruction to the equipment to be controlled so as to enable the equipment to be controlled to execute the control instruction. According to the invention, the scene picture and the equipment page are displayed in a split screen manner, so that a user can remotely control the equipment while remotely watching the equipment state and the ambient environment condition in the scene, the equipment damage caused by blind control of the equipment is prevented, and the accuracy of remote control of the equipment is improved.
Drawings
Fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of a visual control method of the apparatus according to the present invention;
FIG. 3 is a schematic diagram of a first page according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a second page according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a third page according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a fourth page according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a fifth page according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating a sixth page according to an embodiment of the present invention;
FIG. 9 is a diagram illustrating a seventh page according to an embodiment of the present invention;
fig. 10 is a functional block diagram of a visual control apparatus according to a first embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The main idea of the embodiments of the invention is as follows: when a user remotely controls a smart device, the device may fail to execute the control instruction triggered by the user because of network delay or similar reasons; in other cases the device should only be remotely controlled after observing the surrounding environment, for example the running condition of a machine tool in a factory must be observed before the corresponding control operation is performed, the state of a patient must be observed before the corresponding control operation is performed, and the state of the person being cared for must be observed when vulnerable people such as the elderly or children are at home. Therefore, the embodiments acquire a scene picture and a device page corresponding to the scene picture, wherein the scene picture is a picture of the scene where the device is located, captured by a camera, and the device page is a control page corresponding to the device; display the scene picture and the device page in a split-screen mode on the user terminal, so that the user can trigger a control instruction through the device page while watching the scene picture; when a control instruction is detected, determine the device to be controlled according to the control instruction; and send the control instruction to the device to be controlled, so that the device to be controlled executes it. This realizes visual remote control of the device and improves the accuracy of remote control.
Referring to fig. 1, fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The terminal in the embodiment of the present invention is a device visualization control apparatus, which may be a user terminal with interactive functions, such as a smart phone, a personal computer (PC), a microcomputer, a notebook computer, or a server.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU (Central Processing Unit), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to realize connection and communication among these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and a device visualization control program.
In the terminal shown in fig. 1, the processor 1001 may be configured to call a device visualization control program stored in the memory 1005, and perform the following operations:
acquiring a scene picture, and acquiring an equipment page corresponding to the scene picture, wherein the scene picture is a picture acquired by a camera for a scene where equipment is located, and the equipment page is a control page corresponding to the equipment;
the scene picture and the equipment page are displayed in a split screen mode on the user terminal, so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page;
when the control instruction is detected, determining equipment to be controlled according to the control instruction;
and sending the control instruction to the equipment to be controlled so as to enable the equipment to be controlled to execute the control instruction.
Further, the processor 1001 may be configured to call the device visualization control program stored in the memory 1005, and further perform the following operations:
when the user terminal displays a scene selection page, if a scene selection triggering instruction is detected, determining a scene to be viewed based on the scene selection instruction, wherein the scene selection page comprises a plurality of scenes, so that a user triggers the scene selection instruction based on the scene selection page;
determining a camera for collecting the scene to be checked, and sending a camera shooting instruction to the camera so that the camera collects a scene picture corresponding to the scene to be checked, and sending the scene picture to the user terminal;
and receiving the scene picture sent by the camera.
Further, the processor 1001 may be configured to call the device visualization control program stored in the memory 1005, and further perform the following operations:
determining a controllable device belonging to the scene picture;
acquiring equipment information of the controllable equipment, wherein the equipment information comprises an equipment name, an equipment state and an equipment icon option;
and constructing an equipment page corresponding to the scene picture according to the equipment information.
Further, the processor 1001 may be configured to call the device visualization control program stored in the memory 1005, and further perform the following operations:
when a menu access instruction triggered by the equipment menu option is detected, equipment to be controlled corresponding to the equipment menu option is determined;
and replacing the device page with a control menu page corresponding to the device to be controlled, wherein the control menu page can trigger a plurality of control instructions, so that a user can trigger one or more of the control instructions based on the scene picture and the control menu page.
Further, the processor 1001 may be configured to call the device visualization control program stored in the memory 1005, and further perform the following operations:
when the device switch instruction triggered by the device icon option is detected, determining the device to be controlled corresponding to the device icon option, and detecting whether the device to be controlled is started;
if the equipment to be controlled is started, triggering a closing control instruction to close the equipment to be controlled;
and if the equipment to be controlled is closed, triggering a starting control instruction to start the equipment to be controlled.
Further, the processor 1001 may be configured to call the device visualization control program stored in the memory 1005, and further perform the following operations:
acquiring a secret key for encrypting the control instruction;
encrypting the control command through the secret key to generate a command ciphertext;
and sending the instruction ciphertext to a control server, and forwarding the instruction ciphertext to the equipment to be controlled by the control server so that the equipment to be controlled decrypts the instruction ciphertext to obtain the control instruction.
Further, the processor 1001 may be configured to call the device visualization control program stored in the memory 1005, and further perform the following operations:
when the scene picture is displayed, if a scene change instruction is detected, determining a changed scene to be checked according to the scene change instruction;
determining a camera for collecting the scene to be checked, and sending a camera shooting instruction to the camera so that the camera collects a current scene picture corresponding to the scene to be checked, and sending the current scene picture to the user terminal;
and receiving the current scene picture sent by the camera, and replacing the scene picture with the current scene picture.
Further, the processor 1001 may be configured to call the device visualization control program stored in the memory 1005, and further perform the following operations:
detecting whether the scene picture completely comprises the picture of the device to be controlled;
if the scene picture is detected not to completely include the picture of the equipment to be controlled, determining a camera rotation instruction, and determining a camera corresponding to the equipment to be controlled;
and sending the camera rotation instruction to the camera so that the camera executes the camera rotation instruction.
Based on the hardware structure, various embodiments of the device visualization control method are provided.
The invention provides a visual control method for equipment.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the device visualization control method of the present invention.
In this embodiment, the device visualization control method is applied to a user terminal, and the device visualization control method includes:
step S10, acquiring a scene picture, and acquiring an equipment page corresponding to the scene picture, wherein the scene picture is a picture acquired by a camera for a scene where the equipment is located, and the equipment page is a control page corresponding to the equipment;
in this embodiment, an execution subject of the device visualization control method is a user terminal, and the user terminal may be a terminal device with an interactive function, such as a smart phone, a PC, a microcomputer, a notebook computer, and a server. The user terminal is illustrated by taking a smart phone as an example, and the smart phone comprises software and hardware for operating a server and the like. The smart phone is communicated with the smart device through the internet, and the smart device can be a device with an internet communication function, such as a smart home, a camera, a factory smart device, a market and a hospital using device, and is not limited here.
First, to realize visual remote control of the device, a scene picture of the smart device and the corresponding device page need to be acquired. Therefore, when the user enters the home page of the visual remote control through the scene selection page or in another way, the scene picture is acquired and the device page corresponding to the scene picture is acquired, wherein the scene picture is a picture of the scene where the device is located, captured by a camera, and the device page is a control page corresponding to the device.
It should be noted that the scene picture includes a picture of the device to be controlled and a picture of the environment of the scene where it is located, and is used by the user to check information such as the current state of the device to be controlled and its surrounding environment, for example to check whether the device is turned on, where it is oriented, and what state it is in. The user can rotate the camera by sliding the scene picture: specifically, when a camera rotation instruction triggered on the scene picture is detected, the camera corresponding to the current scene is determined, and the camera rotation instruction is then sent to that camera so that the camera executes it. The camera rotation instructions include rotate-up, rotate-down, rotate-left, and rotate-right instructions. Of course, other camera control instructions can also be triggered on the scene picture; for example, the user can spread two fingers outward to enlarge the scene picture and view the device to be controlled more clearly, and can correspondingly shrink the scene picture, which is not limited here.
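As a rough illustration of how a swipe on the scene picture could be mapped to one of the four camera rotation instructions above, here is a minimal Python sketch; the function name, the pixel threshold, and the swipe-to-rotation direction mapping are assumptions for illustration, not part of the patent:

```python
def rotation_instruction(dx: float, dy: float, threshold: float = 30.0):
    """Map a swipe on the scene picture (dx, dy in pixels) to a camera
    rotation instruction; returns None when the gesture is too small."""
    if abs(dx) < threshold and abs(dy) < threshold:
        return None
    if abs(dx) >= abs(dy):
        # Horizontal swipe pans the camera; the direction mapping is a design choice.
        return {"command": "rotate_left"} if dx > 0 else {"command": "rotate_right"}
    # Vertical swipe tilts the camera.
    return {"command": "rotate_up"} if dy > 0 else {"command": "rotate_down"}
```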
In addition, it should be noted that the device page is the page corresponding to the scene picture, so the user can click an option or icon on the device page to control the device to be controlled. Different scene pictures correspond to different device pages, that is, different scenes contain different numbers and types of smart devices. Taking a smart home as an example, a study scene may contain devices such as a lamp, a socket, a desk lamp, and a television, so icons or options corresponding to these devices are displayed on the device page. If there are multiple devices of the same type, they can be numbered, such as socket No. 1, socket No. 2, and so on. For understanding, reference may be made to fig. 3, which is a schematic diagram of a first page according to an embodiment of the present invention.
In addition, it should be noted that scene pictures are captured by cameras, and each scene contains one or more cameras. A camera is usually located above or in front of the smart device to ensure that it can capture the device clearly and completely.
In an embodiment, the scene picture includes a scene name display area. Specifically, referring to fig. 4, which is a schematic diagram of a second page according to an embodiment of the present invention, the main page 1 is the main page of the visual remote control and includes a scene picture 2 and a device page 3; the scene picture 2 includes a scene name display area 4, whose middle position generally shows the current scene name, namely name 1, in highlighted form, while names 2 and 3 on either side of name 1 show the names of other scenes. When the user slides the scene picture or clicks a name in the scene name display area, a scene change instruction can be triggered, so that the scene picture dynamically changes to the picture of another scene; it can be understood that the corresponding device page changes as well. Specifically, when a scene picture is displayed, if a scene change instruction is detected, the changed scene to be viewed is determined according to the scene change instruction; the camera that captures the scene to be viewed is determined, and a camera shooting instruction is sent to it so that it captures the current scene picture corresponding to the scene to be viewed and sends the current scene picture to the user terminal; finally, the current scene picture sent by the camera is received and replaces the previous scene picture.
In an embodiment, the scene picture can be obtained through a scene selection page in the user terminal APP. Specifically, when the user terminal displays the scene selection page, if a scene selection instruction is detected, the scene to be viewed is determined based on the scene selection instruction, where the scene selection page contains multiple scenes so that the user can trigger a scene selection instruction based on it; the camera that captures the scene to be viewed is determined, and a camera shooting instruction is sent to it so that it captures the scene picture corresponding to the scene to be viewed and sends the scene picture to the user terminal; finally, the scene picture sent by the camera is received. For ease of understanding, referring to fig. 5, which is a schematic diagram of a third page according to an embodiment of the present invention, the main page 1 is an application page of the user terminal and includes a scene selection page 5, which contains scene 1, scene 2, and scene 3. It can be understood that the connected smart devices may be distributed over a plurality of scenes, so a specific visual remote control main page, namely the page of fig. 4, can be accessed precisely through the scene selection page. In other embodiments, the scene picture may also be obtained through a device selection page in the user terminal APP, that is, selecting a device enters the scene picture corresponding to that device; the specific execution flow is substantially the same as for the scene selection page and is not repeated here.
In an embodiment, the device page can be obtained by determining the controllable devices belonging to the scene and then acquiring the device information of those controllable devices, where the device information includes the device name, device state, device icon option, and the like; finally, the device page corresponding to the scene picture is constructed from the device information. For ease of understanding, reference may be made to fig. 6, which is a schematic diagram of a fourth page according to an embodiment of the present invention, where the main page 1 is the main page of the visual remote control and includes a scene picture 2 and a device page 3; icon 1, icon 2, and icon 3 represent different devices, and each icon can serve as a device icon option used to enter a control menu page or to perform control operations such as turning the device on and off. The device name and device state are displayed below each icon for the user to view the relevant information of the device. In other embodiments, the device page may also include other device information or corresponding control operations, which are not limited here.
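A minimal sketch of the device information and device page construction described above, assuming a simple in-memory registry keyed by scene; all class, field, and function names are illustrative, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DeviceInfo:
    device_id: str   # unique identifier of the smart device
    name: str        # e.g. "Desk lamp" or "Socket No. 1"
    state: str       # e.g. "on" / "off" or "online" / "offline"
    icon: str        # resource name used as the device icon option

@dataclass
class DevicePage:
    scene_id: str
    devices: List[DeviceInfo] = field(default_factory=list)

def build_device_page(scene_id: str, registry: Dict[str, List[DeviceInfo]]) -> DevicePage:
    # Look up the controllable devices belonging to the scene and build the page model.
    return DevicePage(scene_id=scene_id, devices=registry.get(scene_id, []))
```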
Step S20, the scene picture and the equipment page are displayed in the user terminal in a split screen mode, so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page;
after the scene picture and the equipment page thereof are obtained, the scene picture and the equipment page are displayed in a split screen mode at the user terminal, so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page. The split-screen display area is a main page of an application page of the user terminal, and the main page is a user interface on the APP and used for a user to check the behavior state and the surrounding environment condition of the intelligent terminal based on the application interface and control the intelligent terminal.
It should be noted that the scene picture and the device page may be split vertically (top and bottom) or horizontally (left and right). The proportion of the main page occupied by the scene picture and by the device page can be set as needed, such as 5:5 or 7:3. Of course, the layout can also be set by the user; specifically, it can be rearranged by dragging the boundary between the scene picture and the device page. For an illustration of the split screen, see fig. 4, where the scene picture 2 is at the top and the device page 3 is at the bottom.
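One possible way to represent this adjustable split-screen layout is a small configuration object; the following is a sketch under assumed names and an assumed clamping range, not the patent's prescribed implementation:

```python
from dataclasses import dataclass

@dataclass
class SplitLayout:
    orientation: str = "vertical"  # "vertical": scene picture above the device page
    scene_ratio: float = 0.5       # fraction of the main page given to the scene picture

    def drag_boundary(self, new_ratio: float) -> None:
        # Called when the user drags the boundary between the two areas;
        # the 0.2..0.8 clamping range is an assumed design choice.
        self.scene_ratio = min(max(new_ratio, 0.2), 0.8)
```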
In addition, it should be noted that different smart devices have different control instruction sets; a control instruction set includes a number of control instructions that instruct the device to perform various operations, such as turn-on, turn-off, rotation, and attribute-adjustment control instructions. The control instruction set is bound to the device page, so the user can trigger the corresponding control instruction based on the device page.
And step S30, when the control instruction is detected, determining the equipment to be controlled according to the control instruction.
In this embodiment, when a control instruction is detected, the device to be controlled is determined according to the control instruction. When the control instruction is triggered, it carries the device identifier of the device to be controlled; this identifier determines the specific object to which the control instruction is sent. It can be understood that the control instruction is triggered by a device icon or option on the device page, so when it is triggered the corresponding device identifier can be embedded in it.
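A control instruction carrying the embedded device identifier might look like the following sketch; the field names and example commands are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlInstruction:
    device_id: str                  # identifier of the device to be controlled
    command: str                    # e.g. "turn_on", "turn_off", "set_brightness"
    params: Optional[dict] = None   # optional arguments, e.g. {"brightness": 60}

def device_to_control(instruction: ControlInstruction) -> str:
    # The device to be controlled is determined directly from the embedded identifier.
    return instruction.device_id
```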
In an embodiment, the device page includes a device menu option, which may be a device icon option on the device page; the user can trigger the corresponding menu access instruction by clicking or long-pressing it. Each menu access instruction corresponds to one device, so the control menu page that is entered is the control page of that specific device. The control menu page is typically overlaid on top of the device page and may include a number of control operations. For ease of understanding, referring to fig. 7, which is a schematic diagram of a fifth page according to an embodiment of the present invention, the main page 1 is the main page of the visual remote control and includes a scene picture 2 and a control menu page 6; the control menu page 6 includes a device image area, which displays an image of the device, and several control operations, where control operations 1, 2, 3, and 4 are different operations, for example an opening operation, a closing operation, an attribute-increasing operation, and an attribute-decreasing operation, corresponding to different control instructions. In other embodiments, the control menu page may include more or less content, which is not limited here.
In an embodiment, the device page includes a device icon option, which is used for the user to trigger a device switch instruction by clicking or long-pressing. Specifically, when a device switch instruction triggered by the device icon option is detected, the device to be controlled corresponding to the device icon option is determined and whether it is turned on is detected; if the device to be controlled is on, a turn-off control instruction is triggered to turn it off; if the device to be controlled is off, a turn-on control instruction is triggered to turn it on. In other embodiments, the device switch instruction may be triggered by other operations, which are not limited here.
Further, before the step S30, the method further includes:
step A, when a menu access instruction triggered by the equipment menu option is detected, equipment to be controlled corresponding to the equipment menu option is determined;
and B, replacing the equipment page with a control menu page corresponding to the equipment to be controlled, wherein the control menu page can trigger a plurality of control instructions so that a user can trigger one or more of the control instructions based on the scene picture and the control menu page.
In this embodiment, when a menu access instruction triggered by a device menu option is detected, a device to be controlled corresponding to the device menu option is determined, and then, a device page is replaced with a control menu page corresponding to the device to be controlled, where the control menu page may trigger a plurality of control instructions, so that a user triggers one or more of the plurality of control instructions based on a scene picture and the control menu page. The menu access instruction is used for entering a control menu page, and the control menu page is a page of all control operations of the equipment to be controlled. Specifically, the user may trigger a menu access instruction by long-pressing a device icon in the device page. Of course, the menu access instruction may also be triggered by clicking on a device icon in the device page.
It should be noted that each device to be controlled includes a control instruction set, and the control instruction set includes a plurality of control instructions for instructing the intelligent terminal to perform various operations. And the control instruction sets corresponding to different intelligent terminals are different. In addition, the control instruction set is bound with the control menu page, so that the user can trigger the corresponding control instruction based on the control menu page.
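The binding between a control instruction set and a device's control menu page could be sketched as follows; the device types, operation names, and dictionary structure are illustrative assumptions, not taken from the patent:

```python
# Hypothetical control instruction sets, one per device type; each entry becomes
# one control operation shown on the control menu page.
CONTROL_INSTRUCTION_SETS = {
    "desk_lamp": ["turn_on", "turn_off", "increase_brightness", "decrease_brightness"],
    "television": ["turn_on", "turn_off", "volume_up", "volume_down"],
}

def open_control_menu(device_id: str, device_type: str) -> dict:
    # Replace the device page with a control menu page listing the operations
    # supported by the device to be controlled.
    return {"device_id": device_id,
            "operations": CONTROL_INSTRUCTION_SETS.get(device_type, [])}
```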
Further, before the step S30, the method further includes:
step C, when the device switch instruction triggered by the device icon option is detected, determining the device to be controlled corresponding to the device icon option, and detecting whether the device to be controlled is started;
d, if the equipment to be controlled is started, triggering a closing control instruction to close the equipment to be controlled;
and E, if the equipment to be controlled is closed, triggering a starting control instruction to start the equipment to be controlled.
In this embodiment, when a device switch instruction triggered by a device icon is detected, the device to be controlled corresponding to that icon is determined, and whether it is turned on is detected; if the device to be controlled is on, a turn-off control instruction is triggered, and if it is off, a turn-on control instruction is triggered. The device page includes a device icon option, which is used for the user to trigger the device switch instruction.
Wherein the device switch command is used to switch the activation state of the device, i.e. to switch between on and off. Specifically, the user may trigger the device switch command by clicking a device icon on the device page.
It should be noted that when the device to be controlled is on, clicking its device icon triggers a turn-off control instruction, which is then sent to the device to turn it off; when the device to be controlled is off, clicking its device icon triggers a turn-on control instruction, which is then sent to the device to turn it on. Of course, the device switch instruction may also be triggered by long-pressing the device icon or through other icons or options, which is not limited here.
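The switch toggling logic above can be summarised as a small handler; this is a sketch assuming a cached device-state map and a `send` callback, both hypothetical:

```python
def on_device_icon_tapped(device_id: str, device_states: dict, send) -> None:
    """Toggle the device: if it is currently on, trigger the turn-off control
    instruction; otherwise trigger the turn-on control instruction."""
    if device_states.get(device_id) == "on":
        send({"device_id": device_id, "command": "turn_off"})
    else:
        send({"device_id": device_id, "command": "turn_on"})
```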
Step S40, sending the control instruction to the device to be controlled, so that the device to be controlled executes the control instruction.
In this embodiment, the control instruction is sent to the device to be controlled so that the device to be controlled executes it. The instruction may be sent directly or forwarded through the control server. Of course, the control instruction can also be encrypted before sending, so that its source can be verified and the security of remote control of the smart device is ensured.
For ease of understanding, reference may be made to fig. 3, where the scene picture is at the top and the device page is at the bottom. When the user selects the study scene, the user can see the states of the study's lamp, socket No. 1, desk lamp, and infrared device; for example, if the study lamp is currently on and the scene picture shows that nobody is in the room, the user can turn the lamp off by clicking the study lamp icon. Of course, the user can also long-press the study lamp icon to enter its control menu page and perform more control operations, such as adjusting the brightness. In addition, the user can remotely check the activity of people in the scene through the scene picture; for example, when there are children, elderly people, or other vulnerable family members who need care at home, the user can look after them remotely by viewing the scene picture and controlling the smart devices. In some dangerous work scenes, the user can remotely watch the workflow and running state of the equipment to realize remote visual control, so that an engineer does not need to enter the dangerous scene in person.
The embodiment of the invention provides an equipment visualization control method, which is applied to a user terminal, and the equipment visualization control method comprises the steps of obtaining a scene picture and obtaining an equipment page corresponding to the scene picture, wherein the scene picture is a picture acquired by a camera for a scene where equipment is located, and the equipment page is a control page corresponding to the equipment; the method comprises the steps that a scene picture and an equipment page are displayed in a split screen mode at a user terminal, so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page; when a control instruction is detected, determining equipment to be controlled according to the control instruction; and sending the control instruction to the equipment to be controlled so as to enable the equipment to be controlled to execute the control instruction. According to the embodiment of the invention, the scene picture and the equipment page are displayed in a split screen mode, so that a user can remotely control the equipment while remotely watching the equipment state and the ambient environment condition in the scene, the equipment damage caused by blind control of the equipment is prevented, and the accuracy of remote control of the equipment is improved.
Further, based on the first embodiment described above, a second embodiment of the apparatus visualization control method of the present invention is proposed.
In this embodiment, in step S10, the acquiring a scene picture includes:
a11, when the user terminal displays a scene selection page, if a trigger scene selection instruction is detected, determining a scene to be viewed based on the scene selection instruction, wherein the scene selection page comprises a plurality of scenes, so that a user triggers the scene selection instruction based on the scene selection page;
a12, determining a camera for acquiring the scene to be viewed, and sending a camera shooting instruction to the camera so that the camera acquires a scene picture corresponding to the scene to be viewed, and sending the scene picture to the user terminal;
and a13, receiving the scene picture sent by the camera.
In this embodiment, when the scene selection page is displayed, whether a scene selection instruction has been triggered is detected; if a scene selection instruction is detected, the scene to be viewed is determined based on it. Then the camera that shoots the scene to be viewed is determined, and a camera shooting instruction is sent to it so that it captures the scene picture corresponding to the scene to be viewed and sends the scene picture to the user terminal; finally, the scene picture sent by the camera is received. The scene selection page contains multiple scenes, so that the user can trigger a scene selection instruction based on it.
It should be noted that the scene selection page includes a plurality of scenes, such as a staircase, a study, a garage, a balcony, a master bedroom, and an entertainment room. Referring to fig. 8, which is a schematic diagram of a sixth page according to an embodiment of the present invention, the scene option at the bottom is used to enter the scene selection page; specifically, after the user clicks the scene option, the upper part of the page changes to the scene selection page. The scene selection page may refer to fig. 5, where the main page 1 is an application page of the user terminal and includes a scene selection page 5 containing scene 1, scene 2, and scene 3. It can be understood that the connected smart devices may be distributed over a plurality of scenes, so a specific visual remote control main page, namely the page of fig. 4, can be accessed precisely through the scene selection page.
In addition, it should be noted that a device selection page can also be entered through the device option at the bottom of fig. 8, and the device selection page may be presented as a device list. The device list is used to manage all the smart devices so that they can be correctly accessed by the application (APP) on the smart phone. The device identifier is the unique identifier of a smart device and can be used to determine a unique device. It can be understood that managing the smart devices through the device list means that only one APP is needed to control multiple smart devices.
Each smart device is covered by a camera that can shoot it; in other words, each scene contains at least one camera. The camera is usually located above or in front of the smart device to ensure that it can capture the device clearly and completely. It can be understood that different smart devices may correspond to different cameras; in a specific embodiment, determining the camera that shoots the device to be controlled amounts to determining the camera identifier associated with that device, so that the shooting instruction can be sent accurately to the corresponding camera.
In addition, it should be further described that the camera shooting instruction is used to instruct the camera to start shooting the device to be controlled, so that the camera returns the shot device picture to the APP.
In this embodiment, the shooting instruction directs the camera corresponding to the device to be controlled to start shooting, so that the device picture can subsequently be displayed on the application page of the APP and the user can perform control operations based on the picture of the device to be controlled, further ensuring the accuracy and safety of visual control of the device.
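A minimal sketch of the scene-to-camera resolution and shooting instruction described in this embodiment; the mapping, message format, and transport callback are assumptions for illustration:

```python
def request_scene_picture(scene_id: str,
                          scene_to_camera: dict,
                          send_to_camera) -> None:
    """Resolve the camera identifier bound to the selected scene and instruct it
    to start capturing, returning the scene picture to the user terminal."""
    camera_id = scene_to_camera[scene_id]
    send_to_camera(camera_id, {"command": "start_capture",
                               "reply_to": "user_terminal"})

# Example usage (hypothetical): request_scene_picture("study", {"study": "cam_01"}, transport)
```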
Further, based on the first embodiment described above, a third embodiment of the apparatus visualization control method of the present invention is proposed.
In this embodiment, in step S10, the acquiring the device page corresponding to the scene picture includes:
a step a14, determining controllable devices belonging to the scene picture;
step a15, acquiring device information of the controllable device, wherein the device information includes a device name, a device state and a device icon option;
step a16, constructing an equipment page corresponding to the scene picture according to the equipment information.
In this embodiment, a controllable device is determined according to a scene, then device information is acquired according to the controllable device, and finally a device page is determined according to the device information. The device information includes a device name, a device status, a device icon option, and the like.
It should be noted that the currently viewed scene can be determined from the scene picture, and the corresponding controllable devices can be determined from that scene; there may be one or more controllable devices. The device option and device state of each determined controllable device are acquired; the device option can be displayed as an icon, and the device state includes online and offline. Specifically, the device page includes device options and device states. Referring to fig. 9, which is a schematic diagram of a seventh page according to an embodiment of the present invention, the device option is an electric lamp icon, the device name "garage" is displayed below the icon, and the device state "online" is displayed below the name. An online state indicates that the smart device is successfully connected to the network; otherwise the connection has failed.
In this embodiment, a device page containing device options and device states is acquired, so that it can subsequently be displayed on the application page of the APP and the user can control the device to be controlled based on it, realizing remote control of the device; combined with the scene picture, this improves the convenience of visual control of the device.
Further, based on the first embodiment described above, a fourth embodiment of the device visualization control method of the present invention is proposed.
In this embodiment, the step S40 includes:
step a41, obtaining a secret key for encrypting the control command;
step a42, encrypting the control instruction with the secret key to generate an instruction ciphertext;
step a43, sending the instruction ciphertext to a control server, and forwarding the instruction ciphertext to the device to be controlled by the control server, so that the device to be controlled decrypts the instruction ciphertext to obtain the control instruction.
In this embodiment, a key for encrypting the control instruction is obtained, then the control instruction is encrypted by the key to generate an instruction ciphertext, and finally the instruction ciphertext is sent to the control server, and the control server forwards the instruction ciphertext to the device to be controlled, so that the device to be controlled decrypts the instruction ciphertext to obtain the control instruction.
It should be noted that the key is a symmetric key used to symmetrically encrypt the control instruction so that it can be transmitted in encrypted form. The symmetric encryption algorithm may be set according to actual needs, such as the DES (Data Encryption Standard) algorithm or the AES (Advanced Encryption Standard) algorithm, and is not limited herein.
In particular, the control instruction may also be asymmetrically encrypted. However, asymmetric encryption is computationally more expensive, so a symmetric encryption algorithm is used here to increase the encryption speed.
It will be appreciated that decryption uses the decryption operation of the same algorithm that encrypted the control instruction: with symmetric encryption the device to be controlled decrypts with the same secret key, while with asymmetric encryption it decrypts with the key paired to the one used for encryption.
In this embodiment, encrypting the control instruction with the key prevents the instruction from being read or tampered with in transit, improving the transmission security of the control instruction. Meanwhile, because the intelligent device must decrypt the instruction ciphertext, the source of the control instruction can be verified, preventing an unauthorized user from controlling the intelligent device.
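The following Python sketch illustrates one possible realisation of steps a41 to a43 using AES in GCM mode from the third-party cryptography package. The patent only requires some symmetric algorithm such as DES or AES, so the mode, key length and message format here are assumptions, as is the premise that the same secret key has been provisioned to both the APP and the device to be controlled beforehand.

```python
# Minimal sketch of symmetric encryption of a control instruction with AES-GCM.
import json
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_instruction(key: bytes, instruction: dict) -> dict:
    """APP side: encrypt the control instruction with the shared secret key,
    producing the instruction ciphertext that is sent to the control server."""
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)  # unique per message
    plaintext = json.dumps(instruction).encode()
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)
    return {"nonce": nonce.hex(), "ciphertext": ciphertext.hex()}

def decrypt_instruction(key: bytes, envelope: dict) -> dict:
    """Device side: decrypt the forwarded instruction ciphertext with the same
    secret key to recover the control instruction."""
    aesgcm = AESGCM(key)
    plaintext = aesgcm.decrypt(bytes.fromhex(envelope["nonce"]),
                               bytes.fromhex(envelope["ciphertext"]), None)
    return json.loads(plaintext)

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=128)  # assumed pre-shared key
    sealed = encrypt_instruction(key, {"device": "lamp-garage", "action": "turn_on"})
    print(decrypt_instruction(key, sealed))
```

Because AES-GCM is authenticated, a ciphertext produced with a different key fails to decrypt, which is one way the device could reject instructions from an unauthorized sender.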
Further, based on the above-described first embodiment, a fifth embodiment of the device visualization control method of the present invention is proposed.
In this embodiment, the apparatus visualization control method of the present invention further includes:
step F, when the scene picture is displayed, if a scene change instruction is detected, determining a changed scene to be checked according to the scene change instruction;
step G, determining a camera for acquiring the scene to be checked, and sending a camera shooting instruction to the camera so as to enable the camera to acquire a current scene picture corresponding to the scene to be checked, and sending the current scene picture to the user terminal;
and step H, receiving the current scene picture sent by the camera, and replacing the scene picture with the current scene picture.
When a scene picture is displayed, the user can switch scenes on his or her own. Specifically, when the scene picture is displayed, if a scene change instruction is detected, the changed scene to be checked is determined according to the scene change instruction; the camera that captures the scene to be checked is determined, and a camera shooting instruction is sent to that camera so that it captures the current scene picture corresponding to the scene to be checked and sends it to the user terminal; the current scene picture sent by the camera is then received, and the displayed scene picture is replaced with it.
In an embodiment, the scene picture includes a scene name display area. Specifically, referring to fig. 4, main page 1 in the figure is the main page for visualized remote control; main page 1 includes a scene picture 2 and a device page 3, and scene picture 2 includes a scene name display area 4. The middle of the scene name display area 4 generally displays the current scene name, i.e., name 1, which is highlighted, while names 2 and 3 on either side of name 1 display the names of other scenes. When the user swipes the scene picture or taps a name in the scene name display area, a scene change instruction is triggered, so that the scene picture dynamically changes to the picture of another scene; it can be understood that the corresponding device page changes as well. The processing then follows the steps described above: the changed scene to be checked is determined according to the scene change instruction, the camera that captures that scene is determined and instructed to shoot, and the current scene picture returned by the camera replaces the displayed scene picture.
In this embodiment, after the scene picture is displayed, the scene picture can still be changed by triggering the scene change instruction, so as to improve the flexibility of the remote control of the device and improve the use experience of the user.
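A minimal sketch follows, assuming hypothetical names such as CAMERA_BY_SCENE and request_picture, of how the user terminal could handle a scene change instruction: it resolves the camera of the newly selected scene, requests a current scene picture and replaces the one on display. It is only an illustration of the flow in steps F to H, not the patent's implementation.

```python
# Minimal sketch: react to a scene change instruction on the user terminal.
CAMERA_BY_SCENE = {"garage": "cam-garage-01", "living-room": "cam-living-01"}

class SceneView:
    def __init__(self, request_picture):
        # request_picture(camera_id) is assumed to send the shooting instruction
        # and return the current scene picture (e.g. JPEG bytes) from the camera.
        self.request_picture = request_picture
        self.current_scene = None
        self.current_picture = None

    def on_scene_change(self, scene_to_check: str) -> None:
        """Handle a scene change instruction triggered by swiping the scene
        picture or tapping a name in the scene name display area."""
        camera_id = CAMERA_BY_SCENE[scene_to_check]
        picture = self.request_picture(camera_id)
        self.current_scene = scene_to_check   # the scene now being viewed
        self.current_picture = picture        # replaces the old scene picture

if __name__ == "__main__":
    view = SceneView(request_picture=lambda cam: f"<frame from {cam}>")
    view.on_scene_change("living-room")
    print(view.current_scene, view.current_picture)
```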
Further, based on the above-described first embodiment, a sixth embodiment of the device visualization control method of the present invention is proposed.
In this embodiment, the apparatus visualization control method of the present invention further includes:
step I, detecting whether the scene picture completely comprises the picture of the equipment to be controlled;
step J, if the scene picture is detected not to completely include the picture of the equipment to be controlled, determining a camera rotation instruction, and determining a camera corresponding to the equipment to be controlled;
and step K, sending the camera rotation instruction to the camera so that the camera executes the camera rotation instruction.
In this embodiment, a camera rotation instruction is determined according to the device to be controlled and the scene picture, a camera corresponding to the device to be controlled is determined, then the camera rotation instruction is sent to the control server, and the control server forwards the camera rotation instruction to the camera, so that the camera executes the camera rotation instruction.
It should be noted that the scene picture may contain only part of the device to be controlled. Analysis can therefore be performed on the scene picture to detect whether the camera can capture the operation that the control instruction will cause the device to perform, and thus to determine whether the camera needs to be rotated. Specifically, the control instruction is analyzed and a scene model of the device to be controlled is built to simulate the operation to be executed; the result is then compared with the scene picture to detect whether the camera can capture the picture of that operation.
In addition, it should be noted that the camera rotation instruction includes an upward rotation instruction, a downward rotation instruction, a leftward rotation instruction, and a rightward rotation instruction, and of course, other control instructions may also be included.
In this embodiment, by detecting whether the camera can capture the operation executed under the control instruction, the scene picture can display the behavior state of the device to be controlled in real time, giving the user a more accurate reference and further improving the accuracy of visual control of the device.
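Since the patent does not specify how the completeness check is performed, the following Python sketch shows one simple assumption: compare the device's bounding box in the scene picture with the camera frame and pick a rotation instruction towards the side where the device is cut off. The coordinate convention and instruction names are illustrative only.

```python
# Minimal sketch: decide whether the camera must rotate so the device is fully visible.
from typing import Optional, Tuple

Box = Tuple[int, int, int, int]  # (left, top, right, bottom) in pixels

def rotation_instruction(frame: Box, device_box: Box) -> Optional[str]:
    """Return 'rotate_left' / 'rotate_right' / 'rotate_up' / 'rotate_down' when the
    device picture is cut off at the corresponding frame edge, or None if the
    scene picture already contains the device completely."""
    f_l, f_t, f_r, f_b = frame
    d_l, d_t, d_r, d_b = device_box
    if d_l < f_l:
        return "rotate_left"
    if d_r > f_r:
        return "rotate_right"
    if d_t < f_t:
        return "rotate_up"
    if d_b > f_b:
        return "rotate_down"
    return None  # device is fully visible, no rotation needed

if __name__ == "__main__":
    frame = (0, 0, 1920, 1080)
    print(rotation_instruction(frame, (1700, 200, 2050, 600)))  # -> "rotate_right"
```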
The invention also provides a device visualization control device.
Referring to fig. 10, fig. 10 is a functional block diagram of a visual control apparatus according to a first embodiment of the present invention.
In this embodiment, the device visualization control apparatus includes:
the image obtaining module 10 is configured to obtain a scene image, and obtain an equipment page corresponding to the scene image, where the scene image is an image acquired by a camera for a scene where the equipment is located, and the equipment page is a control page corresponding to the equipment;
the picture display module 20 is configured to perform split-screen display on the scene picture and the device page at the user terminal, so that a user can trigger a control instruction through the device page while watching the scene picture;
the device determining module 30 is configured to determine, when the control instruction is detected, a device to be controlled according to the control instruction;
and the instruction sending module 40 is configured to send the control instruction to the device to be controlled, so that the device to be controlled executes the control instruction.
Each virtual function module of the device visualization control device is stored in the memory 1005 of the device visualization control device shown in fig. 1, and is used for realizing all functions of the device visualization control program; when executed by the processor 1001, the modules may implement device visualization control functions.
Further, the picture acquiring module 10 includes:
the instruction detection unit is used for determining a scene to be viewed based on a scene selection instruction if the scene selection triggering instruction is detected when the user terminal displays the scene selection page, wherein the scene selection page comprises a plurality of scenes so that a user triggers the scene selection instruction based on the scene selection page;
the instruction sending unit is used for determining a camera for acquiring the scene to be checked and sending a camera shooting instruction to the camera so that the camera can acquire a scene picture corresponding to the scene to be checked and send the scene picture to the user terminal;
and the picture receiving unit is used for receiving the scene picture sent by the camera.
Further, the picture acquiring module 10 includes:
a device determination unit for determining a controllable device belonging to the scene picture;
the information acquisition unit is used for acquiring equipment information of the controllable equipment, wherein the equipment information comprises an equipment name, an equipment state and an equipment icon option;
and the page construction unit is used for constructing the equipment page corresponding to the scene picture according to the equipment information.
Further, the device visualization control apparatus further includes:
the instruction detection module is used for determining the equipment to be controlled corresponding to the equipment menu option when detecting a menu access instruction triggered by the equipment menu option;
and the page replacing module is used for replacing the equipment page with a control menu page corresponding to the equipment to be controlled, wherein the control menu page can trigger a plurality of control instructions so that a user triggers one or more of the control instructions based on the scene picture and the control menu page.
Further, the device visualization control apparatus further includes:
the device detection module is used for determining the device to be controlled corresponding to the device icon option and detecting whether the device to be controlled is started or not when the device switch instruction triggered by the device icon option is detected;
the instruction triggering module is used for triggering a control closing instruction to close the equipment to be controlled if the equipment to be controlled is opened;
and the instruction triggering module is further used for triggering a starting control instruction to start the equipment to be controlled if the equipment to be controlled is closed.
Further, the instruction sending module 40 includes:
a key obtaining unit, configured to obtain a key used for encrypting the control instruction;
the instruction encryption unit is used for encrypting the control instruction through the secret key to generate an instruction ciphertext;
and the ciphertext sending unit is used for sending the instruction ciphertext to a control server, and the control server forwards the instruction ciphertext to the equipment to be controlled so that the equipment to be controlled decrypts the instruction ciphertext to obtain the control instruction.
Further, the device visualization control apparatus further includes:
the scene determining module is used for determining a scene to be checked after the scene is replaced according to a scene replacing instruction if the scene replacing instruction is detected when the scene picture is displayed;
the camera determining module is used for determining a camera for acquiring the scene to be checked and sending a camera shooting instruction to the camera so that the camera acquires a current scene picture corresponding to the scene to be checked and sends the current scene picture to the user terminal;
and the picture receiving module is used for receiving the current scene picture sent by the camera and replacing the scene picture with the current scene picture.
Further, the device visualization control apparatus further includes:
the picture detection module is used for detecting whether the scene picture completely comprises the picture of the equipment to be controlled;
the camera determining module is used for determining a camera rotating instruction and determining a camera corresponding to the equipment to be controlled if the scene picture is detected not to completely include the picture of the equipment to be controlled;
the instruction sending module 40 is further configured to send the camera rotation instruction to the camera, so that the camera executes the camera rotation instruction.
The function implementation of each module in the device visualization control device corresponds to each step in the device visualization control method embodiment, and the function and implementation process thereof are not described in detail herein.
The present invention also provides an electronic device, including: a memory, a processor, and a device visualization control program stored on the memory and executable on the processor, wherein the device visualization control program, when executed by the processor, implements the steps of the device visualization control method according to any of the above embodiments.
The specific embodiment of the electronic device of the present invention is substantially the same as the embodiments of the device visualization control method, and will not be described herein again.
The present invention also provides a computer-readable storage medium, on which a device visualization control program is stored, which, when executed by a processor, implements the steps of the device visualization control method according to any one of the above embodiments.
The specific embodiment of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the device visualization control method described above, and is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. The equipment visualization control method is applied to a user terminal, and comprises the following steps:
acquiring a scene picture, and acquiring an equipment page corresponding to the scene picture, wherein the scene picture is a picture acquired by a camera for a scene where equipment is located, and the equipment page is a control page corresponding to the equipment;
the scene picture and the equipment page are displayed in a split screen mode on the user terminal, so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page;
when the control instruction is detected, determining equipment to be controlled according to the control instruction;
and sending the control instruction to the equipment to be controlled so as to enable the equipment to be controlled to execute the control instruction.
2. The device visualization control method according to claim 1, wherein the step of acquiring the scene picture includes:
when the user terminal displays a scene selection page, if a scene selection instruction is detected, determining a scene to be viewed based on the scene selection instruction, wherein the scene selection page comprises a plurality of scenes, so that a user triggers the scene selection instruction based on the scene selection page;
determining a camera for collecting the scene to be checked, and sending a camera shooting instruction to the camera so that the camera collects a scene picture corresponding to the scene to be checked, and sending the scene picture to the user terminal;
and receiving the scene picture sent by the camera.
3. The device visualization control method according to claim 1, wherein the step of acquiring the device page corresponding to the scene picture comprises:
determining a controllable device belonging to the scene picture;
acquiring equipment information of the controllable equipment, wherein the equipment information comprises an equipment name, an equipment state and an equipment icon option;
and constructing an equipment page corresponding to the scene picture according to the equipment information.
4. The device visualization control method according to claim 1, wherein the device page includes a device menu option, and when the control instruction is detected, before the step of determining the device to be controlled according to the control instruction, the method further includes:
when a menu access instruction triggered by the equipment menu option is detected, equipment to be controlled corresponding to the equipment menu option is determined;
and replacing the device page with a control menu page corresponding to the device to be controlled, wherein the control menu page can trigger a plurality of control instructions, so that a user can trigger one or more of the control instructions based on the scene picture and the control menu page.
5. The device visualization control method according to claim 1, wherein the device page includes a device icon option, the device icon option is used for a user to trigger a device switch instruction, and when the control instruction is detected, before the step of determining the device to be controlled according to the control instruction, the method further includes:
when the device switch instruction triggered by the device icon option is detected, determining the device to be controlled corresponding to the device icon option, and detecting whether the device to be controlled is started;
if the equipment to be controlled is started, triggering a closing control instruction to close the equipment to be controlled;
and if the equipment to be controlled is closed, triggering a starting control instruction to start the equipment to be controlled.
6. The device visualization control method according to claim 1, wherein the step of sending the control instruction to the device to be controlled includes:
acquiring a secret key for encrypting the control instruction;
encrypting the control instruction with the secret key to generate an instruction ciphertext;
and sending the instruction ciphertext to a control server, and forwarding the instruction ciphertext to the equipment to be controlled by the control server so that the equipment to be controlled decrypts the instruction ciphertext to obtain the control instruction.
7. The device visualization control method according to any one of claims 1 to 6, characterized by further comprising:
when the scene picture is displayed, if a scene change instruction is detected, determining a changed scene to be checked according to the scene change instruction;
determining a camera for collecting the scene to be checked, and sending a camera shooting instruction to the camera so that the camera collects a current scene picture corresponding to the scene to be checked, and sending the current scene picture to the user terminal;
and receiving the current scene picture sent by the camera, and replacing the scene picture with the current scene picture.
8. The device visualization control method according to any one of claims 1 to 6, characterized by further comprising:
detecting whether the scene picture completely comprises the picture of the device to be controlled;
if the scene picture is detected not to completely include the picture of the equipment to be controlled, determining a camera rotation instruction, and determining a camera corresponding to the equipment to be controlled;
and sending the camera rotation instruction to the camera so that the camera executes the camera rotation instruction.
9. An apparatus visualization control apparatus deployed in a user terminal, the apparatus visualization control apparatus comprising:
the device comprises a picture acquisition module, a picture processing module and a picture processing module, wherein the picture acquisition module is used for acquiring a scene picture and acquiring a device page corresponding to the scene picture, the scene picture is a picture acquired by a camera for a scene where the device is located, and the device page is a control page corresponding to the device;
the picture display module is used for displaying the scene picture and the equipment page on the user terminal in a split screen mode so that a user can watch the scene picture and simultaneously trigger a control instruction through the equipment page;
the device determining module is used for determining a device to be controlled according to the control instruction when the control instruction is detected;
and the instruction sending module is used for sending the control instruction to the equipment to be controlled so as to enable the equipment to be controlled to execute the control instruction.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a device visualization control program which, when executed by a processor, implements the steps of the device visualization control method according to any one of claims 1 to 8.
CN202011464002.8A 2020-12-11 2020-12-11 Equipment visualization control method and device and computer readable storage medium Pending CN112527170A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011464002.8A CN112527170A (en) 2020-12-11 2020-12-11 Equipment visualization control method and device and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN112527170A true CN112527170A (en) 2021-03-19

Family

ID=74999394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011464002.8A Pending CN112527170A (en) 2020-12-11 2020-12-11 Equipment visualization control method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112527170A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105549403A (en) * 2015-12-23 2016-05-04 南京物联传感技术有限公司 Visual scene control method based on camera
CN109597558A (en) * 2018-11-30 2019-04-09 维沃移动通信有限公司 A kind of display control method and terminal device
CN111665970A (en) * 2019-03-05 2020-09-15 邓爱红 Method and device for controlling display state of touch screen control of mobile terminal
CN110554831A (en) * 2019-09-06 2019-12-10 腾讯科技(深圳)有限公司 Operation synchronization method, device, equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114885031A (en) * 2022-03-21 2022-08-09 深圳绿米联创科技有限公司 Interactive processing method and device based on equipment control and electronic equipment
CN116827932A (en) * 2023-08-30 2023-09-29 中航金网(北京)电子商务有限公司 Multi-server remote control method and device, electronic equipment and storage medium
CN116827932B (en) * 2023-08-30 2023-10-31 中航金网(北京)电子商务有限公司 Multi-server remote control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 501 Xingguangbao Industrial Park, No. 9 Lirong Road, Xinshi Community, Dalang Street, Longhua District, Shenzhen City, Guangdong Province, 518000

Applicant after: SHENZHEN CHENGDA WEIYE ELECTRONICS Co.,Ltd.

Address before: 518110 No.4, zone 4, Huayi Industrial Park, Tongsheng community, Dalang street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN CHENGDA WEIYE ELECTRONICS Co.,Ltd.