CN108353151A - Method and apparatus for controlling a target device - Google Patents

Method and apparatus for controlling a target device

Info

Publication number
CN108353151A
CN108353151A (application CN201680066789.8A)
Authority
CN
China
Prior art keywords
user
input information
image
equipment
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201680066789.8A
Other languages
Chinese (zh)
Inventor
辛志华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN108353151A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and an apparatus for controlling a device. The method includes: acquiring first input information of a user through a user interface, where the user interface includes an image of a three-dimensional scene, the three-dimensional scene includes at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image; determining the target device according to the first input information; and controlling the target device. Because the device-control scheme is provided to the user through an image-based user interface, the user can select the device to be managed, i.e. the target device, quickly, accurately and intuitively, thereby improving user experience.

Description

Method and Apparatus for Controlling a Target Device
Technical Field
The present invention relates to the field of communications, and in particular, to a method and an apparatus for controlling a target device.
Background
The Internet of Things (IoT) is an important component of the new generation of information technology and an important stage in the development of the information era. As the name implies, the Internet of Things is the Internet of connected objects. This has two implications: first, the core and foundation of the Internet of Things is still the Internet, extended and expanded on that basis; second, its user side extends to any article, so that articles can exchange information and communicate with one another. Through communication and perception technologies such as intelligent sensing, identification technology and pervasive computing, the Internet of Things is widely applied in the convergence of networks, and is therefore called the third wave of the world information industry, after the computer and the Internet.
With the growth of the Internet of Things market, users need to manage more and more devices. In the prior art, the user interface (UI) of a control device lets the user select a device based on a textual description of its location, for example: bedroom, lamp 01. When the user selects a target device through such a text-based user interface, and the target device must be picked out from multiple devices of different types or of the same type, the user has to mentally reconstruct the spatial layout of the scene where the target device is located in order to find it.
Disclosure of Invention
The application aims to provide an improved method for controlling a target device, so as to reduce the number of attempts a user needs to select the target device and thereby improve user experience.
In a first aspect, the present application provides a method for controlling a target device. The method includes: acquiring first input information of a user through a user interface, where the user interface includes an image of a three-dimensional scene, the three-dimensional scene includes at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image; determining the target device according to the first input information; and controlling the target device.
Because the method is provided to the user through an image-based user interface, the user can select the target device to be managed quickly, accurately and intuitively. This avoids the repeated attempts needed when the target device is selected based on a text description, as in the prior art, and thus improves user experience.
With reference to the first aspect, in a possible implementation manner of the first aspect, the controlling the target device includes: acquiring, through the user interface, second input information input by the user, where the user interface presents the function types of the target device that the user can control, and the second input information is used to control an operating parameter of the target device; and controlling the target device according to the second input information.
Through the second input information entered on the user interface, the user can further control other functions of the target device, improving user experience.
With reference to the first aspect and any one of the foregoing implementation manners, in a possible implementation manner of the first aspect, the controlling the target device further includes: acquiring, through the user interface, third input information input by the user, where the user interface presents the management items of the target device that are available to the user for device management; and managing the target device according to the third input information.
By entering the third input information through the user interface, the user can manage the target device bound in the user interface, improving user experience.
With reference to the first aspect and any one of the foregoing implementation manners, in a possible implementation manner of the first aspect, the determining the target device according to the first input information includes: determining the coordinates of the target device in the image according to the first input information; determining the device identifier of the target device according to the coordinates and a pre-stored correspondence between coordinates and device identifiers; and determining the target device according to the device identifier of the target device.
With reference to the first aspect and any one of the foregoing implementation manners, in a possible implementation manner of the first aspect, the image includes a two-dimensional photograph, a panoramic photograph, or a 360-degree spherical photograph.
The user may control the target device through a user interface presented based on a two-dimensional photograph, a panoramic photograph, or a 360-degree spherical photograph. A panoramic or 360-degree spherical photograph can give the user a strong sense of immersion in the scene, improving user experience.
With reference to the first aspect and any one of the foregoing implementation manners, in a possible implementation manner of the first aspect, the acquiring first input information of a user through a user interface includes: acquiring the first input information of the user through a user interface of a virtual reality (VR) device, where the image of the three-dimensional scene in the user interface is a stereoscopic image, and the first input information is information input by the user through an interaction device of the VR device.
The user can input the first input information through the user interface of the VR device to control the target device. The user interface of a VR device can give the user a strong sense of immersion in the scene, improving user experience.
With reference to the first aspect and any one of the foregoing implementation manners, in a possible implementation manner of the first aspect, the stereoscopic image is a 360-degree spherical image, and the 360-degree spherical image is obtained by acquiring a planar image of the three-dimensional scene and projecting the planar image of the three-dimensional scene onto a surface of a spherical model.
Controlling the target device through a user interface presented based on the 360-degree spherical image can give the user a strong sense of immersion in the scene, improving user experience.
In a second aspect, the present application provides an apparatus for controlling a target device. The apparatus includes: a first acquiring module, configured to acquire first input information of a user through a user interface, where the user interface includes an image of a three-dimensional scene, the three-dimensional scene includes at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image; a determining module, configured to determine the target device according to the first input information acquired by the first acquiring module; and a control module, configured to control the target device determined by the determining module.
Because the apparatus provides an image-based user interface, the user can select the target device to be managed quickly, accurately and intuitively. This avoids the repeated attempts needed when the target device is selected based on a text description, as in the prior art, and thus improves user experience.
With reference to the second aspect, in a possible implementation manner of the second aspect, the control module is specifically configured to: acquire, through the user interface, second input information input by the user, where the user interface presents the function types of the target device that the user can control, and the second input information is used to control an operating parameter of the target device; and control the target device according to the second input information.
Through the second input information entered on the user interface, the user can further control other functions of the target device, improving user experience.
With reference to the second aspect and any one of the foregoing possible implementation manners, in a possible implementation manner of the second aspect, the apparatus further includes: a second acquiring module, configured to acquire, through the user interface, third input information input by the user, where the user interface presents the management items of the target device that are available to the user for device management; and a management module, configured to manage the target device according to the third input information.
By entering the third input information through the user interface, the user can manage the target device bound in the user interface, improving user experience.
With reference to the second aspect and any one of the foregoing possible implementation manners, in a possible implementation manner of the second aspect, the determining module is specifically configured to: determine the coordinates of the target device in the image according to the first input information; determine the device identifier of the target device according to the coordinates and a pre-stored correspondence between coordinates and device identifiers; and determine the target device according to the device identifier of the target device.
With reference to the second aspect and any one of the foregoing possible implementation manners, in one possible implementation manner of the second aspect, the image includes a two-dimensional photograph, a panoramic photograph, or a 360-degree spherical photograph.
The user may control the target device through a user interface presented based on a two-dimensional photograph, a panoramic photograph, or a 360-degree spherical photograph. A panoramic or 360-degree spherical photograph can give the user a strong sense of immersion in the scene, improving user experience.
With reference to the second aspect and any one of the foregoing implementation manners, in a possible implementation manner of the second aspect, the first acquiring module is specifically configured to: acquire the first input information of the user through a user interface of a virtual reality (VR) device, where the image of the three-dimensional scene in the user interface is a stereoscopic image, and the first input information is information input by the user through an interaction device of the VR device.
The user can input the first input information through the user interface of the VR device to control the target device. The user interface of a VR device can give the user a strong sense of immersion in the scene, improving user experience.
With reference to the second aspect and any one of the foregoing implementation manners, in a possible implementation manner of the second aspect, the stereoscopic image is a 360-degree spherical image, and the 360-degree spherical image is obtained by acquiring a planar image of the three-dimensional scene and projecting the planar image of the three-dimensional scene onto a surface of a spherical model.
Controlling the target device through a user interface presented based on the 360-degree spherical image can give the user a strong sense of immersion in the scene, improving user experience.
In a third aspect, the present application provides an apparatus for controlling a target device. The apparatus includes a memory, a processor, an input/output interface, a communication interface, and a bus system, where the memory, the processor, the input/output interface, and the communication interface are connected via the bus system. The input/output interface is configured to acquire first input information of a user through a user interface, where the user interface includes an image of a three-dimensional scene, the three-dimensional scene includes at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of the target device in the image. The processor is configured to determine the target device according to the first input information acquired by the input/output interface, and to control the determined target device.
With reference to the third aspect, in a possible implementation manner of the third aspect, the processor is specifically configured to: acquire, through the user interface, second input information input by the user, where the user interface presents the function types of the target device that the user can control, and the second input information is used to control an operating parameter of the target device; and control the target device according to the second input information.
Through the second input information entered on the user interface, the user can further control other functions of the target device, improving user experience.
With reference to the third aspect and any one of the foregoing possible implementation manners, in a possible implementation manner of the third aspect, the input/output interface is further configured to acquire, through the user interface, third input information input by the user, where the user interface presents the management items of the target device that are available to the user for device management; and the processor is further configured to manage the target device according to the third input information.
By entering the third input information through the user interface, the user can manage the target device bound in the user interface, improving user experience.
With reference to the third aspect and any one of the foregoing possible implementation manners, in a possible implementation manner of the third aspect, the processor is specifically configured to: determine the coordinates of the target device in the image according to the first input information; determine the device identifier of the target device according to the coordinates and a pre-stored correspondence between coordinates and device identifiers; and determine the target device according to the device identifier of the target device.
With reference to the third aspect and any one of the foregoing possible implementation manners, in a possible implementation manner of the third aspect, the image includes a two-dimensional photograph, a panoramic photograph, or a 360-degree spherical photograph.
The user may control the target device through a user interface presented based on a two-dimensional photograph, a panoramic photograph, or a 360-degree spherical photograph. A panoramic or 360-degree spherical photograph can give the user a strong sense of immersion in the scene, improving user experience.
With reference to the third aspect and any one of the foregoing possible implementation manners, in a possible implementation manner of the third aspect, the input/output interface is specifically configured to: acquire the first input information of the user through a user interface of a virtual reality (VR) device, where the image of the three-dimensional scene in the user interface is a stereoscopic image, and the first input information is information input by the user through an interaction device of the VR device.
The user can input the first input information through the user interface of the VR device to control the target device. The user interface of a VR device can give the user a strong sense of immersion in the scene, improving user experience.
With reference to the third aspect and any one of the foregoing possible implementation manners, in a possible implementation manner of the third aspect, the stereoscopic image is a 360-degree spherical image, and the 360-degree spherical image is obtained by acquiring a planar image of the three-dimensional scene and projecting the planar image of the three-dimensional scene onto a surface of a spherical model.
Controlling the target device through a user interface presented based on the 360-degree spherical image can give the user a strong sense of immersion in the scene, improving user experience.
In a fourth aspect, the present application provides a computer-readable storage medium for storing program code for a method of controlling a target device, the program code comprising instructions for performing the method of the first aspect.
In some implementations, the device identifier and/or the target device identifier may be the name of the scene where the device is located together with the name of the device.
The application provides an improved device-control scheme, through which the user can select the target device quickly, accurately and intuitively, thereby improving user experience.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. Evidently, the drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 shows a schematic flowchart of a method for controlling a target device according to an embodiment of the present invention.
Fig. 2 shows a schematic diagram of a menu for controlling the functions of a target device according to an embodiment of the present invention.
Fig. 3 shows a schematic diagram of a menu for managing a target device according to an embodiment of the present invention.
Fig. 4 shows a schematic diagram of the principle of mapping touch-point coordinates to image coordinates for a panoramic photograph.
Fig. 5 shows a schematic block diagram of an apparatus for controlling a target device according to an embodiment of the present invention.
Fig. 6 shows a schematic block diagram of an apparatus for controlling a target device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 shows a schematic flowchart of a method for controlling a target device according to an embodiment of the present invention. The method of Fig. 1 may be executed by a terminal device and includes the following steps.
110. Acquire first input information of a user through a user interface, where the user interface includes an image of a three-dimensional scene, the three-dimensional scene includes at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image.
It should be understood that the image of the three-dimensional scene may be constructed from any one of, or a combination of two or more of, a photograph of the scene where the target device is located, an effect drawing of the scene, and an engineering drawing of the scene. The image of the three-dimensional scene may be a two-dimensional (2D) photograph, a panoramic photograph, or a 360-degree spherical photograph.
It should be further understood that the first input information may be input by the user by touching the touch screen on which the user interface is presented, or may be input by the user as voice information.
120. Determine the target device according to the first input information.
130. Control the target device.
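Pieced together, steps 110 to 130 can be sketched as follows. This is a minimal illustration only; every name in it (bindings, find_device, control and the sample identifiers) is a hypothetical assumption, not something defined by the embodiment.

```python
# Minimal sketch of steps 110-130. All names and data are hypothetical
# illustrations; the embodiment does not prescribe an implementation.

# Pre-stored correspondence between image coordinates and device
# identifiers (see the binding discussion later in this description).
bindings = {
    (320, 180): "living-room/air-conditioner-01",
    (510, 260): "living-room/dome-lamp-01",
}

def find_device(touch_xy, tolerance=40):
    """Step 120: resolve a touched image position to a device identifier."""
    x, y = touch_xy
    for (bx, by), device_id in bindings.items():
        if abs(bx - x) <= tolerance and abs(by - y) <= tolerance:
            return device_id
    return None

def control(device_id, command):
    """Step 130: send a control command to the resolved device."""
    print(f"sending {command!r} to {device_id}")

# Step 110: the first input information, e.g. a touch point on the image.
touch = (325, 175)
target = find_device(touch)
if target is not None:
    control(target, {"power": "on"})
```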
According to the control method of the target device in this embodiment, a way of managing devices is provided to the user through an image-based user interface, so that the user can quickly, accurately and intuitively select the device to be managed, i.e. the target device. This avoids the repeated attempts needed when the target device is selected based on text as in the prior art, and thus improves user experience.
Optionally, as an embodiment, step 130 may include: acquiring, through the user interface, second input information input by the user, where the user interface presents the function types of the target device that the user can control, and the second input information is used to control an operating parameter of the target device; and controlling the target device according to the second input information.
Specifically, when the user selects the target device in the user interface, the user interface may pop up a menu for controlling the operating parameters of the device. Fig. 2 is a schematic diagram of a menu for controlling the functions of a target device according to an embodiment of the present invention. In the menu shown in Fig. 2, the functions of the target device are presented to the user as icons. It should be understood that the menu may also be presented as text; the present invention does not limit the presentation form of the menu in any way. The user may control the operating parameters of the device through the menu, as shown in Fig. 2. For example, after the user selects the air conditioner 200 in the image through the user interface, the user interface pops up a menu 210 for controlling the operating parameters of the air conditioner, so that the user can control the temperature 220 and the operating mode 230 of the air conditioner.
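As a hedged sketch of this interaction, the menu's offered operating parameters and the handling of the second input information might look like the following; the parameter names, value ranges and menu structure are illustrative assumptions, not details fixed by the embodiment.

```python
# Hypothetical sketch of the function-control menu of Fig. 2 for the air
# conditioner 200. Parameter names and ranges are assumptions.

FUNCTION_MENU = {
    "air-conditioner": {
        "temperature": {"min": 16, "max": 30},          # item 220 in Fig. 2
        "mode": {"choices": ["cool", "heat", "fan"]},   # item 230 in Fig. 2
    },
}

def apply_second_input(device_type, parameter, value):
    """Validate the second input information against the menu, then apply it."""
    spec = FUNCTION_MENU[device_type][parameter]
    if "choices" in spec and value not in spec["choices"]:
        raise ValueError(f"{value!r} is not a valid {parameter}")
    if "min" in spec and not spec["min"] <= value <= spec["max"]:
        raise ValueError(f"{parameter} out of range")
    print(f"set {device_type}.{parameter} = {value}")

apply_second_input("air-conditioner", "temperature", 24)
apply_second_input("air-conditioner", "mode", "cool")
```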
Optionally, as an embodiment, the method shown in Fig. 1 may include: acquiring, through the user interface, third input information input by the user, where the user interface presents the management items of the target device that are available to the user for device management; and managing the target device according to the third input information.
Specifically, the user may long-press the target device in the user interface, which may pop up a menu for managing the device. Fig. 3 is a diagram of a menu for managing a target device according to an embodiment of the present invention; in the menu shown in Fig. 3, the management items of the target device are presented to the user as icons. It should be understood that the menu for managing the device may also be presented as text; this embodiment does not specifically limit the presentation form of the menu. It should also be understood that the second input information may be the same as the third input information, that is, by entering the second or the third input information the user may cause the user interface to present the menu for managing the device and the menu for controlling the functions of the device at the same time.
The menu for managing target devices shown in Fig. 3 is described taking a dome lamp 300 as an example. The user may control the brightness of the dome lamp 300 by adjusting the brightness button 310; control its on/off state through the switch button 320; control the color of its light when it is on through the color button 330; change its device name through the rename button 340; and unbind a bound dome lamp 300 through the unbind button 350, after which the device identifier of the dome lamp 300 is returned to the list of unbound devices for a later binding operation.
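The management items of Fig. 3 might be modeled as in the following sketch; the class, method and identifier names are hypothetical assumptions made for illustration only.

```python
# Hypothetical sketch of the management items of Fig. 3 for the dome lamp
# 300. Names and data structures are assumptions, not part of the patent.

unbound_devices = []   # list of identifiers available for (re)binding

class ManagedDevice:
    def __init__(self, device_id, name):
        self.device_id = device_id
        self.name = name
        self.brightness = 100   # adjusted via button 310
        self.powered_on = True  # toggled via button 320
        self.color = "white"    # adjusted via button 330

    def rename(self, new_name):
        """Button 340: change the device name."""
        self.name = new_name

    def unbind(self):
        """Button 350: unbind and return the identifier to the list."""
        unbound_devices.append(self.device_id)

lamp = ManagedDevice("bedroom/dome-lamp-300", "dome lamp")
lamp.rename("reading lamp")
lamp.unbind()
assert "bedroom/dome-lamp-300" in unbound_devices
```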
Optionally, as an embodiment, step 120 may include: determining the coordinates of the target device in the image according to the first input information; determining the device identifier of the target device according to the coordinates and a pre-stored correspondence between coordinates and device identifiers; and determining the target device according to the device identifier.
Specifically, the pre-stored correspondence between coordinates and device identifiers may be generated by the user selecting, from a library of unbound devices, a device identifier not yet bound in the image and binding that identifier to the corresponding coordinates in the image.
It should be understood that, when a device is registered in the unbound-device library, the device may be bound to the functions it makes available for the user to manage, that is, to the types of operating parameters that the user can control in the menu for controlling the device's operating parameters.
It should also be understood that, while the user is binding a device identifier to coordinates, a device with a recognition mode can help the user confirm whether the selected device identifier is the one to be bound to the selected coordinates. For example, when the user needs to bind a table lamp in a bedroom, the table lamp may flash after the user selects the corresponding device identifier, helping the user confirm that the selected identifier corresponds to the device the user wishes to bind, i.e. the target device.
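A hedged sketch of this binding flow follows: the user picks an identifier from the unbound-device library, the device flashes in its recognition mode so the user can confirm the choice, and the identifier is bound to the selected image coordinates. All names are illustrative assumptions.

```python
# Hypothetical sketch of binding a device identifier to image coordinates.
# The library contents and function names are assumptions for illustration.

unbound_library = ["bedroom/table-lamp-01", "bedroom/table-lamp-02"]
coordinate_bindings = {}   # image coordinates -> device identifier

def flash(device_id):
    """Recognition mode: ask the physical device to identify itself."""
    print(f"{device_id} is flashing ...")

def bind(device_id, image_xy, user_confirms=True):
    """Bind device_id to image_xy once the user has confirmed the device."""
    flash(device_id)                  # help the user verify the choice
    if not user_confirms:
        return False
    unbound_library.remove(device_id)
    coordinate_bindings[image_xy] = device_id
    return True

bind("bedroom/table-lamp-01", (410, 220))
print(coordinate_bindings)   # {(410, 220): 'bedroom/table-lamp-01'}
```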
Alternatively, as an embodiment, the image may be a 2D photograph, a panoramic photograph, or a 360-degree spherical photograph.
Specifically, when the image is a 2D photograph, the mapping from the coordinates of the user's touch point on the touch screen presenting the user interface to the coordinates in the image is linear, satisfying x1 = a·x + b and y1 = c·y + d, where (x1, y1) denotes the coordinates in the image, (x, y) denotes the coordinates of the touch point, and a, b, c and d are constants.
When the image is a panoramic photograph, Fig. 4 shows the principle of mapping touch-point coordinates to image coordinates. A panoramic photograph is in a cylindrical mode; as shown in Fig. 4, the cylinder can be unrolled into a planar mode by extending the x-axis, so the mapping from touch-point coordinates to image coordinates again satisfies a linear relationship.
When the image is a 360-degree spherical photograph, the mapping between touch-point coordinates and image coordinates can be computed by mapping software based on the projection type and the relevant parameters of the image file format. The embodiment of the present invention does not specifically limit the mapping between touch-point coordinates and coordinates in the image.
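For the linear cases (a 2D photograph, or a cylindrical panorama unrolled onto a plane), a minimal sketch is shown below. The constants a, b, c, d are placeholders; in practice they would come from how the image is scaled and panned on the screen.

```python
# Sketch of the linear touch-to-image mapping, x1 = a*x + b, y1 = c*y + d.
# The constant values below are placeholder assumptions for illustration.

def touch_to_image(x, y, a=2.0, b=0.0, c=2.0, d=0.0):
    """Map touch-screen coordinates (x, y) to image coordinates (x1, y1)."""
    return a * x + b, c * y + d

# Example: the image is displayed at half its native size with no offset,
# so a touch at (160, 90) on screen corresponds to (320, 180) in the image.
print(touch_to_image(160, 90))   # -> (320.0, 180.0)
```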
Optionally, as an embodiment, the acquiring first input information of the user through the user interface includes: acquiring the first input information of the user through a user interface of a virtual reality (VR) device, where the image of the three-dimensional scene in the user interface is a stereoscopic image, and the first input information is information input by the user through an interaction device of the VR device.
It should be understood that the VR device may be a three-dimensional visual display device, such as a 3D presentation system, a large projection system, or a head-mounted stereoscopic display. The VR interaction device may be a data glove, a 3D input device (for example, a three-dimensional mouse), a motion-capture device, an eye tracker, a force-feedback device, and the like.
Optionally, as an embodiment, the stereoscopic image is a 360-degree spherical image, and the 360-degree spherical image is obtained by acquiring a planar image of the three-dimensional scene and projecting the planar image of the three-dimensional scene onto a surface of a spherical model.
Specifically, the 360-degree spherical photograph may be formed by stitching source images captured while the camera rotates around its nodal point: a mapping between the coordinates of the source images and spherical coordinates is established, and the source images are stitched into a 360-degree spherical photograph.
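One common concrete choice for such a planar-to-sphere mapping is the equirectangular projection, sketched below under the assumption that the planar image covers the full sphere. The application itself does not fix a particular projection, so this is an illustrative assumption.

```python
import math

# Hedged sketch: mapping a pixel (u, v) of an equirectangular planar image
# onto the surface of a unit-sphere model. Equirectangular coverage of the
# full sphere is an assumption; the embodiment does not fix a projection.

def plane_to_sphere(u, v, width, height):
    """Map pixel (u, v) of a width x height planar image to a point
    (x, y, z) on the unit sphere."""
    lon = (u / width) * 2.0 * math.pi - math.pi     # longitude in [-pi, pi)
    lat = math.pi / 2.0 - (v / height) * math.pi    # latitude in [-pi/2, pi/2]
    x = math.cos(lat) * math.cos(lon)
    y = math.cos(lat) * math.sin(lon)
    z = math.sin(lat)
    return x, y, z

# The center pixel maps to the point straight ahead on the sphere.
print(plane_to_sphere(1024, 512, 2048, 1024))   # ~ (1.0, 0.0, 0.0)
```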
In a VR device, the image may be a panoramic photograph or a 360-degree spherical photograph. The VR device senses the rotation of the user's head through a gyro sensor and presents the scene to the user stereoscopically, and the user can select and control the target device through the interaction device of the VR device, which provides a better experience.
The method for controlling a target device according to the embodiments of the present invention is described above in detail with reference to Figs. 1 to 4; the apparatus for controlling a target device according to the embodiments of the present invention is described below with reference to Figs. 5 and 6. It should be understood that the apparatuses shown in Figs. 5 and 6 can implement the steps of Fig. 1; to avoid repetition, details are not repeated here.
Fig. 5 shows a schematic block diagram of an apparatus for controlling a target device according to an embodiment of the present invention. The apparatus 500 shown in Fig. 5 includes a first acquiring module 510, a determining module 520, and a control module 530.
The first acquiring module 510 is configured to acquire first input information of a user through a user interface, where the user interface includes an image of a three-dimensional scene, the three-dimensional scene includes at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image.
The determining module 520 is configured to determine the target device according to the first input information acquired by the first acquiring module 510.
The control module 530 is configured to control the target device determined by the determining module 520.
Because the apparatus provides an image-based user interface, the user can quickly, accurately and intuitively select the device to be managed, i.e. the target device, avoiding the repeated attempts needed when the target device is selected based on text as in the prior art, and thus improving user experience.
Fig. 6 shows a schematic block diagram of a control apparatus of a target device of an embodiment of the present invention. The apparatus 600 shown in fig. 6 comprises a memory 610, a processor 620, an input/output interface 630, a communication interface 640 and a bus system 650. The memory 610, the processor 620, the input/output interface 630 and the communication interface 640 are connected through a bus system 650, the memory 610 is used for storing instructions, and the processor 620 is used for executing the instructions stored in the memory 610, so as to control the input/output interface 630 to receive input data and information, output data such as operation results, and control the communication interface 640 to send signals.
The input/output interface 630 is configured to acquire first input information of a user through a user interface, where the user interface includes an image of a three-dimensional scene, the three-dimensional scene includes at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image.
The processor 620 is configured to determine the target device according to the first input information acquired by the input/output interface 630, and to control the determined target device.
Because the apparatus provides an image-based user interface, the user can quickly, accurately and intuitively select the device to be managed, i.e. the target device, avoiding the repeated attempts needed when the target device is selected based on text as in the prior art, and thus improving user experience.
It should be understood that the apparatus 600 shown in fig. 6 may be a terminal device, the input/output interface 630 may be a touch screen of the terminal device 600, and the terminal device 600 may present the user interface through the touch screen to obtain the first input information of the user.
It should be understood that, in the embodiment of the present invention, the processor 620 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, configured to execute related programs to implement the technical solutions provided by the embodiments of the present invention.
It is to be further appreciated that communication interface 640 enables communication between apparatus 600 and other devices or communication networks using transceiver devices such as, but not limited to, transceivers.
The memory 610 may include a read-only memory and a random access memory, and provides instructions and data to the processor 620. A portion of processor 620 may also include non-volatile random access memory. For example, the processor 620 may also store information of the device type.
The bus system 650 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For clarity of illustration, the various buses are designated in the figure as the bus system 650.
In implementation, the steps of the above method may be completed by an integrated logic circuit of hardware in the processor 620 or by instructions in the form of software. The steps of the method for controlling a target device disclosed in the embodiments of the present invention may be performed directly by a hardware processor, or by a combination of hardware and software modules in the processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or an EPROM, or a register. The storage medium is located in the memory 610; the processor 620 reads the information in the memory 610 and completes the steps of the method shown in Fig. 1 in combination with its hardware. To avoid repetition, details are not described here again.
It should be understood that in the present embodiment, "B corresponding to A" means that B is associated with A, and that B can be determined from A. It should also be understood that determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
It should be understood that the term "and/or" herein describes only an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (21)

  1. A method for controlling a target device, characterized by comprising:
    acquiring first input information of a user through a user interface, wherein the user interface comprises an image of a three-dimensional scene, the three-dimensional scene comprises at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image;
    determining the target device according to the first input information; and
    controlling the target device.
  2. The method of claim 1, wherein the controlling the target device comprises:
    acquiring, through the user interface, second input information input by the user, wherein the user interface presents the function types of the target device that the user can control, and the second input information is used to control an operating parameter of the target device; and
    controlling the target device according to the second input information.
  3. The method of claim 1 or 2, wherein the method further comprises:
    acquiring, through the user interface, third input information input by the user, wherein the user interface presents the management items of the target device that are available to the user for device management; and
    managing the target device according to the third input information.
  4. The method of any one of claims 1 to 3, wherein the determining the target device according to the first input information comprises:
    determining the coordinates of the target device in the image according to the first input information;
    determining the device identifier of the target device according to the coordinates and a pre-stored correspondence between coordinates and device identifiers; and
    determining the target device according to the device identifier of the target device.
  5. The method of any one of claims 1 to 4, wherein the image comprises a two-dimensional (2D) photograph, a panoramic photograph, or a 360-degree spherical photograph.
  6. The method of any one of claims 1 to 5, wherein the acquiring first input information of a user through a user interface comprises:
    acquiring the first input information of the user through a user interface of a virtual reality (VR) device, wherein the image of the three-dimensional scene in the user interface is a stereoscopic image, and the first input information is information input by the user through an interaction device of the VR device.
  7. The method of claim 6, wherein the stereoscopic image is a 360-degree spherical image, the 360-degree spherical image being obtained by acquiring a planar image of the three-dimensional scene and projecting the planar image of the three-dimensional scene onto a surface of a spherical model.
  8. An apparatus for controlling a target device, characterized by comprising:
    a first acquiring module, configured to acquire first input information of a user through a user interface, wherein the user interface comprises an image of a three-dimensional scene, the three-dimensional scene comprises at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image;
    a determining module, configured to determine the target device according to the first input information acquired by the first acquiring module; and
    a control module, configured to control the target device determined by the determining module.
  9. The apparatus of claim 8, wherein the control module is specifically configured to:
    acquire, through the user interface, second input information input by the user, wherein the user interface presents the function types of the target device that the user can control, and the second input information is used to control an operating parameter of the target device; and
    control the target device according to the second input information.
  10. The apparatus of claim 8 or 9, wherein the apparatus further comprises:
    a second acquiring module, configured to acquire, through the user interface, third input information input by the user, wherein the user interface presents the management items of the target device that are available to the user for device management; and
    a management module, configured to manage the target device according to the third input information.
  11. The apparatus of any one of claims 8 to 10, wherein the determining module is specifically configured to:
    determine the coordinates of the target device in the image according to the first input information;
    determine the device identifier of the target device according to the coordinates and a pre-stored correspondence between coordinates and device identifiers; and
    determine the target device according to the device identifier of the target device.
  12. The apparatus of any one of claims 8 to 11, wherein the image comprises a two-dimensional (2D) photograph, a panoramic photograph, or a 360-degree spherical photograph.
  13. The apparatus of any one of claims 8 to 12, wherein the first acquiring module is specifically configured to:
    acquire the first input information of the user through a user interface of a virtual reality (VR) device, wherein the image of the three-dimensional scene in the user interface is a stereoscopic image, and the first input information is information input by the user through an interaction device of the VR device.
  14. The apparatus of claim 13, wherein the stereoscopic image is a 360-degree spherical image, the 360-degree spherical image being obtained by acquiring a planar image of the three-dimensional scene and projecting the planar image of the three-dimensional scene onto a surface of a spherical model.
  15. An apparatus for controlling a target device, characterized by comprising: a memory, a processor, an input/output interface, a communication interface, and a bus system, wherein the memory, the processor, the input/output interface, and the communication interface are connected by the bus system;
    the input/output interface is configured to acquire first input information of a user through a user interface, wherein the user interface comprises an image of a three-dimensional scene, the three-dimensional scene comprises at least one device controllable by the user, and the first input information is used to select, among the at least one device, the position of a target device in the image; and
    the processor is configured to determine the target device according to the first input information acquired by the input/output interface, and to control the determined target device.
  16. The apparatus of claim 15, wherein the processor is specifically configured to:
    acquire, through the user interface, second input information input by the user, wherein the user interface presents the function types of the target device that the user can control, and the second input information is used to control an operating parameter of the target device; and
    control the target device according to the second input information.
  17. The apparatus of claim 15 or 16, wherein the input/output interface is further configured to:
    acquire, through the user interface, third input information input by the user, wherein the user interface presents the management items of the target device that are available to the user for device management; and
    the processor is further configured to manage the target device according to the third input information.
  18. The apparatus of any one of claims 15 to 17, wherein the processor is specifically configured to:
    determine the coordinates of the target device in the image according to the first input information;
    determine the device identifier of the target device according to the coordinates and a pre-stored correspondence between coordinates and device identifiers; and
    determine the target device according to the device identifier of the target device.
  19. The apparatus of any one of claims 15 to 18, wherein the image comprises a two-dimensional (2D) photograph, a panoramic photograph, or a 360-degree spherical photograph.
  20. The apparatus of any one of claims 15 to 19, wherein the input/output interface is specifically configured to:
    acquire the first input information of the user through a user interface of a virtual reality (VR) device, wherein the image of the three-dimensional scene in the user interface is a stereoscopic image, and the first input information is information input by the user through an interaction device of the VR device.
  21. The apparatus of claim 20, wherein the stereoscopic image is a 360-degree spherical image, the 360-degree spherical image being obtained by acquiring a planar image of the three-dimensional scene and projecting the planar image of the three-dimensional scene onto a surface of a spherical model.
CN201680066789.8A 2016-03-04 2016-03-04 Method and apparatus for controlling a target device Pending CN108353151A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/075667 WO2017147909A1 (en) 2016-03-04 2016-03-04 Target device control method and apparatus

Publications (1)

Publication Number Publication Date
CN108353151A (en) 2018-07-31

Family

Family ID: 59743403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680066789.8A Pending CN108353151A (en) Method and apparatus for controlling a target device

Country Status (2)

Country Link
CN (1) CN108353151A (en)
WO (1) WO2017147909A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112968819A (en) * 2021-01-18 2021-06-15 珠海格力电器股份有限公司 Household appliance control method and device based on TOF

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10559194B2 (en) * 2018-02-23 2020-02-11 Samsung Electronics Co., Ltd. System and method for providing customized connected device functionality and for operating a connected device via an alternate object
CN109507904B (en) * 2018-12-18 2022-04-01 珠海格力电器股份有限公司 Household equipment management method, server and management system
CN110047135A (en) * 2019-04-22 2019-07-23 广州影子科技有限公司 Management method, managing device and the management system of cultivation task
CN110780598B (en) * 2019-10-24 2023-05-16 深圳传音控股股份有限公司 Intelligent device control method and device, electronic device and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294024A (en) * 2013-04-09 2013-09-11 宁波杜亚机电技术有限公司 Intelligent home system control method
CN104181884A (en) * 2014-08-11 2014-12-03 厦门立林科技有限公司 Device and method for controlling intelligent home based on panoramic view
CN104468837A (en) * 2014-12-29 2015-03-25 小米科技有限责任公司 Intelligent device binding method and device
CN105141913A (en) * 2015-08-18 2015-12-09 华为技术有限公司 Method and system for visually and remotely controlling touch control equipment and relevant equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662374A (en) * 2012-05-11 2012-09-12 刘书军 Home furnishing control system and method based on real-scene interface
KR20140109020A (en) * 2013-03-05 2014-09-15 한국전자통신연구원 Apparatus amd method for constructing device information for smart appliances control
CN103246267B (en) * 2013-04-29 2016-07-06 鸿富锦精密工业(深圳)有限公司 There is remote control and the interface creating method thereof of three-dimensional user interface
CN105022281A (en) * 2015-07-29 2015-11-04 中国电子科技集团公司第十五研究所 Intelligent household control system based on virtual reality
CN105373001A (en) * 2015-10-29 2016-03-02 小米科技有限责任公司 Control method and device for electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294024A (en) * 2013-04-09 2013-09-11 宁波杜亚机电技术有限公司 Intelligent home system control method
CN104181884A (en) * 2014-08-11 2014-12-03 厦门立林科技有限公司 Device and method for controlling intelligent home based on panoramic view
CN104468837A (en) * 2014-12-29 2015-03-25 小米科技有限责任公司 Intelligent device binding method and device
CN105141913A (en) * 2015-08-18 2015-12-09 华为技术有限公司 Method and system for visually and remotely controlling touch control equipment and relevant equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112968819A (en) * 2021-01-18 2021-06-15 珠海格力电器股份有限公司 Household appliance control method and device based on TOF
CN112968819B (en) * 2021-01-18 2022-07-22 珠海格力电器股份有限公司 Household appliance control method and device based on TOF

Also Published As

Publication number Publication date
WO2017147909A1 (en) 2017-09-08

Similar Documents

Publication Publication Date Title
EP3167446B1 (en) Apparatus and method for supplying content aware photo filters
CN108353151A (en) The control method and device of target device
JP2022537614A (en) Multi-virtual character control method, device, and computer program
KR20170122725A (en) Modifying scenes of augmented reality using markers with parameters
JP7208549B2 (en) VIRTUAL SPACE CONTROL DEVICE, CONTROL METHOD THEREOF, AND PROGRAM
EP2814000A1 (en) Image processing apparatus, image processing method, and program
CN108805989B (en) Scene crossing method and device, storage medium and terminal equipment
KR20170112406A (en) Apparatus and method for taking a picture with avatar in augmented reality
JP6677890B2 (en) Information processing system, its control method and program, and information processing apparatus, its control method and program
KR102091662B1 (en) Real-time method for rendering 3d modeling
CN108629799B (en) Method and equipment for realizing augmented reality
JP2016122392A (en) Information processing apparatus, information processing system, control method and program of the same
CN111968246A (en) Scene switching method and device, electronic equipment and storage medium
JP6700845B2 (en) Information processing apparatus, information processing method, and program
US20140142900A1 (en) Information processing apparatus, information processing method, and program
CN112767248A (en) Infrared camera picture splicing method, device and equipment and readable storage medium
CN109324748B (en) Equipment control method, electronic equipment and storage medium
CN112819559A (en) Article comparison method and device
JP2017084215A (en) Information processing system, control method thereof, and program
JP5999236B2 (en) INFORMATION PROCESSING SYSTEM, ITS CONTROL METHOD, AND PROGRAM, AND INFORMATION PROCESSING DEVICE, ITS CONTROL METHOD, AND PROGRAM
CN110764841B (en) 3D visual application development platform and development method
CN114610143A (en) Method, device, equipment and storage medium for equipment control
JP6967150B2 (en) Learning device, image generator, learning method, image generation method and program
US20160070822A1 (en) Method, Apparatus and Computer Program Code for Design and Visualization of a Physical Object
CN111563956A (en) Three-dimensional display method, device, equipment and medium for two-dimensional picture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180731