CN107132769B - Intelligent equipment control method and device - Google Patents

Intelligent equipment control method and device

Info

Publication number
CN107132769B
CN107132769B (application CN201710264966.XA)
Authority
CN
China
Prior art keywords
equipment
controllable
picture
intelligent
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710264966.XA
Other languages
Chinese (zh)
Other versions
CN107132769A (en)
Inventor
高毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201710264966.XA
Publication of CN107132769A
Application granted
Publication of CN107132769B
Legal status: Active (anticipated expiration not listed)

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 15/00: Systems controlled by a computer
    • G05B 15/02: Systems controlled by a computer electric
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B 2219/00: Program-control systems
    • G05B 2219/20: Pc systems
    • G05B 2219/26: Pc applications
    • G05B 2219/2642: Domotique, domestic, home control, automation, smart house

Abstract

The disclosure relates to an intelligent device control method and apparatus, wherein the method includes the following steps: when the VR live-view mode of the terminal device is triggered, starting the camera device and displaying the framing picture of the camera device; when the confirmation control is triggered, acquiring a first picture corresponding to the framing picture and marking, in the first picture, a first position where the cursor is located; determining a first smart device based on the first picture and the first position; displaying the operations executable by the first smart device; and controlling the first smart device to execute the operation selected by the user. With the intelligent device control method and apparatus provided by the embodiments of the disclosure, a user watching videos or playing games with a VR headset can control a controllable smart device without taking the headset off; the operation flow is simple and convenient, and the user experience is improved.

Description

Intelligent equipment control method and device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for controlling an intelligent device.
Background
With the development of virtual reality (VR) technology, VR helmets and VR glasses of all kinds are rapidly emerging and being accepted by consumers. When a user wearing VR glasses watches a movie or plays a game and wants to control a smart device in a smart home, the user has to take off the VR glasses first, which degrades the user experience.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides an intelligent device control method and apparatus.
According to a first aspect of the embodiments of the present disclosure, there is provided an intelligent device control method, including:
starting a camera device and displaying a framing picture of the camera device when the VR live-view mode of the terminal device is triggered; when the confirmation control is triggered, acquiring a first picture corresponding to the framing picture, and marking a first position where a cursor is located in the first picture; determining a first smart device based on the first picture and the first position; displaying operations executable by the first smart device; and controlling the first smart device to execute the operation selected by the user.
In one possible implementation, determining a first smart device based on the first picture and the first location includes:
determining a device to be identified that corresponds to the first position in the first picture; and determining a controllable smart device in a device library that matches the device to be identified as the first smart device.
In one possible implementation manner, determining a controllable smart device in a device library that matches the device to be identified as the first smart device includes:
when there are a plurality of controllable smart devices matching the device to be identified, displaying the controllable smart devices to be selected; and determining the controllable smart device selected by the user as the first smart device.
In a possible implementation manner, the device library is configured to store basic information of the controllable intelligent device, where the basic information includes at least one of:
a name of the controllable smart device, an appearance picture, an executable operation, and a spatial location of the controllable smart device.
In one possible implementation, the displaying the controllable smart device to be selected includes:
the name and/or spatial location of the controllable smart device to be selected is displayed.
In one possible implementation manner, determining a controllable smart device in a device library that matches the device to be identified as the first smart device includes:
identifying, by using an image recognition technology, the appearance pictures stored in the device library, and determining the controllable smart device matching the device to be identified as the first smart device.
According to a second aspect of the embodiments of the present disclosure, there is provided an intelligent device control apparatus, including:
the mode switching module is used for starting the camera device and displaying a framing picture of the camera device under the condition that the VR live-action mode of the terminal equipment is triggered;
the picture determining module is used for acquiring a first picture corresponding to the framing picture when the confirmation control is triggered, and marking a first position where the cursor is located in the first picture;
a device determination module configured to determine a first smart device based on the first picture and the first location;
the operation display module is used for displaying the operation executable by the first intelligent equipment;
and the equipment control module is used for controlling the first intelligent equipment to execute the operation selected by the user.
In one possible implementation, the device determination module includes:
a first device determining submodule, configured to determine a device to be identified in the first picture, where the device to be identified corresponds to the first position;
and the second equipment determining submodule is used for determining the controllable intelligent equipment matched with the equipment to be identified in the equipment library as the first intelligent equipment.
In one possible implementation manner, the second device determination sub-module includes:
the first determining submodule is used for displaying the controllable smart devices to be selected when there are a plurality of controllable smart devices matching the device to be identified;
and the second determining submodule is used for determining the controllable intelligent equipment selected by the user as the first intelligent equipment.
In a possible implementation manner, the device library is configured to store basic information of the controllable intelligent device, where the basic information includes at least one of:
a name of the controllable smart device, an appearance picture, an executable operation, and a spatial location of the controllable smart device.
In one possible implementation, the first determining sub-module includes:
and the equipment display submodule is used for displaying the name and/or the spatial position of the controllable intelligent equipment to be selected.
In one possible implementation manner, the second device determination sub-module includes:
and the device identification submodule is used for identifying, by using an image recognition technology, the appearance pictures stored in the device library, and determining the controllable smart device matching the device to be identified as the first smart device.
According to a third aspect of the embodiments of the present disclosure, there is provided an intelligent device control apparatus including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: start a camera device and display a framing picture of the camera device when the VR live-view mode of the terminal device is triggered; when the confirmation control is triggered, acquire a first picture corresponding to the framing picture, and mark a first position where a cursor is located in the first picture; determine a first smart device based on the first picture and the first position; display operations executable by the first smart device; and control the first smart device to execute the operation selected by the user.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having instructions stored thereon which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform an intelligent device control method, the method including: starting a camera device and displaying a framing picture of the camera device when the VR live-view mode of the terminal device is triggered; when the confirmation control is triggered, acquiring a first picture corresponding to the framing picture, and marking a first position where a cursor is located in the first picture; determining a first smart device based on the first picture and the first position; displaying operations executable by the first smart device; and controlling the first smart device to execute the operation selected by the user.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: when the user watches videos or plays games with a VR headset, the user can control controllable smart devices without taking the headset off; the operation flow is simple and convenient, and the user experience is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a smart device control method according to an exemplary embodiment;
FIG. 2 is a diagram illustrating an application scenario of button and cursor display in accordance with an illustrative embodiment;
FIG. 3 is a schematic diagram illustrating the acquisition of a first picture in accordance with an exemplary embodiment;
FIG. 4 is a flowchart illustrating smart device control method step S13, according to an example embodiment;
FIG. 5 is a flowchart illustrating smart device control method step S132, according to an example embodiment;
FIG. 6 is a block diagram illustrating an intelligent device control apparatus in accordance with an exemplary embodiment;
FIG. 7 is a block diagram illustrating an intelligent device control apparatus according to an exemplary embodiment;
FIG. 8 is a block diagram illustrating an intelligent device control apparatus in accordance with an exemplary embodiment;
FIG. 9 is a block diagram illustrating an intelligent device control apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating a smart device control method according to an exemplary embodiment, and as shown in fig. 1, the method may include the following steps S11 to S15.
In step S11, when the VR live-view mode of the terminal device is triggered, the camera device is started and the framing picture of the camera device is displayed.
As an example of this embodiment, the terminal device may provide two working modes for the user, namely a VR normal mode and a VR live-view mode, and display a button for each mode in the display interface, so that the user can click the buttons to switch between the modes. Here, the VR headset may be a VR head-mounted display device, e.g., VR glasses or a VR helmet. When the VR normal mode of the terminal device is triggered, for example by clicking the VR normal mode button, the terminal device enters the VR normal mode, and the user can watch videos, play games, and so on through the VR headset. When the VR live-view mode of the terminal device is triggered, for example by clicking the VR live-view mode button, the terminal device enters the VR live-view mode, and the user can view the framing picture of the camera device of the terminal device.
Fig. 2 is a schematic diagram illustrating an application scenario of button and cursor display according to an exemplary embodiment; as shown in fig. 2, a cursor 1 can move in the display interface. The cursor, the VR normal mode button, and the VR live-view mode button may be given different transparencies according to the user's needs, for example 50%, so that they do not block the content displayed in the interface and the user's viewing experience is preserved.
As an example of this embodiment, the working mode triggered by the cursor may be determined by recognizing the position of the cursor in the display interface and its dwell time. When the dwell time of the cursor on the VR normal mode button or the VR live-view mode button is greater than or equal to a threshold time, the corresponding mode is determined to be triggered. The threshold time may be, for example, 1 second, which is not limited herein. The user can move the cursor in the display interface of the VR headset by rotating the head. In this way, the VR headset plays the framing picture, so that the user can watch the real scene of the environment without taking off the VR headset. The camera device may be a camera on the VR headset, or a camera of a terminal device, such as a mobile phone, mounted in the VR headset, which is not limited herein.
As an example of this embodiment, the cursor may also be moved by a cursor control device such as a handle, which can likewise be used to click controls such as buttons in the display interface.
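As an illustrative sketch (not part of the original disclosure), the dwell-time triggering described above can be implemented along the following lines; the class name, the `update` interface, and the one-second default are assumptions for illustration:

```python
import time

# 1 second is the example threshold from the description; it is configurable.
DWELL_THRESHOLD_S = 1.0

class DwellSelector:
    """Fires a target (a button, device, or operation) once the cursor has
    stayed on it for at least the threshold time."""

    def __init__(self, threshold_s=DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self._current = None      # target currently under the cursor
        self._entered_at = 0.0    # time the cursor entered that target

    def update(self, target, now=None):
        """Call once per frame with the target under the cursor (or None).
        Returns the target when its dwell time reaches the threshold."""
        now = time.monotonic() if now is None else now
        if target != self._current:
            self._current = target
            self._entered_at = now
            return None
        if target is not None and now - self._entered_at >= self.threshold_s:
            self._current = None  # reset so the trigger fires only once
            return target
        return None
```

The same selector can serve for mode buttons, candidate devices, and executable operations, since all three are triggered by cursor position plus dwell time in the description.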
In step S12, when the confirmation control is triggered, a first picture corresponding to the framing picture is acquired, and a first position where the cursor is located is marked in the first picture.
As an example of this embodiment, when the confirmation control is triggered, the framing picture at the current moment may be shot or captured to obtain the first picture; the position of the cursor in the display interface of the VR headset is then identified to determine the first position of the cursor in the first picture. Fig. 3 is a schematic diagram illustrating the acquisition of a first picture according to an exemplary embodiment; as shown in fig. 3, the first picture selected by the user is acquired, and the first position where the cursor 1 is located is determined.
In step S13, a first smart device is determined based on the first picture and the first location.
Fig. 4 is a flowchart illustrating step S13 of the smart device control method according to an exemplary embodiment.
In one possible implementation, as shown in fig. 4, step S13 may include steps S131 to S132.
In step S131, the device to be identified corresponding to the first position in the first picture is determined.
As an example of this implementation, when there are multiple devices to be identified in the first picture, the device to be identified that the user selected may be determined from the first position, which improves the accuracy of the determination and saves identification time.
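A minimal sketch of how the device to be identified might be isolated from the first picture, assuming the picture is represented as a 2-D array of pixels and the window size is a free parameter (both are assumptions for illustration, not details from the disclosure):

```python
def crop_around_cursor(picture, first_position, window=64):
    """Crop a square region of the first picture centred on the marked cursor
    position; the region can then be matched against the device library.
    `picture` is a list of pixel rows; `window` is an assumed parameter."""
    x, y = first_position
    height, width = len(picture), len(picture[0])
    left = max(0, x - window // 2)
    top = max(0, y - window // 2)
    right = min(width, left + window)
    bottom = min(height, top + window)
    return [row[left:right] for row in picture[top:bottom]]
```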
In step S132, the controllable smart device in the device library that matches the device to be identified is determined as the first smart device.
As an example of this implementation, in a case where there is one controllable smart device that matches the device to be identified, the controllable smart device is determined as the first smart device.
In a possible implementation manner, the device library is configured to store basic information of the controllable intelligent device, where the basic information includes at least one of the following: name of the controllable smart device, appearance picture, executable operations, and spatial location of the controllable smart device.
As an example of this implementation, the appearance pictures may include pictures representing the appearance characteristics of the controllable smart device, such as its front view, left view, right view, and top view. The name of the controllable smart device may indicate its functional use, e.g., "purifier". The name may also include the spatial location of the device: for example, a purifier placed in a bedroom may be named "purifier-bedroom", and one placed in a living room "purifier-living room". In this way, the user can distinguish controllable smart devices by name. The basic information may also include a picture identifier and/or a device ID corresponding to the operations the controllable smart device can perform.
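For illustration only, the basic information described above might be modeled as a simple record type; every field name here is an assumption, not terminology fixed by the disclosure:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ControllableDevice:
    """One entry of the device library's basic information."""
    device_id: str                  # used to address operation instructions
    name: str                       # e.g. "purifier-bedroom"
    appearance_pictures: List[str] = field(default_factory=list)   # front/left/right/top views
    executable_operations: List[str] = field(default_factory=list)
    spatial_location: Optional[str] = None  # e.g. "bedroom"

device_library = [
    ControllableDevice("dev-001", "purifier-bedroom",
                       ["purifier_front.jpg", "purifier_top.jpg"],
                       ["power on", "power off", "sleep mode"],
                       "bedroom"),
    ControllableDevice("dev-002", "purifier-living room",
                       ["purifier_front.jpg"],
                       ["power on", "power off"],
                       "living room"),
]
```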
In one possible implementation, step S132 may include: identifying, by using an image recognition technology, the appearance pictures stored in the device library, and determining the controllable smart device matching the device to be identified as the first smart device.
It should be noted that, a person skilled in the art may set a matching manner between the device to be identified and the controllable intelligent device according to actual needs, which is not limited herein.
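As a stand-in for the unspecified image recognition technology, the following sketch matches the region around the cursor against stored appearance pictures using a coarse intensity-histogram comparison; a real implementation would use a proper recognition method (e.g. feature or template matching), and all names and the threshold here are illustrative assumptions:

```python
def histogram(pixels, bins=16):
    """Coarse intensity histogram of an 8-bit grayscale image given as a flat pixel list."""
    h = [0] * bins
    for p in pixels:
        h[min(p * bins // 256, bins - 1)] += 1
    total = float(len(pixels)) or 1.0
    return [c / total for c in h]

def similarity(a, b):
    """Histogram intersection in [0, 1]; 1.0 means identical intensity distributions."""
    return sum(min(x, y) for x, y in zip(histogram(a), histogram(b)))

def match_device(region_pixels, device_library, threshold=0.8):
    """Compare the region around the cursor with each stored appearance picture
    and return the best-matching device, or None if nothing clears the threshold."""
    best, best_score = None, threshold
    for device in device_library:
        for picture_pixels in device["appearance_pictures"]:
            score = similarity(region_pixels, picture_pixels)
            if score >= best_score:
                best, best_score = device, score
    return best
```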
Fig. 5 is a flowchart illustrating step S132 of the smart device control method according to an exemplary embodiment.
In one possible implementation, as shown in fig. 5, step S132 may further include step S1321 and step S1322.
In step S1321, when there are a plurality of controllable smart devices matching the device to be identified, the controllable smart devices to be selected are displayed.
As an example of this implementation, the controllable smart devices to be selected may be displayed in the form of a list and/or pictures.
In one possible implementation, displaying controllable smart devices to be selected includes: the name and/or spatial location of the controllable smart device to be selected is displayed.
In this way, the user can view each controllable smart device to be selected more intuitively. It should be noted that a person skilled in the art may set the display mode and display content of the controllable smart devices to be selected according to actual requirements, which is not limited herein.
In step S1322, the controllable smart device selected by the user is determined as the first smart device.
As an example of this implementation, the controllable smart device selected by the user may be determined by identifying the position of the cursor in the display interface and its dwell time. For example, when the cursor stays at the display position of a controllable smart device 2 to be selected for a time greater than or equal to the threshold time, the controllable smart device 2 is determined as the first smart device. The selection can also be confirmed through a cursor control device such as a handle.
In step S14, operations executable by the first smart device are displayed.
As an example of this embodiment, the operations executable by the first smart device may be displayed in the form of a list, or displayed on an image of the first smart device, so that the user can understand and select them. A picture identifier corresponding to each executable operation may also be displayed to help the user understand the specific meaning of each operation.
In step S15, the first smart device is controlled to perform the operation selected by the user.
As an example of this embodiment, the operation selected by the user may be determined by recognizing the position of the cursor in the display interface and its dwell time. For example, if the dwell time of the cursor at the position of an executable operation is greater than or equal to the threshold time, the user is determined to have selected that operation. The selected operation can also be confirmed through a cursor control device such as a handle.
As an example of this embodiment, the device ID of the first smart device is obtained, and an operation instruction generated from the operation selected by the user is sent to the smart device corresponding to that device ID, so that the smart device executes the corresponding operation upon receiving the instruction.
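The instruction dispatch described above can be sketched as follows; the JSON wire format and both function names are assumptions for illustration, not the disclosure's actual protocol:

```python
import json

def build_operation_instruction(device_id, operation):
    """Serialize an operation instruction generated from the user's selection.
    The JSON format here is a hypothetical example."""
    return json.dumps({"device_id": device_id, "operation": operation})

def dispatch(instruction, send):
    """Deliver the instruction to the device addressed by its device ID.
    `send(device_id, message)` abstracts the transport (e.g. the network
    connection set up via Wi-Fi or a cloud service); returns the device ID."""
    payload = json.loads(instruction)
    send(payload["device_id"], instruction)
    return payload["device_id"]
```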
As an example of this embodiment, when a user installs a controllable smart device in an environment such as a home or an office, the user may configure the controllable smart device through application software installed on a terminal device such as a mobile phone, including:
First, the communication connection between the controllable smart device and the terminal device is set up, for example its Wireless Fidelity (Wi-Fi) or Bluetooth connection, so that the controllable smart device can join the network and receive and execute the corresponding operation instructions over it.
Second, the basic information of the controllable smart device is set. The device may be named after its functional use and installation position to distinguish different controllable smart devices. The appearance picture may be a real photograph of the device taken by the user, or a product promotion picture downloaded from the device's product information, which is not limited herein.
Third, a cloud service account and password are set, and the basic information of the controllable smart device is stored in the device library of the cloud service account, so that the user can log in to the account on different smart terminals and control the controllable smart devices in the device library.
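The per-account device library might look like the following minimal sketch; the class and method names are assumptions for illustration:

```python
class CloudDeviceLibrary:
    """Minimal sketch of a device library keyed by cloud service account."""

    def __init__(self):
        self._accounts = {}  # account -> {device name -> basic information}

    def register(self, account, name, basic_info):
        """Store a device's basic information under the cloud service account."""
        self._accounts.setdefault(account, {})[name] = basic_info

    def devices(self, account):
        """Devices visible after logging in to the account on any terminal."""
        return self._accounts.get(account, {})
```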
With the intelligent device control method provided by the embodiments of the present disclosure, a user watching videos or playing games with a VR headset can control a controllable smart device without taking the headset off; the operation flow is simple and convenient, and the user experience is improved.
Fig. 6 is a block diagram illustrating an intelligent device control apparatus according to an exemplary embodiment. As shown in fig. 6, the apparatus may include: a mode switching module 61, a picture determination module 62, a device determination module 63, an operation display module 64, and a device control module 65. The mode switching module 61 is configured to start the camera device and display the framing picture of the camera device when the VR live-view mode of the terminal device is triggered. The picture determination module 62 is configured to acquire a first picture corresponding to the framing picture when the confirmation control is triggered, and to mark a first position where the cursor is located in the first picture. The device determination module 63 is configured to determine a first smart device based on the first picture and the first position. The operation display module 64 is configured to display the operations that the first smart device can perform. The device control module 65 is configured to control the first smart device to perform the operation selected by the user.
Fig. 7 is a block diagram of an intelligent device control apparatus according to an example of an example embodiment.
In one possible implementation, as shown in fig. 7, the device determining module 63 may include: a first device determination sub-module 631 and a second device determination sub-module 632. The first device determining sub-module 631 is configured to determine the device to be identified in the first picture corresponding to the first position. The second device determining submodule 632 is configured to determine the controllable smart device in the device library that matches the device to be identified as the first smart device.
In one possible implementation, as shown in fig. 7, the second device determination submodule 632 may include a first determination submodule 6321 and a second determination submodule 6322. The first determining submodule 6321 is configured to display the controllable smart device to be selected when there are a plurality of controllable smart devices matched with the device to be identified. The second determining submodule 6322 is configured to determine the controllable smart device selected by the user as the first smart device.
In a possible implementation manner, the device library is configured to store basic information of the controllable intelligent device, where the basic information includes at least one of the following: name of the controllable smart device, appearance picture, executable operations, and spatial location of the controllable smart device.
In one possible implementation, the first determination submodule may include a device display submodule. The device display sub-module is configured to display a name and/or a spatial location of the controllable smart device to be selected.
In one possible implementation, the second device determining sub-module 632 may include a device identifying sub-module configured to identify an appearance picture stored in a device library by using an image recognition technology, and determine a controllable smart device matching the device to be identified as the first smart device.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
With the intelligent device control apparatus provided by the embodiments of the present disclosure, a user watching videos or playing games with a VR headset can control a controllable smart device without taking the headset off; the operation flow is simple and convenient, and the user experience is improved.
Fig. 8 is a block diagram illustrating an intelligent device control apparatus 800 according to an example embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 8, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800; the sensor assembly 814 may also detect a change in the position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 9 is a block diagram illustrating a smart device control apparatus 1900 according to an example embodiment. For example, the apparatus 1900 may be provided as a server. Referring to fig. 9, the device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by the processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The device 1900 may also include a power component 1926 configured to perform power management of the device 1900, a wired or wireless network interface 1950 configured to connect the device 1900 to a network, and an input/output (I/O) interface 1958. The device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided that includes instructions, such as the memory 1932 that includes instructions, which are executable by the processing component 1922 of the apparatus 1900 to perform the above-described method. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. An intelligent device control method, comprising:
displaying, on a display interface of the terminal equipment, a trigger button corresponding to a working mode of the terminal equipment, so that a user switches between different working modes by triggering the button, wherein the working modes comprise a VR conventional mode and a VR live-action mode, and the trigger button comprises a VR conventional mode button and a VR live-action mode button;
starting a camera device and displaying a framing picture of the camera device under the condition that a VR live-action mode button of the terminal equipment is triggered;
under the condition that the control is confirmed to be triggered, acquiring a first picture corresponding to a framing picture, and marking a first position where a cursor is located in the first picture;
determining a first smart device based on the first picture and the first location;
displaying operations executable by the first smart device;
and controlling the first intelligent equipment to execute the operation selected by the user.
2. The method of claim 1, wherein determining the first smart device based on the first picture and the first location comprises:
determining equipment to be identified corresponding to the first position in the first picture;
and determining the controllable intelligent equipment matched with the equipment to be identified in the equipment library as the first intelligent equipment.
3. The method of claim 2, wherein determining a controllable smart device in a device library that matches the device to be identified as the first smart device comprises:
under the condition that a plurality of controllable intelligent devices matched with the device to be identified are available, displaying the controllable intelligent devices to be selected;
and determining the controllable intelligent device selected by the user as the first intelligent device.
4. The method of claim 3, wherein the device library is configured to store basic information of the controllable smart device, and the basic information comprises at least one of the following:
a name of the controllable smart device, an appearance picture, an executable operation, and a spatial location of the controllable smart device.
5. The method of claim 4, wherein displaying the controllable smart device to be selected comprises:
the name and/or spatial location of the controllable smart device to be selected is displayed.
6. The method of claim 4, wherein determining a controllable smart device in a device library that matches the device to be identified as the first smart device comprises:
and identifying the appearance pictures stored in the equipment library by adopting an image identification technology, and determining the controllable intelligent equipment matched with the equipment to be identified as the first intelligent equipment.
7. An intelligent device control apparatus, comprising:
the mode switching module is used for displaying, on a display interface of the terminal equipment, a trigger button corresponding to a working mode of the terminal equipment so that a user switches between different working modes by triggering the button, and for starting a camera device and displaying a framing picture of the camera device under the condition that a VR live-action mode button of the terminal equipment is triggered, wherein the working modes comprise a VR conventional mode and a VR live-action mode, and the trigger button comprises a VR conventional mode button and the VR live-action mode button;
the image determining module is used for acquiring a first image corresponding to a framing picture under the condition that the control is confirmed to be triggered, and marking a first position where a cursor is located in the first image;
a device determination module configured to determine a first smart device based on the first picture and the first location;
the operation display module is used for displaying the operation executable by the first intelligent equipment;
and the equipment control module is used for controlling the first intelligent equipment to execute the operation selected by the user.
8. The apparatus of claim 7, wherein the device determination module comprises:
a first device determining submodule, configured to determine a device to be identified in the first picture, where the device to be identified corresponds to the first position;
and the second equipment determining submodule is used for determining the controllable intelligent equipment matched with the equipment to be identified in the equipment library as the first intelligent equipment.
9. The apparatus of claim 8, wherein the second device determination submodule comprises:
the first determining submodule is used for displaying the controllable intelligent equipment to be selected under the condition that a plurality of controllable intelligent equipment matched with the equipment to be identified are available;
and the second determining submodule is used for determining the controllable intelligent equipment selected by the user as the first intelligent equipment.
10. The apparatus of claim 9, wherein the device library is configured to store basic information of the controllable smart device, and the basic information includes at least one of:
a name of the controllable smart device, an appearance picture, an executable operation, and a spatial location of the controllable smart device.
11. The apparatus of claim 10, wherein the first determination submodule comprises:
and the equipment display submodule is used for displaying the name and/or the spatial position of the controllable intelligent equipment to be selected.
12. The apparatus of claim 10, wherein the second device determination submodule comprises:
and the equipment identification submodule is used for identifying the appearance pictures stored in the equipment library by adopting an image identification technology and determining the controllable intelligent equipment matched with the equipment to be identified as the first intelligent equipment.
13. An intelligent device control apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
displaying, on a display interface of the terminal equipment, a trigger button corresponding to a working mode of the terminal equipment, so that a user switches between different working modes by triggering the button, wherein the working modes comprise a VR conventional mode and a VR live-action mode, and the trigger button comprises a VR conventional mode button and a VR live-action mode button;
starting a camera device and displaying a framing picture of the camera device under the condition that a VR live-action mode button of the terminal equipment is triggered;
under the condition that the control is confirmed to be triggered, acquiring a first picture corresponding to a framing picture, and marking a first position where a cursor is located in the first picture;
determining a first smart device based on the first picture and the first location;
displaying operations executable by the first smart device;
and controlling the first intelligent equipment to execute the operation selected by the user.
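Claims 2 to 6 determine the first smart device by matching the device at the cursor against the appearance pictures stored in the device library, falling back to user selection when several controllable devices match. A minimal sketch under stated assumptions: `Device`, `match_devices`, and `determine_first_device` are invented names, and the byte-equality comparison is only a placeholder for the image identification technology the claims refer to.

```python
from collections import namedtuple

# a device-library entry holding the 'basic information' of claim 4
Device = namedtuple("Device", "name appearance_picture spatial_location")

def match_devices(crop, library):
    """Claim 6: compare the picture at the cursor against the stored
    appearance pictures (byte equality stands in for image recognition)."""
    return [d for d in library if d.appearance_picture == crop]

def determine_first_device(crop, library, choose):
    """Claims 2-3: a unique match becomes the first smart device; several
    matches are displayed (name and/or spatial location, per claim 5)
    for the user to choose from via the choose() callback."""
    matches = match_devices(crop, library)
    if not matches:
        raise LookupError("no controllable smart device matches the picture")
    if len(matches) == 1:
        return matches[0]
    return choose(matches)   # choose() models the user's selection
```

The ambiguity case (two identical lamps in different rooms) is exactly where claim 5's display of the spatial location matters: the pair `(name, spatial_location)` is what the user would see when `choose` is invoked.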
CN201710264966.XA 2017-04-21 2017-04-21 Intelligent equipment control method and device Active CN107132769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710264966.XA CN107132769B (en) 2017-04-21 2017-04-21 Intelligent equipment control method and device


Publications (2)

Publication Number Publication Date
CN107132769A CN107132769A (en) 2017-09-05
CN107132769B true CN107132769B (en) 2020-09-01

Family

ID=59715429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710264966.XA Active CN107132769B (en) 2017-04-21 2017-04-21 Intelligent equipment control method and device

Country Status (1)

Country Link
CN (1) CN107132769B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108717373B (en) * 2018-05-28 2022-02-08 北京小米移动软件有限公司 Name display method and device of intelligent equipment and storage medium
US10725629B2 (en) 2018-06-25 2020-07-28 Google Llc Identifying and controlling smart devices
CN109276881A (en) * 2018-08-31 2019-01-29 努比亚技术有限公司 A kind of game control method, equipment
CN109324748B (en) * 2018-09-05 2021-12-24 联想(北京)有限公司 Equipment control method, electronic equipment and storage medium
CN113225549B (en) * 2021-04-19 2022-07-01 广州朗国电子科技股份有限公司 VR intelligence life system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138123A (en) * 2015-08-24 2015-12-09 小米科技有限责任公司 Device control method and device
CN106254191A (en) * 2016-09-14 2016-12-21 深圳众乐智府科技有限公司 A kind of intelligent home device assisted location method and device
CN106445156A (en) * 2016-09-29 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Method, device and terminal for intelligent home device control based on virtual reality
CN106502118A (en) * 2016-12-21 2017-03-15 惠州Tcl移动通信有限公司 A kind of intelligent home furnishing control method and system based on AR photographic head

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103513753B (en) * 2012-06-18 2017-06-27 联想(北京)有限公司 Information processing method and electronic equipment
CN105929538A (en) * 2016-04-26 2016-09-07 乐视控股(北京)有限公司 Virtual reality equipment display method and device
CN106569409A (en) * 2016-10-13 2017-04-19 杭州鸿雁电器有限公司 Graph capturing based household equipment control system, device and method



Similar Documents

Publication Publication Date Title
EP3125530B1 (en) Video recording method and device
EP3220651B1 (en) Live video broadcasting method and device
CN107132769B (en) Intelligent equipment control method and device
EP3276976A1 (en) Method, apparatus, host terminal, server and system for processing live broadcasting information
US9667774B2 (en) Methods and devices for sending virtual information card
US20170344192A1 (en) Method and device for playing live videos
US20170304735A1 (en) Method and Apparatus for Performing Live Broadcast on Game
US9800666B2 (en) Method and client terminal for remote assistance
US20170178289A1 (en) Method, device and computer-readable storage medium for video display
CN109557999B (en) Bright screen control method and device and storage medium
CN106790043B (en) Method and device for sending message in live broadcast application
CN112019893B (en) Screen projection method of terminal and screen projection device of terminal
US10379602B2 (en) Method and device for switching environment picture
EP3147802B1 (en) Method and apparatus for processing information
US20180144546A1 (en) Method, device and terminal for processing live shows
US20180035154A1 (en) Method, Apparatus, and Storage Medium for Sharing Video
CN111212306A (en) Wheat connecting method and device, electronic equipment and storage medium
CN108174269B (en) Visual audio playing method and device
CN107797662B (en) Viewing angle control method and device and electronic equipment
CN104850643B (en) Picture comparison method and device
CN107656616B (en) Input interface display method and device and electronic equipment
CN106896917B (en) Method and device for assisting user in experiencing virtual reality and electronic equipment
CN106454540B (en) Method and device for processing interactive information based on live broadcast
CN112261453A (en) Method, device and storage medium for transmitting subtitle splicing map
CN108986803B (en) Scene control method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant