CN105760106B - Smart home device interaction method and apparatus - Google Patents

Smart home device interaction method and apparatus

Info

Publication number
CN105760106B
Authority
CN
China
Prior art keywords
controlled device
information
image acquisition
scene
acquisition unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610130884.1A
Other languages
Chinese (zh)
Other versions
CN105760106A
Inventor
赵辰 (Zhao Chen)
丛林 (Cong Lin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yixian Advanced Technology Co.,Ltd.
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority claimed: CN201610130884.1A
Publication of CN105760106A
Application granted
Publication of CN105760106B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to an output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide a smart home device interaction method, comprising: capturing images of the current scene through an image acquisition unit, and presenting the captured real-time image on a display screen corresponding to the image acquisition unit; determining the controlled devices within the real-time imaging range of the image acquisition unit; and marking the controlled devices at the corresponding positions in the real-time image, and/or displaying a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying a corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position of the display screen. By marking the controlled devices on the real-time scene image, the user can intuitively locate the operable controlled devices in it, achieving 'what you see is what you get' and 'what you see is what you control', which significantly reduces the difficulty of user operation and improves the user experience. Embodiments of the present invention also provide a smart home device interaction apparatus.

Description

Smart home device interaction method and apparatus
Technical field
Embodiments of the present invention relate to the field of Internet of Things technology, and more specifically, to a smart home device interaction method and apparatus.
Background art
This section is intended to provide a background or context for the embodiments of the invention recited in the claims. The description herein is not admitted to be prior art merely by its inclusion in this section.
A smart home addresses each electrical appliance, lighting element and movable component (such as a curtain) in the home with a Uniform Resource Identifier (URI) over an underlying Internet of Things protocol, and communicates device state following the principles of Representational State Transfer (REST).
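To make this addressing scheme concrete, the sketch below reads and sets one device's state over HTTP in REST style; the URI layout, JSON fields and network address are illustrative assumptions, not part of the patent.

```python
import requests

# Hypothetical URI for one controlled device; real deployments define their own scheme.
LAMP_URI = "http://192.168.1.10/home/living-room/lamp/1"

def get_state(device_uri: str) -> dict:
    """Read the device's current state (REST: GET returns a representation)."""
    return requests.get(device_uri, timeout=2.0).json()

def set_state(device_uri: str, **state) -> None:
    """Update the device's state (REST: PUT transfers a new representation)."""
    requests.put(device_uri, json=state, timeout=2.0).raise_for_status()

if __name__ == "__main__":
    print(get_state(LAMP_URI))                     # e.g. {"power": "off", "brightness": 0}
    set_state(LAMP_URI, power="on", brightness=80)
```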
As for the way users interact with smart home devices: because the number of interactive elements in a smart home system grows exponentially, the physical-level interaction mode of traditional standalone home appliances (interaction through switches, remote controls and the like) is no longer suitable for smart home devices.
Summary of the invention
Currently there are two main interaction modes for smart home devices. The first is to interact through an APP (application client) installed on a mobile device: the smart home devices are grouped, and the interactive interface is presented in the APP as layered lists, e.g. living room - lamp - lamp 1. To interact with a given device, the user must open the lists level by level to locate it, and must also remember the code of each home device (e.g. the lamp numbering above); if the user forgets, several attempts may be needed. Clearly, under this mode the interface is neither concise nor intuitive, the number of operation steps grows, operation becomes relatively complex, and the difficulty of use rises, degrading the user experience. The second is to interact through a natural user interface (NUI), using voice, gestures and other natural interaction channels. Under this mode, registration of the smart home devices is the bottleneck: for example, for four pendant lamps placed side by side in a living room, the user must code them in advance (lamp 1, lamp 2, and so on), and different home devices must map to different voice commands or gestures. Again, the user must accurately remember each lamp's code and the gesture or voice command defined for it in order to interact. This mode is likewise not intuitive enough and increases the difficulty of operation.
Thus, an improved smart home device interaction method is highly desirable, one that reduces the difficulty of user operation and improves the convenience of interacting with home devices.
In this context, embodiments of the present invention are intended to provide a smart home device interaction method and apparatus.
In a first aspect of the embodiments of the present invention, a smart home device interaction method is provided, comprising:
capturing images of the current scene through an image acquisition unit, and presenting the captured real-time image on a display screen corresponding to the image acquisition unit, the current scene containing a plurality of controlled devices;
determining the controlled devices within the real-time imaging range of the image acquisition unit;
marking the controlled devices at the corresponding positions in the captured real-time image presented on the display screen, and/or displaying a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying a corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position of the display screen.
In a second aspect of the embodiments of the present invention, a smart home device interaction apparatus is provided, comprising:
an image acquisition unit, configured to capture images of the current scene, which contains a plurality of controlled devices;
a first display unit, configured to present the real-time image captured by the image acquisition unit;
a first determination unit, configured to determine the controlled devices within the real-time imaging range of the image acquisition unit;
a marking unit, configured to mark the controlled devices at the corresponding positions in the captured real-time image presented on the display screen; and/or
a second display unit, configured to display a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or to display a corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position of the display screen.
In a third aspect of the embodiments of the present invention, a smart home device interaction apparatus is provided, which may include, for example, a memory and a processor, wherein the processor may be configured to read a program in the memory and execute the following process: capturing images of the current scene through an image acquisition unit, and presenting the captured real-time image on a display screen corresponding to the image acquisition unit, the current scene containing a plurality of controlled devices; determining the controlled devices within the real-time imaging range of the image acquisition unit; marking the controlled devices at the corresponding positions in the captured real-time image presented on the display screen, and/or displaying a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying a corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position of the display screen.
In a fourth aspect of the embodiments of the present invention, a program product is provided, comprising program code which, when the program product is run, executes the following process: capturing images of the current scene through an image acquisition unit, and presenting the captured real-time image on a display screen corresponding to the image acquisition unit, the current scene containing a plurality of controlled devices; determining the controlled devices within the real-time imaging range of the image acquisition unit; marking the controlled devices at the corresponding positions in the captured real-time image presented on the display screen, and/or displaying a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying a corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position of the display screen.
With the smart home device interaction method and apparatus according to the embodiments of the present invention, the controlled devices can be marked on the scene image captured in real time, so that the user can intuitively locate the operable controlled devices in the currently displayed scene image. In addition, the corresponding control operation menu can be displayed on the real-time scene image according to the user's touch operation, so that the user can perform control operations on the corresponding controlled device. This achieves 'what you see is what you get' and 'what you see is what you control', significantly reducing the difficulty of user operation, improving the convenience of smart home device interaction, and bringing the user a better experience.
Brief description of the drawings
The above and other objects, features and advantages of exemplary embodiments of the present invention will become readily understood by reading the following detailed description with reference to the accompanying drawings. In the drawings, several embodiments of the present invention are shown by way of example and not limitation, in which:
Fig. 1 schematically shows an application scenario according to an embodiment of the present invention;
Fig. 2 schematically shows the flow of acquiring the three-dimensional position information of the controllable regions of the controlled devices contained in a scene, according to an embodiment of the present invention;
Fig. 3 schematically shows one possible structure of the stored data table according to an embodiment of the present invention;
Fig. 4 schematically shows the interaction initialization flow for smart home devices according to an embodiment of the present invention;
Fig. 5 schematically shows the implementation flow of the smart home device interaction method according to an embodiment of the present invention;
Fig. 6a schematically shows the flow of determining the controlled devices in the real-time image captured by the image acquisition unit, according to an embodiment of the present invention;
Fig. 6b schematically shows the flow of determining the real-time position and real-time orientation of the image acquisition unit, according to an embodiment of the present invention;
Fig. 7a schematically shows another flow of determining the real-time position and real-time orientation of the image acquisition unit, according to an embodiment of the present invention;
Fig. 7b schematically shows the display viewport according to an embodiment of the present invention;
Fig. 8 schematically shows the flow of marking in the real-time image presented on the display screen, according to an embodiment of the present invention;
Fig. 9 schematically shows the real-time image displayed on the display screen after the corresponding controlled devices have been marked, according to an embodiment of the present invention;
Fig. 10 schematically shows the control operation menu of a controlled device displayed in response to a touch operation on that controlled device in the captured real-time image presented on the display screen, according to an embodiment of the present invention;
Fig. 11 schematically shows a smart home device interaction apparatus according to another embodiment of the present invention;
Fig. 12 schematically shows the structure of a smart home interaction apparatus according to another embodiment of the present invention;
Fig. 13 schematically shows the program product of a smart home interaction apparatus according to yet another embodiment of the present invention.
In the drawings, identical or corresponding reference numerals indicate identical or corresponding parts.
Specific embodiment
The principle and spirit of the present invention are described below with reference to several exemplary embodiments. It should be understood that these embodiments are provided only so that those skilled in the art can better understand and implement the present invention, and not to limit the scope of the invention in any way. Rather, they are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Those skilled in the art will appreciate that embodiments of the present invention can be implemented as a system, apparatus, device, method or computer program product. Accordingly, the present disclosure may be embodied in the following forms: entirely hardware, entirely software (including firmware, resident software, microcode, etc.), or a combination of hardware and software.
According to embodiments of the present invention, a smart home device interaction method and apparatus are proposed.
Herein, any number of elements in the drawings is for illustration rather than limitation, and any naming is used only for distinction and has no limiting meaning.
The principle and spirit of the present invention are explained in detail below with reference to several representative embodiments of the invention.
Summary of the invention
The inventors have found that in existing smart home device interaction modes the interactive interface is neither concise nor intuitive, making user operation complex; in addition, the user needs to remember the codes or operation modes corresponding to different home devices of the same type, which increases the difficulty of operation.
In order to simplify the user's operation steps and reduce the difficulty of operation, in embodiments of the present invention, an image acquisition unit such as a camera on a terminal device may be used to capture, in real time, images of the scene containing the smart home devices, and the captured real-time image is presented on the display screen of the terminal device. The smart home devices contained in the real-time image (referred to as controlled devices in the embodiments of the present invention) are determined and marked, so that the user can intuitively see the corresponding controlled devices on the scene image presented in real time by the terminal device. In response to the user's touch operation, the control operation menu of the corresponding controlled device can also be displayed, so that the user can control that device. In the home device interaction mode provided by the embodiments of the present invention, the interactive interface is concise and intuitive, achieving 'what you see is what you get' and 'what you see is what you control', which simplifies the user's operation steps. In addition, since the user does not need to code home devices of the same type or define corresponding operation gestures for them, the difficulty of operation is reduced.
Having introduced the basic principle of the present invention, various non-limiting embodiments of the invention are described in detail below.
Application scenarios overview
Referring first to Fig. 1, which illustrates an application scenario of the smart home device interaction method provided by embodiments of the present invention. The scenario includes a plurality of smart home devices, such as an air conditioner 11, a television 12, a humidifier 13, and so on. Communication for controlling the smart home devices may follow an Internet of Things protocol, or other communication protocols.
Exemplary methods
Below, with reference to the application scenario of Fig. 1, the smart home device interaction method according to exemplary embodiments of the present invention is described with reference to Figs. 2-9. It should be noted that the above application scenario is shown merely to facilitate understanding of the spirit and principle of the present invention, and embodiments of the present invention are not limited in this respect. Rather, embodiments of the present invention can be applied to any applicable scenario.
In order to implement the embodiments of the present invention, relevant data acquisition and data preprocessing need to be performed in advance for the specific application scenario. The application scenario involved in the embodiments of the present invention may be any house in an ordinary residential estate or office building, or a room in any house, e.g. a living room or bedroom, or a hotel room, etc.
In specific implementation, the following data generally needs to be acquired: the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene; the three-dimensional position information of the controllable region of each controlled device contained in the scene; and an arbitrarily chosen region of the scene to serve as the initial position. These are introduced separately below.
1. Three-dimensional structure information of the scene and three-dimensional position information of each device in the scene.
In the embodiments of the present invention, the three-dimensional structure information of the scene and the three-dimensional position information of each device in it can be obtained in either of the following ways:
Way one: three-dimensional reconstruction from captured pictures of the scene.
Specifically, this can be implemented according to the following flow: receive images captured of the scene, and perform three-dimensional reconstruction from the received images to obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device.
For example, the structure-from-motion (SFM) technique in computer vision can be used: it recovers the corresponding three-dimensional information from two-dimensional images or video sequences, including the motion parameters of the imaging camera and the three-dimensional structure information of the scene. Its input can be a series of two-dimensional images or a video sequence; its output is the 3D model information of the scene, including the approximate dimensions of the scene and the three-dimensional position information of the vertices of each device in the scene (i.e., the output also includes the 3D models of the detail objects inside the scene).
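As an illustration of this step, a minimal two-view structure-from-motion sketch using OpenCV is given below; the intrinsic matrix K and the image paths are assumptions, and a full pipeline would extend this to many views with bundle adjustment.

```python
import cv2
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed camera intrinsics

def two_view_structure(img1_path: str, img2_path: str):
    """Recover camera motion and sparse 3D structure from two scene photos."""
    im1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    im2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(im1, None)
    kp2, des2 = sift.detectAndCompute(im2, None)
    matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test
    p1 = np.float32([kp1[m.queryIdx].pt for m in good])
    p2 = np.float32([kp2[m.trainIdx].pt for m in good])
    E, mask = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, p1, p2, K, mask=mask)     # camera motion parameters
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, p1.T, p2.T)            # sparse scene structure
    return R, t, (pts4d[:3] / pts4d[3]).T                        # Nx3 3D points
```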
Way two: receive the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene as provided by the user.
In this implementation, the three-dimensional structure drawing of the scene can usually be provided directly by the owner of the room or the developer of the house, and the three-dimensional structure information of the scene and the three-dimensional position information of each device are obtained directly from the acquired structure drawing.
2. Three-dimensional position information of the controllable regions of the controlled devices contained in the scene.
In the embodiments of the present invention, the three-dimensional position information of the controllable regions of the controlled devices contained in the scene can be acquired according to the flow shown in Fig. 2:
S21. Obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene.
In specific implementation, the three-dimensional structure information of the scene and the three-dimensional position information of each device can be acquired in either of the two ways described above, which will not be repeated here.
S22. Import the three-dimensional structure information of the scene and the three-dimensional position information of each device into a preset editing tool.
The preset editing tool provides an interactive interface for interacting with the user, through which the user imports the three-dimensional structure information of the scene and the three-dimensional position information of each device.
S23. In the editing tool, the user selects, through the interactive interface, the controlled devices from the devices of the scene and the controllable region of each controlled device.
In specific implementation, the controlled devices and their controllable regions can be registered through the interactive interface provided by the editing tool, and description information is added for each controlled device, e.g. the registered controlled device is lamp 1 in the living room. Specifically, the user may select some key vertices of a controlled device, for example its eight vertices, as the information input for registering the controlled device and selecting its controllable region; the region enclosed by the lines connecting these key vertices can be understood as the controllable region of the controlled device.
S24. Store the three-dimensional position information of the controllable region of each controlled device in the scene.
In specific implementation, after the three-dimensional position information of the controllable region of a controlled device is defined, a data table can be used to store the related data. When storing, the corresponding scene identifier and the three-dimensional position information of the controllable region of each controlled device contained in the scene need to be stored. Fig. 3 shows one possible table structure of the stored data table, where the scene identifier distinguishes different scenes (preferably, the scene identifier may indicate different residential estates, different building numbers, different houses and different rooms), and the controlled device identifier distinguishes different controlled devices.
The three-dimensional structure information of the scene and the three-dimensional position information of the controllable region of a controlled device can usually each be composed of eight vertex coordinates.
It should be noted that, in specific implementation, the description of the three-dimensional position information of a controlled device's controllable region may differ according to the shape of the device. For example, if the controlled device has a regular shape, the three-dimensional position information of its controllable region may be composed of eight vertex coordinates; but if the controlled device is embedded in a wall, such as a wall-mounted smart television, four vertex coordinates may suffice. In addition, depending on the actual scene, only some vertex coordinates may be selected when selecting a controlled device and its controllable region, the remaining vertex coordinates being determined by symmetric mapping. And if the controlled device has an irregular shape, representative key points and their coordinates can be selected as needed to describe the three-dimensional position information of its controllable region, the region enclosed by the lines connecting the selected key points representing the controlled device and its controllable region. Fig. 3 takes the case where each controlled device is described by eight vertex coordinates as an example.
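The stored table of Fig. 3 could be modeled, for example, as follows; the field names and the sample coordinates are illustrative assumptions that follow the text's convention of eight vertex coordinates for a regular-shaped device.

```python
from dataclasses import dataclass, field

@dataclass
class ControlledDevice:
    device_id: str      # distinguishes controlled devices, e.g. "lamp-1"
    description: str    # human-readable note added at registration
    vertices: list      # controllable region: usually 8 (x, y, z) corners,
                        # 4 for wall-embedded devices, or chosen key points
                        # for irregular shapes

@dataclass
class SceneRecord:
    scene_id: str       # distinguishes estate / building number / house / room
    devices: list = field(default_factory=list)

# Example record: a living-room lamp with an axis-aligned 8-vertex region.
living_room = SceneRecord("estate-A/bldg-3/apt-501/living-room")
living_room.devices.append(ControlledDevice(
    device_id="lamp-1",
    description="living room lamp 1",
    vertices=[(x, y, z) for x in (0.0, 0.3) for y in (2.4, 2.7) for z in (1.0, 1.3)],
))
```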
3. Choose an arbitrary region in the scene as the initial position for subsequent image recognition.
Preferably, in specific implementation, a region of the scene containing a high-contrast object (i.e., the first reference object in the embodiments of the present invention, where high contrast means the object differs greatly from surrounding objects in color and texture, e.g., a black object against a white background) can be chosen as the initial position for subsequent image recognition. The initial position can be specified by the user, and a benchmark image containing the above reference object is captured; the three-dimensional position information of the first reference object contained in the benchmark image, the angle information at the time the benchmark image was captured, and other related data are recorded. The angle information refers to the angular relationship between the camera direction at capture time and some reference plane of the first reference object. Taking a square object as an example: when the object is photographed head-on, its four vertices are not distorted; if there is an angle, the angle information at capture time can be determined from the degree of distortion. The angle information can be known at capture time and stored.
Based on the related data acquired above, when the user needs to interact with a controlled device contained in the scene, the method provided by the embodiments of the present invention can be used; it is described in detail below.
In specific implementation, when the user needs to interact with a controlled device in the current scene, the user can turn on the image acquisition unit on the terminal device, e.g. a camera, and roam in the scene. Based on the scene image sequence acquired during roaming, the initialization operation is completed, i.e., the initial position and initial orientation (initial shooting direction) of the image acquisition unit are determined.
The initialization process is described below with reference to Fig. 4 and may comprise the following steps:
S41. Receive the first image sequence of the current scene acquired by the image acquisition unit.
In specific implementation, while the user roams in the scene, the image acquisition unit captures images of the current scene at a preset frame rate to obtain the first image sequence, and sends the acquired first image sequence to the processing unit.
S42. Determine, from the received first image sequence, a first reference image containing the first reference object in the current scene.
After receiving the scene image sequence sent by the image acquisition unit, the processing unit identifies the images containing the above first reference object. The first reference object contained in an image is identified using existing common image recognition methods, and the scene image containing the first reference object is determined to be the first reference image.
S43. Calculate the initial position and initial orientation of the image acquisition unit according to the preset benchmark image containing the first reference object, the first reference image, the position information of the first reference object, and the angle information at the time the benchmark image was captured.
The processor determines the initial position and initial orientation of the image acquisition unit from the benchmark image containing the first reference object stored during data acquisition and preprocessing and the first reference image determined in step S42, in combination with the position information of the first reference object and the angle information at the time the benchmark image was captured.
For example, pose estimation techniques can be used: according to the matching relationship between the image feature points of the benchmark image and those of the first reference image (e.g., the match between the top-left corner vertex of the benchmark image and the top-left corner of the first reference image), mismatches are eliminated with the RANSAC (Random Sample Consensus) algorithm, and the current pose of the image acquisition unit, i.e., its position and orientation, is solved with the PnP (Perspective-n-Point) method.
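A minimal sketch of this pose-estimation step with OpenCV is shown below, assuming the 3D world coordinates of the first reference object's feature points were recorded together with the benchmark image; the function name and parameters are illustrative.

```python
import cv2
import numpy as np

def initial_pose(ref_pts_3d, ref_pts_2d, K, dist_coeffs=None):
    """Solve the camera's initial position and orientation from 2D-3D matches.

    ref_pts_3d: Nx3 world coordinates of the first reference object's key points
    ref_pts_2d: Nx2 pixel coordinates of the same points in the first reference image
    K:          3x3 camera intrinsic matrix
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(ref_pts_3d, np.float32),
        np.asarray(ref_pts_2d, np.float32),
        K, dist_coeffs)                      # RANSAC discards mismatched points
    if not ok:
        raise RuntimeError("PnP failed: not enough consistent matches")
    R, _ = cv2.Rodrigues(rvec)               # 3x3 rotation = initial orientation
    cam_pos = (-R.T @ tvec).ravel()          # camera center = initial position
    return R, tvec, cam_pos
```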
At this point, the initialization process is complete.
On this basis, the intelligent interaction method provided by the embodiments of the present invention can be implemented according to the flow shown in Fig. 5, comprising the following steps:
S51. Capture images of the current scene through the image acquisition unit, and present the captured real-time image on the display screen corresponding to the image acquisition unit.
In specific implementation, when the user needs to interact with a controlled device in the current scene, the user captures images of the current scene containing that controlled device with the terminal device, and the captured real-time image is presented on the display screen of the terminal device.
S52. Determine the controlled devices within the real-time imaging range of the image acquisition unit.
Specifically, determining the plurality of controlled devices within the real-time imaging range of the image acquisition unit may mean determining all controlled devices within that range.
S53. Mark the controlled devices at the corresponding positions in the captured real-time image presented on the display screen, and/or display a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or display a corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position of the display screen.
Specifically, when marking, the plurality of identified controlled devices within the real-time imaging range are marked; all identified controlled devices within the real-time imaging range may be marked.
Steps S52 and S53 are described in detail below with reference to the drawings.
For step S52, the embodiments of the present invention provide the following two implementations for determining the controlled devices within the real-time imaging range of the image acquisition unit; in specific implementation, either may be used as needed.
First implementation: image recognition.
That is, according to pre-acquired image information of the controlled devices, identify the controlled devices in the real-time image captured by the image acquisition unit and presented on the display screen of the terminal device.
Second implementation: use the real-time position and real-time orientation of the image acquisition unit, its relevant acquisition parameters, and the three-dimensional information of the controllable regions of the controlled devices in the current scene to calculate the controlled devices in the real-time image captured by the image acquisition unit.
In this implementation, the controlled devices in the real-time image captured by the image acquisition unit can be determined according to the flow shown in Fig. 6a, comprising the following steps:
S521. Determine the real-time position and real-time orientation of the image acquisition unit in the current scene.
S522. Determine the controlled devices within the real-time imaging range of the image acquisition unit according to its real-time position and real-time orientation, its parameter information, and the pre-acquired three-dimensional position information of the controllable regions of the controlled devices contained in the current scene.
Specifically, in step S522, the projection imaging principle in computer vision can be used: p2d = Proj([R|T]·p3d), where p3d is a 3D coordinate in the world coordinate system (relative to the initial position), Proj(·) is the projection function, which can be determined from the parameter information of the image acquisition unit, and [R|T] is the current position and orientation of the camera, R being the orientation and T the position. R contains the information of three degrees of freedom, representing the rotations of the image acquisition unit about the x-, y- and z-axes respectively, and is a 3x3 matrix; T is a 3-dimensional vector (x, y, z), so [R|T] can be expressed as a 3x4 matrix. The output is the coordinate p2d on the 2D screen. On this basis, knowing the eight 3D vertex coordinates of a controlled device is enough to calculate its position on the 2D screen; if a calculated coordinate p2d does not fall within the preset 2D screen coordinate range, the controlled device is not within the real-time imaging range of the image acquisition unit. The three-dimensional structure information of the current scene is not required.
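The projection test just described might be sketched as follows, with a 3x3 intrinsic matrix K standing in for Proj(·), built from the unit's parameters (focal length, component dimensions); the screen dimensions are assumptions.

```python
import numpy as np

def device_on_screen(vertices_3d, R, T, K, width=1280, height=720):
    """Project a device's 3D controllable-region vertices and test visibility.

    Implements p2d = Proj([R|T] p3d): [R|T] is the camera's current pose,
    K (3x3 intrinsics) is the projection determined by the unit's parameters.
    """
    pts = np.asarray(vertices_3d, float)             # 8x3 world coordinates
    cam = R @ pts.T + np.asarray(T).reshape(3, 1)    # into the camera frame
    in_front = cam[2] > 0                            # behind the camera = invisible
    p2d = (K @ cam)[:2] / cam[2]                     # perspective divide -> pixels
    on_screen = (in_front & (p2d[0] >= 0) & (p2d[0] < width)
                          & (p2d[1] >= 0) & (p2d[1] < height))
    return p2d.T, bool(on_screen.any())              # visible if any vertex lands on screen
```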
In step S521, the real-time position and real-time orientation of the image acquisition unit can be determined according to the flow shown in Fig. 6b:
S5211. Determine the initial position and initial orientation of the image acquisition unit in the current scene.
The detailed flow for determining the initial position and initial orientation of the image acquisition unit in the current scene is shown in Fig. 4 and is not repeated here.
S5212. Acquire the displacement information and rotation information of the image acquisition unit.
In specific implementation, the displacement information of the image acquisition unit can be acquired with the accelerometer in the terminal device, and its rotation information with the gyroscope in the terminal device.
The gyroscope acquires the rotation information of the image acquisition unit over time t, and the accelerometer acquires its linear acceleration information over time t, from which the displacement of the image acquisition unit over time t can be estimated.
S5213. Determine the real-time position and real-time orientation of the image acquisition unit according to its initial position, initial orientation, displacement information and rotation information.
In specific implementation, starting from the initial position and initial orientation of the image acquisition unit determined in step S521, the real-time position and real-time orientation of the image acquisition unit can be determined in real time during its movement using a SLAM (Simultaneous Localization and Mapping) algorithm.
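Between such SLAM updates, the gyroscope and accelerometer readings can propagate the pose by dead reckoning, roughly as sketched below; the first-order Euler integration and the gravity convention are simplifying assumptions, and a real system would fuse this with the SLAM estimate to bound drift.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])   # assumed world-frame gravity vector

def propagate_pose(R, position, velocity, gyro, accel, dt):
    """One dead-reckoning step from gyroscope (rad/s) and accelerometer (m/s^2).

    R: 3x3 rotation (orientation); position, velocity: 3-vectors in world frame.
    """
    # Orientation: integrate the angular rate via its skew-symmetric matrix.
    wx, wy, wz = gyro
    omega = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
    R = R @ (np.eye(3) + omega * dt)                 # first-order rotation update
    # Position: rotate specific force to world frame, remove gravity, integrate twice.
    accel_world = R @ np.asarray(accel, float) + GRAVITY
    position = position + velocity * dt + 0.5 * accel_world * dt**2
    velocity = velocity + accel_world * dt
    return R, position, velocity
```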
On this basis, in step S522, the controlled devices within the real-time imaging range of the image acquisition unit can be determined from its real-time position and real-time orientation, its parameter information, the pre-acquired three-dimensional structure information of the current scene, and the three-dimensional position information of the controllable regions of the controlled devices contained in the current scene. That is, according to the embodiments of the present invention, when determining the controlled devices within the real-time imaging range, the three-dimensional structure information of the current scene can be used to calculate the imaging range of the image acquisition unit with respect to the current scene, and then, from the three-dimensional position information of the controlled devices in the current scene, which controlled devices fall within the real-time imaging range can be calculated.
The parameter information of the image acquisition unit may include its focal length, component dimensions, distortion parameters, and so on.
Preferably, in specific implementation, during the movement of the image acquisition unit, its real-time position and real-time orientation can also be determined according to the flow shown in Fig. 7a:
S71. During the movement of the image acquisition unit, extract a real-time image containing the second reference object acquired by the image acquisition unit as a key frame image.
The second reference object can be a high-contrast object contained in an image, and the image acquired by the image acquisition unit that contains the second reference object is taken as the key frame image. The key frame image is later used to determine the real-time position and real-time orientation of the image acquisition unit at the time a second reference image is acquired. The process of determining the real-time position and orientation of the image acquisition unit using the information associated with the key frame image is similar to the process described above using the information associated with the benchmark image, so it is introduced only briefly below.
S72. Store the image information of the key frame image, and the first position information and first orientation information of the image acquisition unit at the time the key frame image was captured.
S73. Receive the second image sequence of the current scene acquired by the image acquisition unit.
The user continues to roam in the current scene; accordingly, during the movement of the image acquisition unit, it captures images of the current scene at a preset frame rate (determined by the terminal device's inherent parameters) to obtain the second image sequence, and sends the acquired second image sequence to the processing unit.
S74. Determine, from the received second image sequence, a second reference image containing the second reference object in the current scene.
After receiving the scene image sequence sent by the image acquisition unit, the processing unit identifies, from the received sequence, the image containing the second reference object as the second reference image.
S75. Determine the real-time position and real-time orientation of the image acquisition unit at the time the second reference image was acquired, according to the image information of the key frame image, the second reference image, the first position information and the first orientation information.
Specifically, the key frame image and the second reference image can be compared, and from the stored image information of the key frame image together with the first position information and first orientation information of the image acquisition unit at the time the key frame image was captured, the real-time position and real-time orientation of the image acquisition unit at the time the second reference image was acquired are determined.
If the user subsequently continues to roam in the scene, the real-time position and real-time orientation of the image acquisition unit during its movement can be determined from the acquired displacement information (acquired with the accelerometer in the terminal device) and rotation information (acquired with the gyroscope in the terminal device), together with the real-time position and real-time orientation of the image acquisition unit at the time the second reference image was acquired.
From the real-time position and real-time orientation of the image acquisition unit, its real-time imaging range can be determined. Specifically, when the position and orientation of the image acquisition unit change, the angle at which it photographs the scene changes, causing the display viewport (FOV, Field of View) to differ. The display viewport refers to the maximum field of view the image acquisition unit can capture. Fig. 7b is a schematic diagram of the display viewport: the camera in the figure is the image acquisition unit of the embodiments of the present invention, and the display viewport includes a horizontal viewport (horizontal FOV) and a vertical viewport (vertical FOV). Only the scene within the display viewport is presented on the display screen corresponding to the image acquisition unit. The display viewport is bounded by a near plane (Near) and a far plane (Far); the scene between the near plane and the far plane can be presented on the display screen corresponding to the image acquisition unit. Therefore, the scene image presented on the display screen differs with the real-time position and real-time orientation of the image acquisition unit, because the imaging range of the image acquisition unit differs. From the real-time position and real-time orientation of the image acquisition unit, its real-time imaging range can be determined, and in turn the controlled devices within the real-time imaging range can be determined.
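The viewport test of Fig. 7b can also be phrased directly in terms of the horizontal/vertical FOV and the near and far planes, as in the sketch below; the default angles and distances are assumptions.

```python
import numpy as np

def in_display_viewport(point_world, R, cam_pos,
                        h_fov_deg=60.0, v_fov_deg=45.0, near=0.1, far=20.0):
    """Test whether a 3D point lies inside the display viewport (cf. Fig. 7b).

    R, cam_pos: the image acquisition unit's real-time orientation and position.
    """
    p = R @ (np.asarray(point_world, float) - np.asarray(cam_pos, float))
    x, y, z = p                                   # camera frame: z points forward
    if not (near <= z <= far):                    # must be between near and far planes
        return False
    h_ok = abs(np.degrees(np.arctan2(x, z))) <= h_fov_deg / 2   # horizontal viewport
    v_ok = abs(np.degrees(np.arctan2(y, z))) <= v_fov_deg / 2   # vertical viewport
    return h_ok and v_ok
```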
On this basis, in step S53, according to the determined controlled devices within the real-time imaging range of the image acquisition unit, marking is performed in the real-time image presented on the display screen according to the flow shown in Fig. 8:
S531. According to the pre-acquired three-dimensional position information of the controllable regions of the controlled devices, convert the three-dimensional position information of the controllable regions of the controlled devices located within the real-time imaging range into two-dimensional position information in the captured real-time image presented on the display screen.
Similarly to step S522 above, the projection imaging principle in computer vision can be used in step S531: p2d = Proj([R|T]·p3d), where p3d is a 3D coordinate in the world coordinate system, Proj(·) is the projection function, [R|T] is the current position and orientation of the camera, and the output coordinate on the 2D screen is p2d. Accordingly, once the eight 3D vertex coordinates of a controlled device are determined, its position on the 2D screen can be calculated.
S532. According to the two-dimensional position information, in the captured real-time image presented on the display screen, of the determined controlled devices located within the real-time imaging range, mark the corresponding controlled devices at the corresponding positions in that real-time image.
Fig. 9 shows the real-time image displayed on the display screen after the corresponding controlled devices have been marked.
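For illustration, the marking of Fig. 9 could be rendered by outlining each device's projected vertices on the current frame, e.g. with OpenCV as below; the colors and the label text are illustrative choices.

```python
import cv2
import numpy as np

def draw_device_marks(frame, devices_2d):
    """Overlay marks for controlled devices on the real-time image (cf. Fig. 9).

    devices_2d: list of (device_id, Nx2 projected vertex pixels) from step S531.
    """
    for device_id, pts in devices_2d:
        pts = np.asarray(pts, np.int32)
        x, y, w, h = cv2.boundingRect(pts)                  # screen-space extent
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, device_id, (x, y - 6),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return frame
```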
Optionally, after the corresponding controlled devices are marked, for each controlled device located within the real-time imaging range, the control operation menu corresponding to that device can be determined, and a correspondingly hidden control operation menu can be added for the marked controlled device. Specifically, the control operation menu corresponding to a controlled device can be determined from its stored description information.
More preferably, in response to a touch operation on a controlled device in the captured real-time image presented on the display screen, e.g. a touch operation within the marked range, the control operation menu corresponding to that controlled device is displayed. Alternatively, if it is detected that any position within a marked range on the captured real-time image presented on the display screen is clicked, the corresponding control operation menu is displayed; Fig. 10 shows the control operation menu of a controlled device displayed in response to a touch operation on that controlled device in the captured real-time image presented on the display screen. In addition, the following way of triggering display of the control operation menu can be configured: a mark is made at a predetermined position on the display screen, e.g. a specific mark in the middle of the screen; as the user adjusts the phone's orientation, a controlled device shown in the real-time image moves its display position within the display screen, and when the controlled device reaches the specific mark in the middle of the screen, the corresponding control operation menu is displayed.
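A touch within a marked range can be dispatched to the corresponding device's menu with a simple hit test, roughly as follows; show_menu is a placeholder for the UI callback that actually displays the control operation menu.

```python
import cv2
import numpy as np

def on_touch(touch_x, touch_y, devices_2d, show_menu):
    """Open the control operation menu of the device whose marked range was tapped.

    devices_2d: the same (device_id, projected vertex pixels) list used for marking;
    show_menu:  UI callback (placeholder) that displays the device's menu.
    """
    for device_id, pts in devices_2d:
        hull = cv2.convexHull(np.asarray(pts, np.float32))
        # >= 0 means the tap falls inside or on the boundary of the marked range.
        if cv2.pointPolygonTest(hull, (float(touch_x), float(touch_y)), False) >= 0:
            show_menu(device_id)     # e.g. on/off and brightness for a lamp
            return device_id
    return None                      # tap missed every marked controlled device
```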
Preferably, in specific implementation, the control operation menu corresponding to a controlled device can be displayed in augmented reality (AR) fashion.
Since the controlled devices can be shown intuitively on the display screen, the user can judge for himself which objects are controlled devices. Therefore, in specific implementation, the controlled devices may also be left unmarked, with the control operation menu of a controlled device displayed only when the user performs a touch operation at that device. It is also possible only to mark the controlled devices, or only to make a mark at a predetermined position on the display screen so that, during the movement of the image acquisition unit, when a controlled device is shown at that predetermined position, its corresponding operation control menu is displayed automatically.
It should be noted that the terminal device involved in the embodiments of the present invention may be an ordinary mobile phone, tablet computer, etc., or a wearable device such as augmented reality (AR) glasses.
With the smart home device interaction mode provided by the embodiments of the present invention, the real-time image of the current scene can be displayed intuitively on the display screen of the terminal device, and the controlled devices contained in it can be marked, making them easy for the user to identify. In this way, the user can locate the corresponding controlled device directly on the terminal screen (the display screen of a mobile phone, tablet computer, etc.), or directly by gesture (as applied in wearable devices such as AR glasses). In addition, in response to the user's touch operation on the corresponding controlled device, interaction between the user and the controlled device is achieved (reading the controlled device's feedback information, performing control operations, etc.).
Exemplary apparatus
Having described the method of the exemplary embodiments of the present invention, the smart home device interaction apparatus of the exemplary embodiments of the present invention is described next with reference to Fig. 11.
As shown in Fig. 11, the smart home interaction apparatus provided by the embodiments of the present invention may include:
an image acquisition unit 111, configured to capture images of the current scene, which contains a plurality of controlled devices;
a first display unit 112, configured to present the real-time image captured by the image acquisition unit;
a first determination unit 113, configured to determine the controlled devices within the real-time imaging range of the image acquisition unit 111;
a marking unit 114, configured to mark the controlled devices at the corresponding positions in the captured real-time image presented on the display screen; and/or
a second display unit 115, configured to display a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or to display a corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position of the display screen.
The first determination unit 113 may include:
a first determination subunit 1131, configured to determine the real-time position and real-time orientation of the image acquisition unit in the current scene;
a second determination subunit 1132, configured to determine the controlled devices within the real-time imaging range of the image acquisition unit according to its real-time position and real-time orientation, its parameter information, and the pre-acquired three-dimensional position information of the controllable regions of the controlled devices contained in the scene.
In specific implementation, the second determination subunit 1132 is specifically configured to determine the real-time imaging range of the image acquisition unit and the controlled devices within it according to the real-time position and real-time orientation, the parameter information of the image acquisition unit, the pre-acquired three-dimensional structure information of the current scene, and the three-dimensional position information of the controllable regions of the controlled devices contained in the scene.
In specific implementation, the first determination subunit 1131 includes:
a first determination module, configured to determine the initial position and initial orientation of the image acquisition unit;
an acquisition module, configured to acquire the displacement information and rotation information of the image acquisition unit;
a second determination module, configured to determine the real-time position and real-time orientation of the image acquisition unit according to the initial position, initial orientation, displacement information and rotation information.
The first determination module includes:
a first receiving submodule, configured to receive the first image sequence of the current scene acquired by the image acquisition unit;
a first determination submodule, configured to determine, from the first image sequence, a first reference image containing the first reference object in the current scene;
a first calculation submodule, configured to calculate the initial position and initial orientation of the image acquisition unit according to the preset benchmark image containing the first reference object, the first reference image, the position information of the first reference object, and the angle information at the time the benchmark image was captured.
Optionally, the first determination module further includes an extraction submodule, a storage submodule, a second receiving submodule, a second determination submodule and a second calculation submodule, wherein:
the extraction submodule is configured to extract, during the movement of the image acquisition unit, a real-time image containing the second reference object acquired by the image acquisition unit as a key frame image;
the storage submodule is configured to store the image information of the key frame image, and the first position information and first orientation information of the image acquisition unit at the time the key frame image was captured;
the second receiving submodule is configured to receive the second image sequence of the current scene acquired by the image acquisition unit;
the second determination submodule is configured to determine, from the second image sequence, a second reference image containing the second reference object in the current scene;
the second calculation submodule is specifically configured to determine the real-time position and real-time orientation of the image acquisition unit at the time the second reference image was acquired, according to the image information of the key frame image, the second reference image, the first position information and the first orientation information.
Preferably, the first determination unit 113 may further include a recognition subunit 1133, configured to identify, according to pre-acquired image information of the controlled devices, the controlled devices in the captured real-time image presented on the display screen.
In specific implementation, the marking unit 114 includes:
a conversion subunit 1141, configured to convert, according to the pre-acquired three-dimensional position information of the controllable regions of the controlled devices, the three-dimensional position information of the controllable regions of the controlled devices located within the real-time imaging range into two-dimensional position information in the captured real-time image presented on the display screen;
a marking subunit 1142, configured to mark the controlled devices at the corresponding positions in the captured real-time image presented on the display screen according to the two-dimensional position information.
Optionally, the smart home device interaction apparatus provided by the embodiments of the present invention may further include:
an obtaining unit 116, configured to obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene;
an import unit 117, configured to import the three-dimensional structure information of the scene and the three-dimensional position information of each device into a preset editing tool;
a selection unit 118, configured to select, in the editing tool, the controlled devices from the devices and the controllable region of each controlled device;
a storage unit 119, configured to store the three-dimensional position information of the controllable regions of the controlled devices.
Wherein, the obtaining unit 116, comprising:
Receiving subelement 1161, for receiving the image for being directed to the scene capture;Or receive the described of user's offer The three dimensional local information of each equipment in the three-dimensional structure information and the scene of scene;
Subelement 1162 is rebuild, the image progress three-dimensional reconstruction for receiving according to the receiving subelement obtains described The three dimensional local information of each equipment in the three-dimensional structure information and the scene of scene.
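The offline configuration path (obtain, import, select, store) reduces to keeping one controllable-region record per controlled device. A minimal sketch of what the storage unit might persist; the JSON layout and field names are chosen for illustration only:

import json
from dataclasses import dataclass, asdict

@dataclass
class ControlledDevice:
    name: str              # e.g. "living-room lamp"
    position: tuple        # (x, y, z) from the scene's 3D structure
    zone_corners: list     # 3D corners of the controllable region picked
                           # in the editing tool

def store_zones(devices, path="controlled_zones.json"):
    """One controllable-region record per controlled device selected
    in the editing tool."""
    with open(path, "w") as f:
        json.dump([asdict(d) for d in devices], f, indent=2)

store_zones([ControlledDevice("lamp", (1.2, 0.4, 2.0),
                              [(1.0, 0.2, 1.8), (1.4, 0.6, 2.2)])])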
Optionally, the home device interaction apparatus provided in an embodiment of the present invention may further include:
An adding unit 1120, for determining, after the marking unit marks the controlled devices, the control operation menu corresponding to each controlled device located within the real-time imaging range, and adding a corresponding hidden control operation menu for each marked controlled device.
In specific implementation, the second display unit 115 is further configured to display the corresponding control operation menu if it is detected that any position within a marked range on the collected real-time image presented on the display screen is clicked. Preferably, the second display unit 115 may display the corresponding control operation menu in an augmented reality (AR) manner.
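A sketch of that click handling, assuming each marked controlled device carries the screen-space rectangle produced by the conversion subunit together with its hidden menu; the Menu class and rectangle layout are illustrative assumptions:

class Menu:
    """Stand-in for a hidden control operation menu attached to a device."""
    def __init__(self, actions):
        self.actions = actions
    def show(self):                       # e.g. rendered as an AR overlay
        print("menu:", ", ".join(self.actions))

def on_tap(x, y, marked_devices):
    """marked_devices: list of (screen_box, menu) pairs, where screen_box is
    the (x0, y0, x1, y1) rectangle from the conversion step."""
    for (x0, y0, x1, y1), menu in marked_devices:
        if x0 <= x <= x1 and y0 <= y <= y1:
            menu.show()                   # tap inside a marked range
            return menu
    return None                           # tap outside every marked range

on_tap(30, 30, [((10, 10, 60, 60), Menu(["on", "off", "dim"]))])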
The smart home interaction apparatus provided in an embodiment of the present invention may be arranged in a terminal device (such as the aforementioned mobile phone, tablet computer, or a wearable device such as AR glasses).
Having described the smart home device interaction method and apparatus provided by the present invention, a smart home device interaction apparatus according to another exemplary embodiment of the present invention is introduced next.
Persons of ordinary skill in the art will understand that various aspects of the present invention may be implemented as a system, a method or a program product. Accordingly, various aspects of the present invention may take the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may collectively be referred to herein as a "circuit", a "module" or a "system".
In some possible embodiments, the smart home interaction apparatus according to the present invention may include at least one processing unit and at least one storage unit, wherein the storage unit stores program code which, when executed by the processing unit, causes the processing unit to perform the steps of the smart home device interaction methods of the various exemplary embodiments of the present invention described in the "Exemplary Methods" section of this specification. For example, the processing unit may perform step S51 shown in Figure 5: performing image acquisition on the current scene through an image acquisition unit, and presenting the collected real-time image on a display screen corresponding to the image acquisition unit; step S52: determining the controlled devices within the real-time imaging range of the image acquisition unit; and step S53: marking the controlled device at the corresponding position in the collected real-time image presented on the display screen, and/or displaying the corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control operation menu in response to a controlled device in the real-time image being displayed at a preset position on the display screen.
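Read as a loop, steps S51-S53 might be wired together as follows; this runnable sketch abstracts the hardware and rendering behind callables, none of which are names taken from the patent:

def run_cycle(grab_frame, devices_in_range, mark, poll_tap, show_menu):
    frame = grab_frame()                 # S51: acquire the current scene
    visible = devices_in_range(frame)    # S52: controlled devices in range
    for dev in visible:
        mark(frame, dev)                 # S53a: mark on the live image
    tap = poll_tap()
    if tap is None:
        return
    for dev in visible:                  # S53b: touch inside a marked box
        x0, y0, x1, y1 = dev["box"]
        if x0 <= tap[0] <= x1 and y0 <= tap[1] <= y1:
            show_menu(dev["menu"])       # reveal the hidden control menu

# Dummy wiring, enough to execute one cycle:
run_cycle(
    grab_frame=lambda: "frame-0",
    devices_in_range=lambda f: [{"box": (10, 10, 60, 60),
                                 "menu": ["on", "off"]}],
    mark=lambda f, d: print("mark", d["box"], "on", f),
    poll_tap=lambda: (30, 30),
    show_menu=lambda m: print("menu:", m),
)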
It should be noted that although several units or subunits of the apparatus are mentioned in the above detailed description, this division is merely exemplary and not mandatory. Indeed, according to embodiments of the present invention, the features and functions of two or more units described above may be embodied in a single unit; conversely, the features and functions of a single unit described above may be further divided among, and embodied by, multiple units.
In addition, although the operations of the method of the present invention are depicted in a particular order in the accompanying drawings, this neither requires nor implies that the operations must be performed in that particular order, or that all of the depicted operations must be performed to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
A smart home device interaction apparatus 120 according to this embodiment of the present invention is described below with reference to Figure 12. The smart home device interaction apparatus shown in Figure 12 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in Figure 12, the smart home device interaction apparatus 120 may take the form of a general-purpose computing device. Its components may include, but are not limited to: the at least one processing unit 121 described above, the at least one storage unit 122 described above, and a bus 123 connecting the different system components (including the storage unit 122 and the processing unit 121).
The bus 123 represents one or more of several classes of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus structures.
The storage unit 122 may include readable media in the form of volatile memory, such as random access memory (RAM) 1221 and/or cache memory 1222, and may further include read-only memory (ROM) 1223.
The storage unit 122 may also include a program/utility 1225 having a set of (at least one) program modules 1224. Such program modules 1224 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination of them, may include an implementation of a network environment.
The smart home device interaction apparatus 120 may also communicate with one or more external devices 124 (such as a keyboard or a pointing device), with one or more devices that enable a user to interact with the smart home device interaction apparatus 120, and/or with any device (such as a router or a modem) that enables the smart home device interaction apparatus 120 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 125. Moreover, the smart home device interaction apparatus 120 may communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through a network adapter 126. As shown, the network adapter 126 communicates with the other modules of the smart home device interaction apparatus 120 through the bus 123. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the smart home device interaction apparatus 120, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
Exemplary Program Product
In some possible embodiments, the various aspects of the smart home device interaction method provided by the present invention may also be implemented in the form of a program product comprising program code. When the program product runs on a terminal device, the program code causes the terminal device to perform the steps of the smart home device interaction methods of the various exemplary embodiments of the present invention described in the "Exemplary Methods" section of this specification. For example, the terminal device may perform step S51 shown in Figure 5: performing image acquisition on the current scene through an image acquisition unit, and presenting the collected real-time image on a display screen corresponding to the image acquisition unit; step S52: determining the controlled devices within the real-time imaging range of the image acquisition unit; and step S53: marking the controlled device at the corresponding position in the collected real-time image presented on the display screen, and/or displaying the corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control operation menu in response to a controlled device in the real-time image being displayed at a preset position on the display screen.
The program product may employ any combination of one or more readable media. A readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any suitable combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
As shown in Figure 13, a program product 130 for smart home device interaction according to an embodiment of the present invention is described. It may employ a portable compact disc read-only memory (CD-ROM), include program code, and be capable of running on a terminal device. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that contains or stores a program, which program may be used by or in connection with an instruction execution system, apparatus or device.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying readable program code. Such a propagated data signal may take a variety of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium, which readable medium can send, propagate or transmit a program for use by or in connection with an instruction execution system, apparatus or device.
The program code contained on a readable medium may be transmitted over any suitable medium, including, but not limited to, wireless, wired, optical cable, RF, and the like, or any suitable combination of the above.
The program code for performing the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. Where a remote computing device is involved, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet by using an Internet service provider).
In addition, although the operations of the method of the present invention are depicted in a particular order in the accompanying drawings, this neither requires nor implies that the operations must be performed in that particular order, or that all of the depicted operations must be performed to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Although the spirit and principles of the present invention have been described with reference to several specific embodiments, it should be understood that the present invention is not limited to the specific embodiments disclosed, nor does the division into aspects mean that the features in those aspects cannot be combined to advantage; this division is merely for convenience of expression. The present invention is intended to cover the various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (26)

1. A smart home device interaction method, comprising:
performing image acquisition on a current scene through an image acquisition unit, and presenting the collected real-time image on a display screen corresponding to the image acquisition unit, the current scene containing multiple controlled devices;
determining the controlled devices within the real-time imaging range of the image acquisition unit;
marking the controlled device at the corresponding position in the collected real-time image presented on the display screen, and/or displaying the corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control operation menu in response to a controlled device in the real-time image being displayed at a preset position on the display screen.
2. The method according to claim 1, wherein determining the controlled devices within the real-time imaging range of the image acquisition unit comprises:
determining the real-time position and real-time orientation of the image acquisition unit in the current scene;
determining the controlled devices within the real-time imaging range of the image acquisition unit according to the real-time position, the real-time orientation, parameter information of the image acquisition unit, and the controllable-region three-dimensional position information, gathered in advance, of the controlled devices contained in the scene.
3. The method according to claim 2, wherein determining the controlled devices within the real-time imaging range of the image acquisition unit according to the real-time position, the real-time orientation, the parameter information of the image acquisition unit, and the controllable-region three-dimensional position information of the controlled devices contained in the scene gathered in advance comprises:
determining the real-time imaging range of the image acquisition unit and the controlled devices within that range according to the real-time position, the real-time orientation, the parameter information of the image acquisition unit, the three-dimensional structure information of the current scene gathered in advance, and the controllable-region three-dimensional position information of the controlled devices contained in the current scene.
4. The method according to claim 2 or 3, wherein determining the real-time position and real-time orientation of the image acquisition unit in the current scene comprises:
determining the initial position and initial orientation of the image acquisition unit;
collecting displacement information and rotation information of the image acquisition unit;
determining the real-time position and real-time orientation of the image acquisition unit according to the initial position, the initial orientation, the displacement information and the rotation information.
5. The method according to claim 4, wherein determining the initial position and initial orientation of the image acquisition unit comprises:
receiving a first image sequence of the current scene acquired by the image acquisition unit;
determining, from the first image sequence, a first reference image containing a first reference object in the current scene;
calculating the initial position and initial orientation of the image acquisition unit according to a preset benchmark image containing the first reference object, the first reference image, the position information of the first reference object, and the angle information at the time the benchmark image was captured.
6. The method according to claim 5, further comprising:
during movement of the image acquisition unit, extracting a real-time image containing a second reference object acquired by the image acquisition unit as a key frame image; and
storing the image information of the key frame image together with the first position information and first orientation information of the image acquisition unit at the time the key frame image was captured; and
the method further comprising:
receiving a second image sequence of the current scene acquired by the image acquisition unit;
determining, from the second image sequence, a second reference image containing the second reference object in the current scene;
determining the real-time position and real-time orientation of the image acquisition unit when the second reference image is collected, according to the image information of the key frame image, the second reference image, the first position information and the first orientation information.
7. The method according to claim 1, wherein determining the controlled devices within the real-time imaging range of the image acquisition unit specifically comprises:
identifying the controlled devices in the collected real-time image presented on the display screen according to image information of the controlled devices gathered in advance.
8. The method according to claim 3, wherein marking the controlled device at the corresponding position in the collected real-time image presented on the display screen comprises:
converting, according to the controllable-region three-dimensional position information of the controlled devices gathered in advance, the controllable-region three-dimensional position information of the controlled devices located within the real-time imaging range into two-dimensional position information in the collected real-time image presented on the display screen;
marking the controlled device at the corresponding position in the collected real-time image presented on the display screen according to the two-dimensional position information.
9. The method according to claim 2 or 8, wherein the controllable-region three-dimensional position information of the controlled device is gathered according to the following procedure:
obtaining the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene;
importing the three-dimensional structure information of the scene and the three-dimensional position information of each device into a preset editing tool;
selecting, in the editing tool, the controlled device and the controllable region of the controlled device from among the devices;
storing the controllable-region three-dimensional position information of the controlled device.
10. The method according to claim 9, wherein obtaining the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene comprises:
receiving images captured of the scene and performing three-dimensional reconstruction according to the received images to obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene; or receiving the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene provided by the user.
11. The method according to claim 1, further comprising, after marking the controlled devices:
for each controlled device located within the real-time imaging range, determining the control operation menu corresponding to that controlled device, and adding a corresponding hidden control operation menu for the marked controlled device.
12. The method according to claim 11, wherein displaying the corresponding control operation menu in response to a touch operation on the controlled device in the real-time image comprises:
displaying the corresponding control operation menu if it is detected that any position within a marked range on the collected real-time image presented on the display screen is clicked.
13. The method according to any one of claims 1-3, 5-8 and 11-12, wherein displaying the corresponding control menu comprises:
displaying the corresponding control operation menu in an augmented reality (AR) manner.
14. A smart home device interaction apparatus, comprising:
an image acquisition unit, for performing image acquisition on a current scene, the current scene containing multiple controlled devices;
a first display unit, for presenting the real-time image collected by the image acquisition unit;
a first determination unit, for determining the controlled devices within the real-time imaging range of the image acquisition unit;
a marking unit, for marking the controlled device at the corresponding position in the collected real-time image presented on the display screen; and/or
a second display unit, for displaying the corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control operation menu in response to a controlled device in the real-time image being displayed at a preset position on the display screen.
15. The apparatus according to claim 14, wherein the first determination unit comprises:
a first determining subunit, for determining the real-time position and real-time orientation of the image acquisition unit in the current scene;
a second determining subunit, for determining the controlled devices within the real-time imaging range of the image acquisition unit according to the real-time position, the real-time orientation, parameter information of the image acquisition unit, and the controllable-region three-dimensional position information, gathered in advance, of the controlled devices contained in the scene.
16. The apparatus according to claim 15, wherein
the second determining subunit is specifically configured to determine the real-time imaging range of the image acquisition unit and the controlled devices within that range according to the real-time position and real-time orientation, the parameter information of the image acquisition unit, the three-dimensional structure information of the current scene gathered in advance, and the controllable-region three-dimensional position information of the controlled devices contained in the scene.
17. The apparatus according to claim 15 or 16, wherein the first determining subunit comprises:
a first determining module, for determining the initial position and initial orientation of the image acquisition unit;
an acquisition module, for collecting displacement information and rotation information of the image acquisition unit;
a second determining module, for determining the real-time position and real-time orientation of the image acquisition unit according to the initial position, the initial orientation, the displacement information and the rotation information.
18. The apparatus according to claim 17, wherein the first determining module comprises:
a first receiving submodule, for receiving a first image sequence of the current scene acquired by the image acquisition unit;
a first determining submodule, for determining, from the first image sequence, a first reference image containing a first reference object in the current scene;
a first computing submodule, for calculating the initial position and initial orientation of the image acquisition unit according to a preset benchmark image containing the first reference object, the first reference image, the position information of the first reference object, and the angle information at the time the benchmark image was captured.
19. The apparatus according to claim 18, wherein the first determining module further comprises an extracting submodule, a storing submodule, a second receiving submodule, a second determining submodule and a second computing submodule, in which:
the extracting submodule is for extracting, during movement of the image acquisition unit, a real-time image containing a second reference object acquired by the image acquisition unit as a key frame image;
the storing submodule is for storing the image information of the key frame image together with the first position information and first orientation information of the image acquisition unit at the time the key frame image was captured;
the second receiving submodule is for receiving a second image sequence of the current scene acquired by the image acquisition unit;
the second determining submodule is for determining, from the second image sequence, a second reference image containing the second reference object in the current scene;
the second computing submodule is specifically for determining the real-time position and real-time orientation of the image acquisition unit when the second reference image is collected, according to the image information of the key frame image, the second reference image, the first position information and the first orientation information.
20. The apparatus according to claim 14, wherein the first determination unit comprises:
an identification subunit, for identifying the controlled devices in the collected real-time image presented on the display screen according to image information of the controlled devices gathered in advance.
21. The apparatus according to claim 16, wherein the marking unit comprises:
a conversion subunit, for converting, according to the controllable-region three-dimensional position information of the controlled devices gathered in advance, the controllable-region three-dimensional position information of the controlled devices located within the real-time imaging range into two-dimensional position information in the collected real-time image presented on the display screen;
a marking subunit, for marking the controlled device at the corresponding position in the collected real-time image presented on the display screen according to the two-dimensional position information.
22. The apparatus according to claim 15 or 21, further comprising:
an obtaining unit, for obtaining the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene;
an import unit, for importing the three-dimensional structure information of the scene and the three-dimensional position information of each device into a preset editing tool;
a selecting unit, for selecting, in the editing tool, the controlled device and the controllable region of the controlled device from among the devices;
a storage unit, for storing the controllable-region three-dimensional position information of the controlled device.
23. The apparatus according to claim 22, wherein the obtaining unit comprises:
a receiving subunit, for receiving images captured of the scene, or for receiving the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene provided by the user;
a reconstruction subunit, for performing three-dimensional reconstruction according to the images received by the receiving subunit to obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene.
24. The apparatus according to claim 14, further comprising:
an adding unit, for determining, after the marking unit marks the controlled devices, the control operation menu corresponding to each controlled device located within the real-time imaging range, and adding a corresponding hidden control operation menu for each marked controlled device.
25. The apparatus according to claim 24, wherein
the second display unit is further configured to display the corresponding control operation menu if it is detected that any position within a marked range on the collected real-time image presented on the display screen is clicked.
26. The apparatus according to any one of claims 14-16, 18-21 and 24-25, wherein
the second display unit is specifically configured to display the corresponding control operation menu in an augmented reality (AR) manner.
CN201610130884.1A 2016-03-08 2016-03-08 A kind of smart home device exchange method and device Active CN105760106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610130884.1A CN105760106B (en) 2016-03-08 2016-03-08 A kind of smart home device exchange method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610130884.1A CN105760106B (en) 2016-03-08 2016-03-08 A kind of smart home device exchange method and device

Publications (2)

Publication Number Publication Date
CN105760106A CN105760106A (en) 2016-07-13
CN105760106B true CN105760106B (en) 2019-01-15

Family

ID=56332736

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610130884.1A Active CN105760106B (en) 2016-03-08 2016-03-08 A kind of smart home device exchange method and device

Country Status (1)

Country Link
CN (1) CN105760106B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018045076A1 (en) * 2016-08-30 2018-03-08 C3D Augmented Reality Solutions Ltd Systems and methods for simultaneous localization and mapping
CN106406805B (en) * 2016-09-13 2019-05-14 Oppo广东移动通信有限公司 A kind of method and mobile terminal of speaker grouping
CN106445298A (en) * 2016-09-27 2017-02-22 三星电子(中国)研发中心 Visual operation method and device for internet-of-things device
US11119722B2 (en) * 2016-11-08 2021-09-14 Sharp Kabushiki Kaisha Movable body control apparatus and recording medium
CN106502118A (en) * 2016-12-21 2017-03-15 惠州Tcl移动通信有限公司 A kind of intelligent home furnishing control method and system based on AR photographic head
CN107065522B (en) * 2016-12-29 2020-06-16 冶金自动化研究设计院 Fuzzy slow state feedback H-infinity control method for nonlinear switching double-time scale system
CN107797661A (en) * 2017-09-14 2018-03-13 珠海格力电器股份有限公司 Device control device and method and mobile terminal
CN109727321A (en) * 2017-10-31 2019-05-07 中兴通讯股份有限公司 Settlement method and device
CN108548300B (en) * 2018-03-02 2020-01-07 珠海格力电器股份有限公司 Air supply method and device of air conditioner and electronic equipment
WO2019178863A1 (en) * 2018-03-23 2019-09-26 深圳市大疆创新科技有限公司 Control method, control device, control system, and computer readable storage medium
KR102524586B1 (en) * 2018-04-30 2023-04-21 삼성전자주식회사 Image display device and operating method for the same
CN111314398A (en) * 2018-12-11 2020-06-19 阿里巴巴集团控股有限公司 Equipment control method, network distribution method, system and equipment
CN109491263B (en) * 2018-12-13 2022-06-03 深圳绿米联创科技有限公司 Intelligent household equipment control method, device and system and storage medium
CN109814822B (en) * 2018-12-24 2021-08-24 维沃移动通信有限公司 Multimedia play control method and device and terminal equipment
CN110135238B (en) * 2019-03-26 2021-04-06 浙江工业大学 Markless Internet of things equipment identification method based on mobile AR
CN112051919B (en) * 2019-06-05 2022-10-18 北京外号信息技术有限公司 Interaction method and interaction system based on position
CN110297472A (en) * 2019-06-28 2019-10-01 上海商汤智能科技有限公司 Apparatus control method, terminal, controlled plant, electronic equipment and storage medium
CN111045344A (en) * 2019-12-31 2020-04-21 维沃移动通信有限公司 Control method of household equipment and electronic equipment
WO2021227068A1 (en) * 2020-05-15 2021-11-18 北京小米移动软件有限公司 Map acquiring method and device for internet of things device
CN111787081B (en) * 2020-06-21 2021-03-23 江苏永鼎通信有限公司 Information processing method based on Internet of things interaction and intelligent communication and cloud computing platform
CN112346630B (en) * 2020-10-27 2022-09-27 北京有竹居网络技术有限公司 State determination method, device, equipment and computer readable medium
CN112817547A (en) * 2021-01-22 2021-05-18 北京小米移动软件有限公司 Display method and device, and storage medium
CN113033311A (en) * 2021-02-25 2021-06-25 福建氢启健康科技有限公司 Equipment control method based on video stream

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103135754A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Interactive device and method for interaction achievement with interactive device
KR101358064B1 (en) * 2012-10-29 2014-02-11 한국과학기술원 Method for remote controlling using user image and system of the same
CN104461318A (en) * 2013-12-10 2015-03-25 苏州梦想人软件科技有限公司 Touch read method and system based on augmented reality technology
CN104851134A (en) * 2015-05-18 2015-08-19 天机数码创新技术有限公司 Augmented reality system with virtual trigger and real object trigger in combination and method thereof
CN105096180A (en) * 2015-07-20 2015-11-25 北京易讯理想科技有限公司 Commodity information display method and apparatus based augmented reality
CN205028239U (en) * 2014-12-10 2016-02-10 杭州凌手科技有限公司 Interactive all -in -one of virtual reality intelligence projection gesture

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103135754A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Interactive device and method for interaction achievement with interactive device
KR101358064B1 (en) * 2012-10-29 2014-02-11 한국과학기술원 Method for remote controlling using user image and system of the same
CN104461318A (en) * 2013-12-10 2015-03-25 苏州梦想人软件科技有限公司 Touch read method and system based on augmented reality technology
CN205028239U (en) * 2014-12-10 2016-02-10 杭州凌手科技有限公司 Interactive all -in -one of virtual reality intelligence projection gesture
CN104851134A (en) * 2015-05-18 2015-08-19 天机数码创新技术有限公司 Augmented reality system with virtual trigger and real object trigger in combination and method thereof
CN105096180A (en) * 2015-07-20 2015-11-25 北京易讯理想科技有限公司 Commodity information display method and apparatus based augmented reality

Also Published As

Publication number Publication date
CN105760106A (en) 2016-07-13

Similar Documents

Publication Publication Date Title
CN105760106B (en) A kind of smart home device exchange method and device
US12056837B2 (en) Employing three-dimensional (3D) data predicted from two-dimensional (2D) images using neural networks for 3D modeling applications and other applications
US20190129607A1 (en) Method and device for performing remote control
CN107045844B (en) A kind of landscape guide method based on augmented reality
WO2018100090A1 (en) Virtual sensor configuration
US8537231B2 (en) User interface system based on pointing device
US11132845B2 (en) Real-world object recognition for computing device
WO2020069049A1 (en) Employing three-dimensional data predicted from two-dimensional images using neural networks for 3d modeling applications
CN104301661B (en) A kind of smart home monitoring method, client and related device
TW202109450A (en) An image processing method, device and storage medium
CN111026261A (en) Method for AR interactive display of tourist attractions
CN109997364A (en) Method, apparatus and stream providing indication of mapping of omnidirectional images
CN109063799B (en) Positioning method and device of equipment
CN106648098A (en) User-defined scene AR projection method and system
KR20120076175A (en) 3d street view system using identification information
JP2021524119A (en) Information device interaction methods and systems based on optical labels
CN109992111A (en) Augmented reality extended method and electronic equipment
CN117369233B (en) Holographic display method, device, equipment and storage medium
CN111179341B (en) Registration method of augmented reality equipment and mobile robot
CN114935975B (en) Virtual reality multi-user interaction method, electronic device and readable storage medium
KR101036107B1 (en) Emergency notification system using rfid
CN114945090A (en) Video generation method and device, computer readable storage medium and computer equipment
US20240127552A1 (en) Augmented reality method and system enabling commands to control real-world devices
CN112634773B (en) Augmented reality presentation method and device, display equipment and storage medium
WO2020244576A1 (en) Method for superimposing virtual object on the basis of optical communication apparatus, and corresponding electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210518

Address after: Room 102, Block 6, Area C, Qianjiang Century Park, Xiaoshan District, Hangzhou City, Zhejiang Province, 311200

Patentee after: Hangzhou Yixian Advanced Technology Co.,Ltd.

Address before: Floors 4 and 7, Building at No. 599, Network Business Road, Changhe Street, Binjiang District, Hangzhou City, Zhejiang Province, 310052

Patentee before: NETEASE (HANGZHOU) NETWORK Co.,Ltd.

TR01 Transfer of patent right