CN105760106A - Interaction method and interaction device of intelligent household equipment


Info

Publication number
CN105760106A
CN105760106A (application CN201610130884.1A)
Authority
CN
China
Prior art keywords
controlled device
image acquisition
real time
acquisition unit
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610130884.1A
Other languages
Chinese (zh)
Other versions
CN105760106B (en)
Inventor
赵辰
丛林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yixian Advanced Technology Co.,Ltd.
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201610130884.1A
Publication of CN105760106A
Application granted
Publication of CN105760106B
Legal status: Active (granted)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Abstract

An embodiment of the invention provides an interaction method for intelligent household devices. The method comprises the following steps: capturing images of the current scene with an image acquisition unit and presenting the captured real-time image on a display screen corresponding to the image acquisition unit; determining the controlled devices within the real-time imaging range of the image acquisition unit; and marking the controlled devices at their corresponding positions in the real-time image, and/or displaying a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position on the display screen. With the interaction method and apparatus provided by the invention, the user can intuitively locate operable controlled devices because they are marked on the live scene image, achieving control of whatever the user sees; the difficulty of operation is thereby markedly reduced and the user experience improved. An embodiment of the invention also provides an interaction apparatus for intelligent household devices.

Description

Intelligent home device interaction method and apparatus
Technical field
Embodiments of the present invention relate to the field of Internet of Things (IoT) technology and, more specifically, to an intelligent home device interaction method and apparatus.
Background
This section is intended to provide background or context for the embodiments of the invention recited in the claims. Nothing described here is admitted to be prior art merely by its inclusion in this section.
A smart home connects the electrical appliances, lighting elements, and moving components (such as curtains) of a house through an underlying IoT protocol, addresses them with Uniform Resource Identifiers (URIs), and exchanges state following the principles of Representational State Transfer (REST).
As for how users interact with smart home devices: because the number of interactive smart home elements in a home system grows roughly exponentially, the physical interaction mode of traditional standalone home appliances (interacting through switches, remote controls, and the like) can no longer be applied to smart home interaction.
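As a hedged illustration of the URI + REST principle described above (not an API defined by the patent), switching a lamp could look like the sketch below; the host, path, and JSON fields are invented for the example.

```python
import json
import urllib.request

# Each home device is addressed by a URI; state is exchanged RESTfully.
# Host, path, and payload below are assumptions for illustration only.
url = "http://192.168.1.10/devices/living-room/lamp-1"
payload = json.dumps({"power": "on"}).encode("utf-8")

req = urllib.request.Request(url, data=payload, method="PUT",
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode("utf-8"))   # device's new state
```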
Summary of the invention
At present there are two main interaction modes for smart home devices. The first is through an APP (application client) installed on a mobile device: the smart home devices are grouped and displayed as a layered list in the APP interface, for example "living room - lamp - lamp 1". To interact with a given device the user must open the lists level by level to locate it, and must also remember the code of each device (such as the lamp codes above); if the user forgets a code, several attempts may be needed. Under this mode the interface is neither concise nor intuitive, operation steps are added, operation becomes cumbersome, difficulty increases, and the user experience suffers. The second mode is through a natural user interface (NUI): the user interacts by voice, gestures, and similar natural means. Here the registration of devices is the bottleneck of operation. For example, for four pendant lamps placed side by side in a living room, the user must code them differently in advance (lamp 1, lamp 2, and so on), and different devices require different voice commands and gestures; likewise the user must accurately remember each lamp's code and the gesture or voice command defined for it in order to interact. This mode of operation is equally unintuitive and adds to the difficulty of operation.
For this reason, an improved smart home device interaction method is much needed, to reduce the difficulty of user operation and improve the convenience of interacting with home devices.
In this context, embodiments of the present invention are expected to provide an intelligent home device interaction method and apparatus.
In a first aspect of embodiments of the present invention, an intelligent home device interaction method is provided, comprising:
capturing images of the current scene with an image acquisition unit, and presenting the captured real-time image on a display screen corresponding to the image acquisition unit, the current scene containing a plurality of controlled devices;
determining the controlled devices within the real-time imaging range of the image acquisition unit;
marking the controlled devices at their corresponding positions in the real-time image presented on the display screen, and/or displaying a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position on the display screen.
In a second aspect of embodiments of the present invention, an intelligent home device interaction apparatus is provided, comprising:
an image acquisition unit, configured to capture images of the current scene, the current scene containing a plurality of controlled devices;
a first display unit, configured to present the real-time image captured by the image acquisition unit;
a first determining unit, configured to determine the controlled devices within the real-time imaging range of the image acquisition unit;
a marking unit, configured to mark the controlled devices at their corresponding positions in the real-time image presented on the display screen; and/or
a second display unit, configured to display a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or to display the corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position on the display screen.
In a third aspect of embodiments of the present invention, an intelligent home device interaction device is provided, which can include, for example, a memory and a processor, the processor being configured to read the program in the memory and perform the following: capture images of the current scene with an image acquisition unit, and present the captured real-time image on a display screen corresponding to the image acquisition unit, the current scene containing a plurality of controlled devices; determine the controlled devices within the real-time imaging range of the image acquisition unit; mark the controlled devices at their corresponding positions in the real-time image presented on the display screen, and/or display a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or display the corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position on the display screen.
In a fourth aspect of embodiments of the present invention, a program product is provided, which includes program code that, when the program product runs, performs the following procedure: capture images of the current scene with an image acquisition unit, and present the captured real-time image on a display screen corresponding to the image acquisition unit, the current scene containing a plurality of controlled devices; determine the controlled devices within the real-time imaging range of the image acquisition unit; mark the controlled devices at their corresponding positions in the real-time image presented on the display screen, and/or display a corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or display the corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position on the display screen.
With the intelligent home device interaction method and apparatus according to embodiments of the present invention, the controlled devices are marked directly on the scene image collected in real time, so the user can intuitively locate the operable controlled devices in the currently displayed scene image. In addition, in response to the user's touch operation, the corresponding control operation menu can be shown on the live scene image, so that the user can perform control operations on the corresponding controlled device. The user thus gets what-you-see-is-what-you-get control: whatever is seen can be controlled. This markedly reduces the difficulty of user operation, improves the convenience of smart home device interaction, and brings a better experience to the user.
Brief description of the drawings
The above and other objects, features, and advantages of the exemplary embodiments of the present invention will become easier to understand by reading the following detailed description with reference to the accompanying drawings. In the drawings, several embodiments of the present invention are shown by way of example and not limitation, in which:
Fig. 1 schematically shows an application scenario according to an embodiment of the present invention;
Fig. 2 schematically shows the flow of collecting the control-region three-dimensional position information of the controlled devices contained in a scene according to an embodiment of the present invention;
Fig. 3 schematically shows one possible structure of the stored data table according to an embodiment of the present invention;
Fig. 4 schematically shows the flow of initializing smart home device interaction according to an embodiment of the present invention;
Fig. 5 schematically shows the flow of the smart home device interaction method according to an embodiment of the present invention;
Fig. 6a schematically shows the flow of determining the controlled devices in the real-time image captured by the image acquisition unit according to an embodiment of the present invention;
Fig. 6b schematically shows the flow of determining the real-time position and real-time orientation of the image acquisition unit according to an embodiment of the present invention;
Fig. 7a schematically shows another flow of determining the real-time position and real-time orientation of the image acquisition unit according to an embodiment of the present invention;
Fig. 7b schematically shows the display viewport according to an embodiment of the present invention;
Fig. 8 schematically shows the flow of marking in the real-time image presented on the display screen according to an embodiment of the present invention;
Fig. 9 schematically shows the real-time image displayed on the screen after the corresponding controlled devices have been marked, according to an embodiment of the present invention;
Fig. 10 schematically shows the control operation menu of a controlled device displayed in response to a touch operation on that device in the captured real-time image presented on the display screen, according to an embodiment of the present invention;
Fig. 11 schematically shows the smart home device interaction apparatus according to another embodiment of the present invention;
Fig. 12 schematically shows the structure of the smart home interaction apparatus according to a further embodiment of the present invention;
Fig. 13 schematically shows the program product of the smart home interaction apparatus according to yet another embodiment of the present invention.
In the drawings, identical or corresponding reference numerals indicate identical or corresponding parts.
Detailed description of the invention
The principles and spirit of the present invention are described below with reference to several exemplary embodiments. It should be understood that these embodiments are given only to enable those skilled in the art to better understand and then practice the present invention, and not to limit the scope of the present invention in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those skilled in the art.
Those skilled in the art know that embodiments of the present invention can be implemented as a system, an apparatus, a device, a method, or a computer program product. Therefore, the present disclosure can be implemented in the following forms: entirely hardware, entirely software (including firmware, resident software, microcode, etc.), or a combination of hardware and software.
According to embodiments of the present invention, an intelligent home device interaction method and device are proposed.
In this document, any number of elements in the drawings is illustrative rather than limiting, and any naming is used only for distinction, without any limiting meaning.
The principles and spirit of the present invention are explained in detail below with reference to several representative embodiments of the present invention.
Overview of the invention
The inventors found that, in existing smart home device interaction modes, the interactive interface is neither concise nor intuitive, which makes user operation complicated; moreover, the user must remember the codes or operation modes corresponding to different home devices of the same kind, which adds to the difficulty of user operation.
To simplify the user's operation steps and reduce operation difficulty, in the embodiments of the present invention an image acquisition unit on the terminal device, such as a camera, can be used to capture, in real time, images of the scene containing the smart home devices, and the captured real-time image is presented on the display screen of the terminal device. The smart home devices contained in the real-time image (called controlled devices in the embodiments of the present invention) are determined and marked, so that the user can intuitively see the corresponding controlled devices on the scene image presented in real time by the terminal device. In response to the user's touch operation, the control operation menu of the corresponding controlled device can also be displayed, so that the user can operate and control that device. In the home device interaction mode provided by the embodiments of the present invention, the interactive interface is concise and intuitive; what the user sees is what the user gets, i.e. whatever is seen can be controlled; the user's operation steps are simplified; and, in addition, since the user does not have to code different home devices of the same kind or define corresponding operation gestures for them, the difficulty of user operation is reduced.
Having described the basic principles of the present invention, various non-limiting embodiments of the present invention are introduced in detail below.
Application scenarios overview
Referring first to Fig. 1, which is the application scenario schematic of the smart home device interaction method provided by the embodiments of the present invention. It includes several smart home devices, such as an air conditioner 11, a TV 12, and an air humidifier 13. The controlled communication of the smart home devices can follow an IoT protocol, or other communication protocols.
Illustrative methods
The smart home device interaction method according to the exemplary embodiments of the present invention is described below with reference to Figs. 2-9, in conjunction with the application scenario of Fig. 1. It should be noted that the above application scenario is presented only to facilitate understanding of the spirit and principles of the present invention, and the embodiments of the present invention are not limited in this regard; rather, the embodiments of the present invention can be applied to any applicable scenario.
To implement the embodiments of the present invention, data collection and data preprocessing related to the concrete application scenario need to be carried out in advance. The application scenario involved in the embodiments of the present invention can be any house in an ordinary residential block or office building, any room in a given house (for example, the living room or a bedroom), a hotel room, and so on.
In concrete implementation, the following data generally need to be collected: the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene; the control-region three-dimensional position information of the controlled devices contained in the scene; and an arbitrarily chosen region of the scene to serve as the initial position. These are introduced one by one below.
1. The three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene.
In the embodiments of the present invention, the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene can be obtained in either of the following ways:
Way one: three-dimensional reconstruction from captured pictures of the scene.
Specifically, this can be implemented by the following flow: receive images captured of the scene, perform three-dimensional reconstruction on the received images, and obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device.
For example, the structure-from-motion (SfM) technique for recovering three-dimensional scene structure from motion information in computer vision can be adopted. It can recover the corresponding three-dimensional information from two-dimensional images or video sequences, including the motion parameters of the imaging camera and the structure information of the scene. Its input can be a series of two-dimensional images or a video sequence, and its output is the 3D model information of the scene, including the approximate dimensions of the scene and the three-dimensional positions of the vertices of each device in the scene, i.e. the output is a 3D model that also includes the detailed objects inside the scene.
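The embodiments name SfM without fixing an implementation. Purely as a hedged sketch, a minimal two-view reconstruction could be written with OpenCV as follows; the image paths and the intrinsic matrix K are assumptions made for the example.

```python
import cv2
import numpy as np

# Two views of the scene (paths are placeholders).
img1 = cv2.imread("scene_view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene_view2.jpg", cv2.IMREAD_GRAYSCALE)

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed intrinsics

# Match feature points between the two views.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Essential matrix with RANSAC, then the relative camera motion (the SfM core).
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# Triangulate the matched points into 3D structure (up to a global scale).
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T   # Nx3 scene points
```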
Way two: receive the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene as provided by the user.
In this way, the three-dimensional structure map of the scene can generally be provided directly by the owner of the room or the developer of the house, and the corresponding three-dimensional structure information of the scene and three-dimensional position information of each device in the scene are obtained directly from the obtained map.
2. The control-region three-dimensional position information of the controlled devices contained in the scene.
In the embodiments of the present invention, the control-region three-dimensional position information of the controlled devices contained in a scene can be collected by the flow shown in Fig. 2:
S21: obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene.
In concrete implementation, these can be obtained by either of the two ways described above, which is not repeated here.
S22: import the scene's three-dimensional structure information and each device's three-dimensional position information into a preset editing tool.
The preset editing tool provides an interactive interface, through which the user imports the scene's three-dimensional structure information and the three-dimensional position information of each device in the scene.
S23: in the editing tool, the user selects, through the interactive interface, the controlled devices and their control regions from the devices of the scene.
In concrete implementation, the controlled devices and their control regions can be registered through the interactive interface provided by the editing tool, and descriptive information can be added for each controlled device, for example registering a device as "lamp 1 in the living room". Specifically, the user can select some key vertices of a controlled device, such as its eight corner vertices, as the input when registering the device and selecting its control region; the region enclosed by the lines around these key vertices can be understood as the control region of the device.
S24: store the control-region three-dimensional position information of the controlled devices in the scene.
In concrete implementation, after the control-region three-dimensional position information of the controlled devices is determined, a data table can be used to store the related data. When storing, the corresponding scene identifier and the control-region three-dimensional position information of each controlled device contained in that scene must be stored. Fig. 3 shows one possible structure of this data table. The scene identifier distinguishes different scenes and, advantageously, can indicate different residential blocks, building numbers, houses, and rooms; the controlled-device identifier distinguishes different controlled devices.
The three-dimensional structure information of a scene and the control-region three-dimensional position information of a controlled device can each generally be composed of eight vertex coordinates.
It should be noted that, in concrete implementation, the description of a controlled device's control-region three-dimensional position information may differ with the device's shape. If the device has a regular shape, its control region is likely composed of eight vertex coordinates; but if the device is embedded in a wall, for example a wall-mounted smart TV, four vertex coordinates may suffice to describe its control region. Also, depending on the actual scene, only some vertex coordinates may be selected when choosing a device and its control region, the rest being determined by symmetric mapping. If the device has an irregular shape, representative key points and their coordinates can be selected as needed to describe its control region; the region enclosed by the lines around the selected key points then represents the device and its control region. Fig. 3 takes the case where each controlled device is described by eight vertex coordinates as its example.
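As a hedged sketch of one record of the Fig. 3 data table, keyed by scene identifier and controlled-device identifier; the field names and example values below are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ControlRegion:
    scene_id: str        # distinguishes block / building / house / room
    device_id: str       # distinguishes controlled devices within the scene
    description: str     # e.g. "lamp 1 in the living room"
    vertices_3d: list = field(default_factory=list)
    # usually 8 (x, y, z) corners; 4 can suffice for wall-embedded devices

regions = [
    ControlRegion(
        scene_id="block-A/bldg-3/room-2",
        device_id="lamp-1",
        description="living-room lamp 1",
        vertices_3d=[(0.0, 0.0, 2.5), (0.3, 0.0, 2.5), (0.3, 0.3, 2.5),
                     (0.0, 0.3, 2.5), (0.0, 0.0, 2.6), (0.3, 0.0, 2.6),
                     (0.3, 0.3, 2.6), (0.0, 0.3, 2.6)],
    ),
]
```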
3. Choose an arbitrary region in the scene as the initial position for subsequent image recognition.
Preferably, in concrete implementation, a region of the scene containing a high-contrast object (the first reference object in the embodiments of the present invention; "high contrast" means that the object's color and texture differ strongly from the surrounding objects, for example a black object on a white background) can be chosen as the initial position for subsequent image recognition. In concrete implementation, the initial position can be specified by the user. A benchmark image containing the above reference object is shot, and related data are recorded, such as the three-dimensional position information of the first reference object contained in the benchmark image and the angle information at the time the benchmark image was shot. The angle information at the shooting of the benchmark image refers to the angular relation, at that moment, between the camera's orientation toward the first reference object and some reference plane. For a square object shot head-on, its four vertices show no distortion; if there is some angle, the angle information at the shot can be determined from the degree of distortion. The angle information can also simply be known at shooting time and stored as a record.
Based on the data collected above, when the user needs to interact with a controlled device contained in the scene, the method provided by the embodiments of the present invention can be used; it is described in detail below.
In concrete implementation, when the user needs to interact with a controlled device in the current scene, the image acquisition unit on the terminal device can be turned on (for example, this unit can be a camera) and the user roams through the scene. Based on the sequence of scene images collected while roaming, the initialization operation is completed, i.e. the initial position and initial orientation (initial shooting direction) of the image acquisition unit are determined.
The initialization flow is illustrated below in conjunction with Fig. 4 and can comprise the following steps:
S41: receive the first image sequence of the current scene collected by the image acquisition unit.
In concrete implementation, while the user roams through the scene, the image acquisition unit collects images of the current scene at a preset frame rate to obtain the first image sequence, and sends the collected first image sequence to the processing unit.
S42: from the received first image sequence, determine a first reference image containing the first reference object of the current scene.
After receiving the scene image sequence sent by the image acquisition unit, the processing unit identifies among the images those containing the first reference object, using existing common image recognition methods, and takes a scene image containing the first reference object as the first reference image.
S43: from the preset benchmark image containing the first reference object and the first reference image, together with the position information of the first reference object and the angle information at the time the benchmark image was shot, calculate the initial position and initial orientation of the image acquisition unit.
The processor determines the initial position and initial orientation of the image acquisition unit from the benchmark image containing the first reference object stored during data collection and preprocessing and the first reference image determined in step S42, combined with the position information of the first reference object and the angle information at the time the benchmark image was shot.
For example, pose estimation can be adopted: using the matching relation between feature points of the benchmark image and feature points of the first reference image (for example, the top-left vertex of the benchmark image matching the top-left corner of the first reference image), wrong matches are eliminated with the RANSAC (Random Sample Consensus) algorithm, and the current pose of the image acquisition unit, i.e. its position and orientation, is solved by the PnP (Perspective-n-Point) method.
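As a hedged sketch of the RANSAC-plus-PnP step in S43, using OpenCV; it assumes the 3D world coordinates of the benchmark feature points were recorded during data collection, and every name below is illustrative.

```python
import cv2
import numpy as np

def estimate_initial_pose(benchmark_pts3d, benchmark_des, frame, K, dist):
    """Solve the initial pose of the image acquisition unit from one frame.

    benchmark_pts3d: Nx3 world coordinates of the benchmark feature points
    benchmark_des:   their descriptors, stored with the benchmark image
    """
    orb = cv2.ORB_create()
    kp, des = orb.detectAndCompute(frame, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(benchmark_des, des)

    obj = np.float32([benchmark_pts3d[m.queryIdx] for m in matches])
    img = np.float32([kp[m.trainIdx].pt for m in matches])

    # RANSAC eliminates wrong matches; PnP solves the pose from the inliers.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj, img, K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)       # initial orientation as a 3x3 rotation
    position = -R.T @ tvec           # camera position in world coordinates
    return R, position
```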
This completes the initialization flow.
On this basis, the intelligent interaction method provided by the embodiments of the present invention can be implemented by the flow shown in Fig. 5, comprising the following steps:
S51: capture images of the current scene with the image acquisition unit, and present the captured real-time image on the display screen corresponding to the image acquisition unit.
In concrete implementation, when the user needs to interact with a controlled device in the current scene, the terminal device is used to capture images of the current scene containing that device, and the captured real-time image is presented on the terminal's display screen.
S52: determine the controlled devices within the real-time imaging range of the image acquisition unit.
Specifically, multiple controlled devices within the real-time imaging range of the image acquisition unit are determined; all controlled devices within the real-time imaging range can be determined.
S53: mark the controlled devices at their corresponding positions in the real-time image presented on the display screen, and/or display the corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or display the corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position on the display screen.
Specifically, when marking controlled devices, the multiple controlled devices identified within the real-time imaging range are marked; all identified controlled devices within the range can be marked.
Steps S52 and S53 are described in detail below with reference to the drawings.
For step S52, the embodiments of the present invention provide the following two implementations for determining the controlled devices within the real-time imaging range of the image acquisition unit; in concrete implementation, either can be adopted as needed.
First implementation: image recognition.
That is, according to image information of the controlled devices collected in advance, the controlled devices are recognized in the captured real-time image, presented on the terminal's display screen, that the image acquisition unit collects.
Second implementation: use the real-time position and real-time orientation of the image acquisition unit, its relevant acquisition parameters, and the control-region three-dimensional information of the controlled devices in the current scene to compute which controlled devices appear in the real-time image captured by the image acquisition unit.
Under this implementation, the controlled devices in the real-time image captured by the image acquisition unit can be determined by the flow shown in Fig. 6a, comprising the following steps:
S521: determine the real-time position and real-time orientation of the image acquisition unit in the current scene.
S522: from the real-time position and real-time orientation of the image acquisition unit, its parameter information, and the pre-collected control-region three-dimensional position information of the controlled devices contained in the current scene, determine the controlled devices within the real-time imaging range of the image acquisition unit.
Specifically, in step S522 the projection imaging principle from computer vision can be used: p2d = Proj([R|T] p3d), where p3d is a 3D coordinate in the world coordinate system (relative to the initial position); Proj() is the projection function, which can be determined from the parameter information of the image acquisition unit; and [R|T] is the current camera position and orientation, R being the orientation and T the position. R carries the information of 3 degrees of freedom, representing the rotation of the image acquisition unit about the x, y, and z axes respectively, and is a 3x3 matrix; T is a 3-dimensional vector (x, y, z); [R|T] can therefore be expressed as a 3x4 matrix. The output is the coordinate p2d on the 2D screen. On this basis, knowing the eight 3D vertex coordinates of a controlled device suffices to compute its position on the 2D screen; if a computed coordinate p2d does not fall within the preset 2D screen coordinate range, the controlled device is not within the real-time imaging range of the image acquisition unit. Note that the three-dimensional structure information of the current scene is not required here.
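A hedged numpy rendering of p2d = Proj([R|T] p3d) as used in S522: project the eight control-region vertices and test whether they land inside the screen; the function and parameter names are illustrative.

```python
import numpy as np

def project_vertices(p3d, R, T, K, screen_w, screen_h):
    """Project Nx3 world-frame vertices to the 2D screen and test visibility.

    [R|T] maps world points into the camera frame; K, derived from the
    image acquisition unit's parameter information, maps them to pixels.
    """
    cam = (R @ p3d.T + T.reshape(3, 1)).T        # Nx3 points in camera frame
    uvw = (K @ cam.T).T                          # homogeneous image coordinates
    p2d = uvw[:, :2] / uvw[:, 2:3]               # perspective divide
    on_screen = ((p2d[:, 0] >= 0) & (p2d[:, 0] < screen_w) &
                 (p2d[:, 1] >= 0) & (p2d[:, 1] < screen_h) &
                 (cam[:, 2] > 0))                # must be in front of the camera
    return p2d, on_screen

# A controlled device whose 8 projected vertices all fall outside the preset
# screen range is not within the real-time imaging range.
```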
In step S521, the real-time position and real-time orientation of the image acquisition unit can be determined by the flow shown in Fig. 6b:
S5211: determine the initial position and initial orientation of the image acquisition unit in the current scene.
The concrete flow for determining the initial position and initial orientation of the image acquisition unit in the current scene is as shown in Fig. 4 and is not repeated here.
S5212: collect the displacement information and rotation information of the image acquisition unit.
In concrete implementation, the accelerometer in the terminal device can be used to collect the displacement information of the image acquisition unit, and the gyroscope in the terminal device can be used to collect its rotation information.
The gyroscope collects the rotation of the image acquisition unit over a time t, and the accelerometer collects the linear acceleration of the image acquisition unit over the time t; from these, the displacement of the image acquisition unit over the time t can be estimated.
S5213: determine the real-time position and real-time orientation of the image acquisition unit from its initial position, initial orientation, displacement information, and rotation information.
In concrete implementation, starting from the initial position and initial orientation of the image acquisition unit determined in step S5211, a SLAM (Simultaneous Localization and Mapping) algorithm can be used, as the image acquisition unit moves, to determine its real-time position and real-time orientation on the fly.
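As a hedged sketch of the dead-reckoning in S5212-S5213 only: one first-order integration step of the gyroscope and accelerometer readings. A real implementation would fuse this with visual tracking (SLAM), since raw double integration drifts quickly; all names are illustrative.

```python
import numpy as np

def integrate_imu(R, p, v, gyro, accel, dt):
    """One dead-reckoning step: update orientation R, position p, velocity v.

    gyro:  angular rate (rad/s, body frame) from the gyroscope
    accel: linear acceleration (m/s^2, body frame, gravity already removed)
    """
    wx, wy, wz = gyro * dt
    # First-order small-angle rotation for this step; a production system
    # would use quaternions and re-orthonormalize.
    dR = np.array([[1.0, -wz,  wy],
                   [ wz, 1.0, -wx],
                   [-wy,  wx, 1.0]])
    R = R @ dR
    a_world = R @ accel              # rotate acceleration into the world frame
    p = p + v * dt + 0.5 * a_world * dt ** 2
    v = v + a_world * dt
    return R, p, v
```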
On this basis, in step S522, the controlled devices within the real-time imaging range of the image acquisition unit, together with the real-time imaging range itself, can be determined from the real-time position and orientation of the image acquisition unit, its parameter information, the pre-collected three-dimensional structure information of the current scene, and the control-region three-dimensional position information of the controlled devices contained in the current scene. That is, according to the embodiments of the present invention, when determining the controlled devices within the real-time imaging range, the three-dimensional structure information of the current scene can also be used to compute the imaging range of the image acquisition unit over the current scene, and then, from each controlled device's three-dimensional position in the scene, to compute which controlled devices fall within that real-time imaging range.
The parameter information of the image acquisition unit can include its focal length, sensor size, distortion parameters, and so on.
Preferably, in concrete implementation, the real-time position and real-time orientation of the image acquisition unit can also be determined, while the unit is moving, by the flow shown in Fig. 7a:
S71: during the movement of the image acquisition unit, extract a real-time image containing a second reference object, captured by the image acquisition unit, as a key frame image.
The second reference object can be a high-contrast object contained in an image; an image captured by the image acquisition unit that contains the second reference object is taken as the key frame image. The key frame image is later used to determine the real-time position and real-time orientation of the image acquisition unit at the moment the second reference image is collected. The flow that uses the related information of the key frame image to determine the real-time position and orientation of the image acquisition unit is similar to the flow above that uses the related information of the benchmark image, so it is only introduced briefly below.
S72: store the image information of the key frame image, together with the first position information and the first orientation information of the image acquisition unit when the key frame image was shot.
S73: receive the second image sequence of the current scene captured by the image acquisition unit.
The user continues to roam through the current scene; accordingly, during the movement of the image acquisition unit, the unit captures images of the current scene at a preset frame rate (determined by the terminal's intrinsic parameters) to obtain the second image sequence, and sends the collected second image sequence to the processing unit.
S74: from the received second image sequence, determine a second reference image containing the second reference object of the current scene.
After receiving the scene image sequence sent by the image acquisition unit, the processing unit identifies in it the image containing the second reference object and takes it as the second reference image.
S75: from the image information of the key frame image, the second reference image, the first position information, and the first orientation information, determine the real-time position and real-time orientation of the image acquisition unit at the moment the second reference image is collected.
Specifically, the key frame image and the second reference image can be compared; from the stored image information of the key frame image and the first position information and first orientation information of the image acquisition unit when the key frame image was shot, the real-time position and orientation of the image acquisition unit at the collection of the second reference image are determined.
If the user then continues to roam through the scene, the real-time position and orientation of the image acquisition unit during its movement can be determined from the collected displacement information (acquired with the terminal's accelerometer) and rotation information (acquired with the terminal's gyroscope) of the image acquisition unit, together with its real-time position and orientation at the collection of the second reference image.
From the real-time position and real-time orientation of the image acquisition unit, its real-time imaging range can be determined. Specifically, when the position and orientation of the image acquisition unit change, the angle at which it shoots the scene changes, which leads to a different display viewport (FOV, Field of View). The display viewport is the maximum field-of-view range the image acquisition unit can capture. Fig. 7b is a display viewport schematic: the camera in it is the image acquisition unit of the embodiments of the present invention, and the display viewport comprises a horizontal FOV and a vertical FOV. Only the scene inside the display viewport is presented on the display screen corresponding to the image acquisition unit. The display viewport is bounded by a near plane and a far plane, and the scene between the near plane and the far plane can be presented on the display screen corresponding to the image acquisition unit. Therefore, as the real-time position and orientation of the image acquisition unit differ, the scene image presented on the display screen also differs, because the imaging range of the image acquisition unit differs. From the real-time position and real-time orientation of the image acquisition unit, its real-time imaging range can be determined, and from that the controlled devices within the real-time imaging range can be determined.
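A hedged sketch of the viewport test described around Fig. 7b: a camera-frame point is visible only between the near and far planes and inside the horizontal/vertical FOV; the names and sample values are illustrative.

```python
import numpy as np

def in_display_viewport(pt_cam, hfov_deg, vfov_deg, near, far):
    """Check whether a camera-frame point (x, y, z) lies inside the viewport."""
    x, y, z = pt_cam
    if not (near <= z <= far):           # outside the near/far planes
        return False
    half_w = np.tan(np.radians(hfov_deg) / 2) * z   # half-width at depth z
    half_h = np.tan(np.radians(vfov_deg) / 2) * z   # half-height at depth z
    return abs(x) <= half_w and abs(y) <= half_h

print(in_display_viewport((0.5, 0.2, 2.0), hfov_deg=60, vfov_deg=45,
                          near=0.1, far=100.0))     # True for these values
```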
On this basis, in step S53, the controlled devices determined to lie within the real-time imaging range of the image acquisition unit are marked in the real-time image presented on the display screen by the flow shown in Fig. 8:
S531: according to the pre-collected control-region three-dimensional position information of the controlled devices, convert the control-region three-dimensional position information of the controlled devices within the real-time imaging range into two-dimensional position information in the captured real-time image presented on the display screen.
Similarly to step S522 above, step S531 can also adopt the projection imaging principle from computer vision: p2d = Proj([R|T] p3d), where p3d is the 3D coordinate in the world coordinate system, Proj() is the projection function, and [R|T] is the current camera position and orientation; the output coordinate on the 2D screen is p2d. Accordingly, the eight 3D vertex coordinates of a controlled device suffice to compute its position on the 2D screen.
S532: according to the determined two-dimensional position information, in the captured real-time image presented on the display screen, of the controlled devices within the real-time imaging range, mark the corresponding controlled devices at the corresponding positions in the captured real-time image presented on the display screen.
Fig. 9 shows the real-time image displayed on the screen after the corresponding controlled devices have been marked.
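As a hedged sketch of the marking in S532 using OpenCV drawing calls: the projected vertices are enclosed in a box drawn on the live frame, and the box is kept for later touch hit-tests; the names are illustrative.

```python
import cv2
import numpy as np

def mark_device(frame, p2d, label):
    """Draw a marker for one controlled device on the live frame.

    frame: BGR image presented on the display screen
    p2d:   Nx2 projected control-region vertices (from the projection step)
    """
    x, y, w, h = cv2.boundingRect(p2d.astype(np.int32))
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(frame, label, (x, max(y - 8, 12)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return (x, y, x + w, y + h)      # marked range, reused for touch handling
```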
Optionally, after the corresponding controlled devices have been marked, for each controlled device within the real-time imaging range, the control operation menu corresponding to that device can also be determined, and a hidden corresponding control operation menu can be added for the marked device. Specifically, the control operation menu corresponding to a controlled device can be determined from the stored description information of that device.
More preferably, in response to a touch operation on a controlled device in the captured real-time image presented on the display screen, for example a touch operation within a marked range, the control operation menu of the corresponding controlled device is displayed. Alternatively, if a click is detected at any position within a marked range on the captured real-time image presented on the display screen, the corresponding control operation menu is displayed. Fig. 10 shows the control operation menu of a controlled device displayed in response to a touch operation on that device in the captured real-time image presented on the display screen. In addition, the following menu-triggering mode can also be set: a mark is made at a predetermined position on the display screen, for example a specific marker in the middle of the screen; as the user adjusts the orientation of the phone, the controlled devices in the displayed real-time image move across the display screen, and when a controlled device moves onto the specific marker in the middle of the screen, the corresponding control operation menu is displayed.
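The two triggers just described (a touch inside a marked range, or a device reaching the predetermined screen marker) reduce to the same point-in-region test; a hedged sketch with illustrative names:

```python
def menu_trigger(devices, touch=None, center=None):
    """Return (label, menu) of the device whose menu should be shown, if any.

    devices: list of (label, (x0, y0, x1, y1), menu) for the marked devices
    touch:   (x, y) of a detected touch/click, or None
    center:  (x, y) of the specific marker at the predetermined position, or None
    """
    def inside(pt, box):
        x, y = pt
        x0, y0, x1, y1 = box
        return x0 <= x <= x1 and y0 <= y <= y1

    for label, box, menu in devices:
        if touch is not None and inside(touch, box):
            return label, menu       # trigger 1: touch within the marked range
        if center is not None and inside(center, box):
            return label, menu       # trigger 2: device shown at the marker
    return None
```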
Preferably, in concrete implementation, an augmented reality (AR) mode can be adopted to display the control operation menu corresponding to a controlled device.
Since the controlled devices are shown intuitively on the display screen, the user may be able to judge unaided which items are controlled devices; therefore, in concrete implementation, the controlled devices need not be marked at all, and the control operation menu of the corresponding device is displayed only when the user performs a touch operation where the device is shown. In concrete implementation it is also possible to only mark the controlled devices; or to only make a mark at the predetermined position on the display screen, so that, during the movement of the image acquisition unit, whenever a controlled device is shown at this predetermined position, its corresponding operation control menu is displayed automatically.
It should be noted that the terminal device involved in the embodiments of the present invention can be an ordinary mobile phone, a tablet computer, or the like, or a wearable device such as augmented reality (AR) glasses.
In the smart home device interaction mode provided by the embodiments of the present invention, the real-time image of the current scene can be shown intuitively on the display screen of the terminal device, and the controlled devices contained in it can be marked to help the user identify them. The user can thus locate a controlled device directly through the terminal screen (the display of a mobile phone, tablet computer, and so on), or directly through gestures (when the mode is applied in wearables such as AR glasses). Moreover, in response to the user's touch operation on a controlled device, interaction between the user and the controlled device is realized (reading the device's feedback information, performing control operations, and so on).
Example devices
Having described the method of the exemplary embodiments of the present invention, the smart home device interaction apparatus of the exemplary embodiments of the present invention is described next with reference to Fig. 11.
As shown in Fig. 11, the smart home interaction apparatus provided by the embodiments of the present invention can include:
an image acquisition unit 111, configured to capture images of the current scene, the current scene containing a plurality of controlled devices;
a first display unit 112, configured to present the real-time image captured by the image acquisition unit;
a first determining unit 113, configured to determine the controlled devices within the real-time imaging range of the image acquisition unit 111;
a marking unit 114, configured to mark the controlled devices at their corresponding positions in the real-time image presented on the display screen; and/or
a second display unit 115, configured to display the corresponding control operation menu in response to a touch operation on a controlled device in the real-time image, and/or to display the corresponding control operation menu in response to a controlled device in the real-time image being shown at a predetermined position on the display screen.
The first determining unit 113 can include:
a first determining subunit 1131, configured to determine the real-time position and real-time orientation of the image acquisition unit in the current scene;
a second determining subunit 1132, configured to determine the controlled devices within the real-time imaging range of the image acquisition unit from the real-time position and real-time orientation, the parameter information of the image acquisition unit, and the pre-collected control-region three-dimensional position information of the controlled devices contained in the scene.
In concrete implementation, the second determining subunit 1132 is specifically configured to determine the controlled devices within the real-time imaging range of the image acquisition unit, together with the real-time imaging range itself, from the real-time position and real-time orientation, the parameter information of the image acquisition unit, the pre-collected three-dimensional structure information of the current scene, and the control-region three-dimensional position information of the controlled devices contained in the scene.
In concrete implementation, the first determining subunit 1131 includes:
a first determining module, configured to determine the initial position and initial orientation of the image acquisition unit;
an acquisition module, configured to collect the displacement information and rotation information of the image acquisition unit;
a second determining module, configured to determine the real-time position and real-time orientation of the image acquisition unit from the initial position, initial orientation, displacement information, and rotation information.
The first determining module includes:
a first receiving submodule, configured to receive the first image sequence of the current scene captured by the image acquisition unit;
a first determining submodule, configured to determine, from the first image sequence, a first reference image containing the first reference object of the current scene;
a first computing submodule, configured to calculate the initial position and initial orientation of the image acquisition unit from the preset benchmark image containing the first reference object and the first reference image, together with the position information of the first reference object and the angle information at the time the benchmark image was shot.
Optionally, the first determining module further includes an extraction submodule, a storage submodule, a second receiving submodule, a second determining submodule, and a second computing submodule, where:
the extraction submodule is configured to extract, during the movement of the image acquisition unit, a real-time image containing the second reference object captured by the image acquisition unit, as the key frame image;
the storage submodule is configured to store the image information of the key frame image, together with the first position information and the first orientation information of the image acquisition unit when the key frame image was shot;
the second receiving submodule is configured to receive the second image sequence of the current scene captured by the image acquisition unit;
the second determining submodule is configured to determine, from the second image sequence, a second reference image containing the second reference object of the current scene;
the second computing submodule is specifically configured to determine, from the image information of the key frame image, the second reference image, the first position information, and the first orientation information, the real-time position and real-time orientation of the image acquisition unit at the moment the second reference image is collected.
Preferably, the first determining unit 113 can also include a recognition subunit 1133, configured to recognize, according to image information of the controlled devices collected in advance, the controlled devices in the captured real-time image presented on the display screen.
In concrete implementation, the marking unit 114 includes:
a conversion subunit 1141, configured to convert, according to the pre-collected control-region three-dimensional position information of the controlled devices, the control-region three-dimensional position information of the controlled devices within the real-time imaging range into two-dimensional position information in the captured real-time image presented on the display screen;
a marking subunit 1142, configured to mark, according to the two-dimensional position information, the controlled devices at their corresponding positions in the captured real-time image presented on the display screen.
Optionally, the smart home device interaction apparatus provided by the embodiment of the present invention may further include:
An obtaining unit 116, configured to obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene;
An importing unit 117, configured to import the three-dimensional structure information of the scene and the three-dimensional position information of each device into a preset editing tool;
A selecting unit 118, configured to select, in the editing tool, the controlled devices and the control regions of the controlled devices from among the devices;
A storage unit 119, configured to store the control-region three-dimensional position information of the controlled devices.
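One plausible shape, with hypothetical field names, for the per-device record that the editing tool emits and the storage unit 119 persists:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class ControlRegion:
    """One record per controlled device: which device is controllable and
    where its control region sits in the scene's 3D coordinate system."""
    device_id: str
    device_name: str
    corners: List[Point3D] = field(default_factory=list)  # region outline

# A stored scene might then simply be a list of such records, e.g.:
scene_regions = [
    ControlRegion("lamp-01", "living room lamp",
                  [(1.2, 0.0, 2.5), (1.5, 0.0, 2.5),
                   (1.5, 0.4, 2.5), (1.2, 0.4, 2.5)]),
]
```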
The obtaining unit 116 includes:
A receiving subunit 1161, configured to receive images captured of the scene, or to receive the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene as provided by a user;
A reconstruction subunit 1162, configured to perform three-dimensional reconstruction on the images received by the receiving subunit to obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene.
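The reconstruction subunit could, for instance, obtain scene structure by triangulating feature matches across pairs of the received images. A minimal two-view sketch using OpenCV, in which each view's pose is assumed known (a full pipeline would estimate the poses, for example by structure from motion):

```python
import cv2
import numpy as np

def triangulate(K, R1, t1, R2, t2, pts1, pts2):
    """Recover 3D points from matched pixel coordinates (N x 2 arrays) in two
    captured images of the scene, given each view's world-to-camera rotation
    R and translation t. Repeating this over many image pairs yields the
    scene structure the obtaining unit needs."""
    P1 = K @ np.hstack([R1, np.reshape(t1, (3, 1))])
    P2 = K @ np.hstack([R2, np.reshape(t2, (3, 1))])
    h = cv2.triangulatePoints(P1, P2,
                              np.asarray(pts1, dtype=float).T,
                              np.asarray(pts2, dtype=float).T)
    return (h[:3] / h[3]).T  # N x 3 Euclidean points
```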
Optionally, the home device interaction apparatus provided by the embodiment of the present invention may further include:
An adding unit 1120, configured to, after the marking unit has marked the controlled devices, determine, for each controlled device located within the real-time imaging range, the control action menu corresponding to that controlled device, and attach the corresponding control action menu, initially hidden, to that marked controlled device.
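The bookkeeping done by the adding unit 1120 can be pictured as attaching an initially hidden menu record to every marked device; the menu contents and device-type mapping below are illustrative assumptions:

```python
# Hypothetical menu definitions keyed by device type; the adding unit would
# attach one such (initially hidden) menu to every marked controlled device.
CONTROL_MENUS = {
    "lamp":            ["power on", "power off", "brightness"],
    "air_conditioner": ["power on", "power off", "temperature", "mode"],
}

def attach_menus(marked_devices, device_types):
    """Return a per-device menu record, hidden until the user taps the mark."""
    return {
        dev_id: {"items": CONTROL_MENUS[device_types[dev_id]], "visible": False}
        for dev_id in marked_devices
    }
```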
In a specific implementation, the second display unit 115 is further configured to display the corresponding control action menu when a click is detected at any position within a mark range on the collected real-time image presented on the display screen. Preferably, the second display unit 115 may be configured to display the corresponding control action menu in an augmented reality (AR) manner.
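The click handling of the second display unit 115 then reduces to a hit test of the tap coordinates against the on-screen mark rectangles, revealing the menu record attached above; names are hypothetical:

```python
def on_tap(x, y, marks, menus):
    """If the tap (x, y) lands inside any device's mark rectangle on the
    screen, reveal that device's control action menu (e.g. as an AR overlay).
    `marks` maps device ids to (left, top, right, bottom) rectangles."""
    for dev_id, (left, top, right, bottom) in marks.items():
        if left <= x <= right and top <= y <= bottom:
            menus[dev_id]["visible"] = True
            return dev_id
    return None
```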
The smart home interaction apparatus provided by the embodiment of the present invention may be arranged in a terminal device (such as the above-mentioned mobile phone, tablet computer, or a wearable device such as AR glasses).
Having described the smart home device interaction method and apparatus provided by the present invention, a smart home device interaction apparatus according to another exemplary embodiment of the present invention is introduced next.
Those skilled in the art will understand that various aspects of the present invention may be implemented as a system, a method, or a program product. Therefore, various aspects of the present invention may take the following forms: a complete hardware implementation, a complete software implementation (including firmware, microcode, etc.), or an implementation combining hardware and software aspects, which may be collectively referred to herein as a "circuit", "module" or "system".
In some possible embodiments, the smart home interaction apparatus according to the present invention may include at least one processing unit and at least one storage unit. The storage unit stores program code which, when executed by the processing unit, causes the processing unit to perform the steps of the smart home device interaction method according to the various exemplary embodiments of the present invention described in the above "Exemplary Methods" part of this specification. For example, the processing unit may perform step S51 shown in Figure 5: performing image acquisition on the current scene through the image acquisition unit, and presenting the collected real-time image on the display screen corresponding to the image acquisition unit; step S52: determining the controlled devices within the real-time imaging range of the image acquisition unit; and step S53: marking the controlled devices at the corresponding positions in the collected real-time image presented on the display screen, and/or displaying the corresponding control action menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control action menu at a predetermined position of the display screen in response to the controlled device in the real-time image.
It should be noted that although several units or subunits of the apparatus are mentioned in the above detailed description, this division is merely exemplary and not mandatory. In fact, according to the embodiments of the present invention, the features and functions of two or more units described above may be embodied in one unit. Conversely, the features and functions of one unit described above may be further divided and embodied by multiple units.
In addition, although the operations of the method of the present invention are described in a particular order in the drawings, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Referring to Figure 12, a smart home device interaction apparatus 120 according to an embodiment of the present invention is described. The apparatus shown in Figure 12 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in Figure 12, the smart home device interaction apparatus 120 may take the form of a general-purpose computing device. The components of the smart home device interaction apparatus 120 may include, but are not limited to: the above-mentioned at least one processing unit 121, the above-mentioned at least one storage unit 122, and a bus 123 connecting the different system components (including the storage unit 122 and the processing unit 121).
The bus 123 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus structures.
The storage unit 122 may include a computer-readable storage medium in the form of volatile memory, for example a random access memory (RAM) 1221 and/or a cache memory 1222, and may further include a read-only memory (ROM) 1223.
The storage unit 122 may also include a program/utility 1225 having a set of (at least one) program modules 1224. Such program modules 1224 include, but are not limited to: an operating system, one or more application programs, other program modules and program data; each or some combination of these examples may include an implementation of a network environment.
The smart home device interaction apparatus 120 may also communicate with one or more external devices 124 (such as a keyboard, a pointing device, etc.), may also communicate with one or more devices that enable a user to interact with the smart home device interaction apparatus 120, and/or may communicate with any device (such as a router, a modem, etc.) that enables the smart home device interaction apparatus 120 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 125. Furthermore, the smart home device interaction apparatus 120 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through a network adapter 126. As shown, the network adapter 126 communicates with the other modules of the smart home device interaction apparatus 120 through the bus 123. It should be understood that, although not shown, other hardware and/or software modules may be used in conjunction with the smart home device interaction apparatus 120, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, etc.
Exemplary Program Product
In some possible embodiments, various aspects of the smart home device interaction method provided by the present invention may also be implemented in the form of a program product, which includes program code. When the program product runs on a terminal device, the program code causes the terminal device to perform the steps of the smart home device interaction method according to the various exemplary embodiments of the present invention described in the above "Exemplary Methods" part of this specification. For example, the terminal device may perform step S51 shown in Figure 5: performing image acquisition on the current scene through the image acquisition unit, and presenting the collected real-time image on the display screen corresponding to the image acquisition unit; step S52: determining the controlled devices within the real-time imaging range of the image acquisition unit; and step S53: marking the controlled devices at the corresponding positions in the collected real-time image presented on the display screen, and/or displaying the corresponding control action menu in response to a touch operation on a controlled device in the real-time image, and/or displaying the corresponding control action menu at a predetermined position of the display screen in response to the controlled device in the real-time image.
The program product may employ any combination of one or more computer-readable storage media. The computer-readable storage medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
As shown in Figure 13, a program product 130 for smart home device interaction according to an embodiment of the present invention is described. It may employ a portable compact disc read-only memory (CD-ROM), includes program code, and may run on a terminal device. However, the program product of the present invention is not limited to this; in this document, a readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by, or in connection with, an instruction execution system, apparatus or device.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which readable program code is carried. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any computer-readable storage medium other than a readable storage medium, which can send, propagate or transmit a program for use by, or in connection with, an instruction execution system, apparatus or device.
The program code contained on the computer-readable storage medium may be transmitted by any suitable medium, including but not limited to wireless, wired, optical cable, RF, etc., or any suitable combination of the above.
The program code for carrying out the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In cases involving a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
In addition, although the operations of the method of the present invention are described in a particular order in the drawings, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Although the spirit and principles of the present invention have been described with reference to several specific embodiments, it should be understood that the present invention is not limited to the disclosed specific embodiments, and the division into aspects does not mean that the features in those aspects cannot be combined to advantage; such division is merely for convenience of expression. The present invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (14)

1. A smart home device interaction method, comprising:
performing image acquisition on a current scene through an image acquisition unit, and presenting the collected real-time image on a display screen corresponding to the image acquisition unit, the current scene containing multiple controlled devices;
determining the controlled devices within the real-time imaging range of the image acquisition unit;
marking the controlled devices at the corresponding positions in the collected real-time image presented on the display screen, and/or displaying a corresponding control action menu in response to a touch operation on a controlled device in the real-time image, and/or displaying a corresponding control action menu at a predetermined position of the display screen in response to the controlled device in the real-time image.
2. The method according to claim 1, wherein determining the controlled devices within the real-time imaging range of the image acquisition unit comprises:
determining the real-time position and real-time orientation of the image acquisition unit in the current scene;
determining the controlled devices within the real-time imaging range of the image acquisition unit according to the real-time position, the real-time orientation, the parameter information of the image acquisition unit, and the control-region three-dimensional position information of the controlled devices contained in the scene collected in advance.
3. The method according to claim 2, wherein determining the controlled devices within the real-time imaging range of the image acquisition unit according to the real-time position, the real-time orientation, the parameter information of the image acquisition unit and the control-region three-dimensional position information of the controlled devices contained in the scene collected in advance comprises:
determining the controlled devices within the real-time imaging range of the image acquisition unit, and the real-time imaging range itself, according to the real-time position, the real-time orientation, the parameter information of the image acquisition unit, the three-dimensional structure information of the current scene collected in advance, and the control-region three-dimensional position information of the controlled devices contained in the current scene.
4. The method according to claim 2 or 3, wherein determining the real-time position and real-time orientation of the image acquisition unit in the current scene comprises:
determining an initial position and an initial orientation of the image acquisition unit;
collecting displacement information and rotation information of the image acquisition unit;
determining the real-time position and real-time orientation of the image acquisition unit according to the initial position, the initial orientation, the displacement information and the rotation information.
5. The method according to claim 4, wherein determining the initial position and initial orientation of the image acquisition unit comprises:
receiving a first image sequence of the current scene collected by the image acquisition unit;
determining, from the first image sequence, a first reference image containing a first reference object in the current scene;
calculating the initial position and initial orientation of the image acquisition unit according to a preset benchmark image containing the first reference object, the first reference image, and the position information and angle information of the first reference object at the time the benchmark image was captured.
6. The method according to claim 5, further comprising:
during movement of the image acquisition unit, extracting a real-time image containing a second reference object collected by the image acquisition unit as a key frame image; and
storing the image information of the key frame image, together with first position information and first orientation information of the image acquisition unit at the time the key frame image was captured; and
the method further comprising:
receiving a second image sequence of the current scene collected by the image acquisition unit;
determining, from the second image sequence, a second reference image containing the second reference object in the current scene;
determining, according to the image information of the key frame image, the second reference image, the first position information and the first orientation information, the real-time position and real-time orientation of the image acquisition unit at the time the second reference image is collected.
7. The method according to claim 1, wherein determining the controlled devices within the real-time imaging range of the image acquisition unit specifically comprises:
recognizing, according to image information of the controlled devices collected in advance, the controlled devices in the collected real-time image presented on the display screen.
8. The method according to claim 3, wherein marking the controlled devices at the corresponding positions in the collected real-time image presented on the display screen comprises:
converting, according to the control-region three-dimensional position information of the controlled devices collected in advance, the control-region three-dimensional position information of each controlled device located within the real-time imaging range into two-dimensional position information in the collected real-time image presented on the display screen;
marking the controlled device, according to the two-dimensional position information, at the corresponding position in the collected real-time image presented on the display screen.
9. The method according to claim 2 or 8, wherein the control-region three-dimensional position information of the controlled devices is collected according to the following procedure:
obtaining the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene;
importing the three-dimensional structure information of the scene and the three-dimensional position information of each device into a preset editing tool;
selecting, in the editing tool, the controlled devices and the control regions of the controlled devices from among the devices;
storing the control-region three-dimensional position information of the controlled devices.
10. The method according to claim 9, wherein obtaining the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene comprises:
receiving images captured of the scene, and performing three-dimensional reconstruction on the received images to obtain the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene; or
receiving the three-dimensional structure information of the scene and the three-dimensional position information of each device in the scene as provided by a user.
11. The method according to claim 1, further comprising, after marking the controlled devices:
determining, for each controlled device located within the real-time imaging range, the control action menu corresponding to that controlled device, and attaching the corresponding control action menu, initially hidden, to that marked controlled device.
12. The method according to claim 11, wherein displaying the corresponding control action menu in response to a touch operation on the controlled device in the real-time image comprises:
displaying the corresponding control action menu if a click is detected at any position within a mark range on the collected real-time image presented on the display screen.
13. The method according to any one of claims 1 to 3 and 5 to 12, wherein displaying the corresponding control action menu comprises:
displaying the corresponding control action menu in an augmented reality (AR) manner.
14. A smart home device interaction apparatus, comprising:
an image acquisition unit, configured to perform image acquisition on a current scene, the current scene containing multiple controlled devices;
a first display unit, configured to present the real-time image collected by the image acquisition unit;
a first determining unit, configured to determine the controlled devices within the real-time imaging range of the image acquisition unit;
a marking unit, configured to mark the controlled devices at the corresponding positions in the collected real-time image presented on the display screen; and/or
a second display unit, configured to display a corresponding control action menu in response to a touch operation on a controlled device in the real-time image, and/or to display a corresponding control action menu at a predetermined position of the display screen in response to the controlled device in the real-time image.
CN201610130884.1A 2016-03-08 2016-03-08 Smart home device interaction method and apparatus Active CN105760106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610130884.1A CN105760106B (en) Smart home device interaction method and apparatus

Publications (2)

Publication Number Publication Date
CN105760106A true CN105760106A (en) 2016-07-13
CN105760106B CN105760106B (en) 2019-01-15

Family

ID=56332736

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610130884.1A Active CN105760106B (en) Smart home device interaction method and apparatus

Country Status (1)

Country Link
CN (1) CN105760106B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103135754A (en) * 2011-12-02 2013-06-05 深圳泰山在线科技有限公司 Interactive device and method for interaction achievement with interactive device
KR101358064B1 (en) * 2012-10-29 2014-02-11 한국과학기술원 Method for remote controlling using user image and system of the same
CN104461318A (en) * 2013-12-10 2015-03-25 苏州梦想人软件科技有限公司 Touch read method and system based on augmented reality technology
CN205028239U (en) * 2014-12-10 2016-02-10 杭州凌手科技有限公司 Interactive all -in -one of virtual reality intelligence projection gesture
CN104851134A (en) * 2015-05-18 2015-08-19 天机数码创新技术有限公司 Augmented reality system with virtual trigger and real object trigger in combination and method thereof
CN105096180A (en) * 2015-07-20 2015-11-25 北京易讯理想科技有限公司 Commodity information display method and apparatus based augmented reality

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109804411A (en) * 2016-08-30 2019-05-24 C3D增强现实解决方案有限公司 System and method for positioning and mapping simultaneously
CN109804411B (en) * 2016-08-30 2023-02-17 斯纳普公司 System and method for simultaneous localization and mapping
CN106406805A (en) * 2016-09-13 2017-02-15 广东欧珀移动通信有限公司 Method for grouping sound boxes and mobile terminal
CN106406805B (en) * 2016-09-13 2019-05-14 Oppo广东移动通信有限公司 A kind of method and mobile terminal of speaker grouping
CN106445298A (en) * 2016-09-27 2017-02-22 三星电子(中国)研发中心 Visual operation method and device for internet-of-things device
CN109906435A (en) * 2016-11-08 2019-06-18 夏普株式会社 Mobile member control apparatus and moving body control program
CN106502118A (en) * 2016-12-21 2017-03-15 惠州Tcl移动通信有限公司 A kind of intelligent home furnishing control method and system based on AR photographic head
CN107065522A (en) * 2016-12-29 2017-08-18 冶金自动化研究设计院 Non-linear switching two-time scale system obscures slow state feedback H∞ control method
CN107065522B (en) * 2016-12-29 2020-06-16 冶金自动化研究设计院 Fuzzy slow state feedback H-infinity control method for nonlinear switching double-time scale system
CN107797661A (en) * 2017-09-14 2018-03-13 珠海格力电器股份有限公司 A kind of plant control unit, method and mobile terminal
CN109727321A (en) * 2017-10-31 2019-05-07 中兴通讯股份有限公司 Settlement method and device
CN108548300B (en) * 2018-03-02 2020-01-07 珠海格力电器股份有限公司 Air supply method and device of air conditioner and electronic equipment
CN108548300A (en) * 2018-03-02 2018-09-18 珠海格力电器股份有限公司 Air-conditioning method and device, electronic equipment
WO2019178863A1 (en) * 2018-03-23 2019-09-26 深圳市大疆创新科技有限公司 Control method, control device, control system, and computer readable storage medium
CN110462574A (en) * 2018-03-23 2019-11-15 深圳市大疆创新科技有限公司 Control method, control equipment, control system and computer readable storage medium
CN112041803A (en) * 2018-04-30 2020-12-04 三星电子株式会社 Electronic device and operation method thereof
CN111314398A (en) * 2018-12-11 2020-06-19 阿里巴巴集团控股有限公司 Equipment control method, network distribution method, system and equipment
CN109491263A (en) * 2018-12-13 2019-03-19 深圳绿米联创科技有限公司 Intelligent home equipment control method, device, system and storage medium
CN109491263B (en) * 2018-12-13 2022-06-03 深圳绿米联创科技有限公司 Intelligent household equipment control method, device and system and storage medium
CN109814822A (en) * 2018-12-24 2019-05-28 维沃移动通信有限公司 A kind of multimedia control method for playing back, device and terminal device
CN109814822B (en) * 2018-12-24 2021-08-24 维沃移动通信有限公司 Multimedia play control method and device and terminal equipment
CN110135238B (en) * 2019-03-26 2021-04-06 浙江工业大学 Markless Internet of things equipment identification method based on mobile AR
CN110135238A (en) * 2019-03-26 2019-08-16 浙江工业大学 A kind of marker free networked devices recognition methods based on mobile AR
CN112051919A (en) * 2019-06-05 2020-12-08 北京外号信息技术有限公司 Interaction method and interaction system based on position
CN112051919B (en) * 2019-06-05 2022-10-18 北京外号信息技术有限公司 Interaction method and interaction system based on position
TWI743853B (en) * 2019-06-28 2021-10-21 大陸商上海商湯智能科技有限公司 Device control method, electronic device and medium thereof
CN111045344A (en) * 2019-12-31 2020-04-21 维沃移动通信有限公司 Control method of household equipment and electronic equipment
WO2021227068A1 (en) * 2020-05-15 2021-11-18 北京小米移动软件有限公司 Map acquiring method and device for internet of things device
CN111787081B (en) * 2020-06-21 2021-03-23 江苏永鼎通信有限公司 Information processing method based on Internet of things interaction and intelligent communication and cloud computing platform
CN111787081A (en) * 2020-06-21 2020-10-16 张伟 Information processing method based on Internet of things interaction and intelligent communication and cloud computing platform
CN112346630A (en) * 2020-10-27 2021-02-09 北京有竹居网络技术有限公司 State determination method, device, equipment and computer readable medium
CN112817547A (en) * 2021-01-22 2021-05-18 北京小米移动软件有限公司 Display method and device, and storage medium

Also Published As

Publication number Publication date
CN105760106B (en) 2019-01-15

Similar Documents

Publication Publication Date Title
CN105760106A (en) Interaction method and interaction device of intelligent household equipment
US11100649B2 (en) Fiducial marker patterns, their automatic detection in images, and applications thereof
US20220044019A1 (en) Augmented reality smartglasses for use at cultural sites
KR20140109020A (en) Apparatus amd method for constructing device information for smart appliances control
CN105981076B (en) Synthesize the construction of augmented reality environment
EP3548993A1 (en) Virtual sensor configuration
JP6348741B2 (en) Information processing system, information processing apparatus, information processing program, and information processing method
CN105631773A (en) Electronic device and method for providing map service
KR20130110907A (en) Apparatus and method for remote controlling based on virtual reality and augmented reality
CN104301661B (en) A kind of smart home monitoring method, client and related device
KR101181967B1 (en) 3D street view system using identification information.
CN104460330A (en) Augmented reality and remote control method based on indoor positioning and electronic compass
CN106249611A (en) A kind of Smart Home localization method based on virtual reality, device and system
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
JP7150980B2 (en) Information device interaction method and system based on optical label
CN115935458A (en) Automatic exchange and use of attribute information between multiple types of architectural images
CN107590337A (en) A kind of house ornamentation displaying interactive approach and device
CN106468917A (en) A kind of tangible live real-time video image remotely assume exchange method and system
CN108548267B (en) Air conditioner control method and user terminal
WO2018006481A1 (en) Motion-sensing operation method and device for mobile terminal
KR102166586B1 (en) Mobile Augmented Reality Service Apparatus and Method Using Deep Learning Based Positioning Technology
CN115731349A (en) Method and device for displaying house type graph, electronic equipment and storage medium
JP2013004001A (en) Display control device, display control method, and program
US11756260B1 (en) Visualization of configurable three-dimensional environments in a virtual reality system
KR101036107B1 (en) Emergency notification system using rfid

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20210518

Address after: 311200 Room 102, 6 Blocks, C District, Qianjiang Century Park, Xiaoshan District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou Yixian Advanced Technology Co.,Ltd.

Address before: 310052 Building No. 599, Changhe Street Network Business Road, Binjiang District, Hangzhou City, Zhejiang Province, 4, 7 stories

Patentee before: NETEASE (HANGZHOU) NETWORK Co.,Ltd.