WO2019213855A1 - Device control method and system - Google Patents

Device control method and system

Info

Publication number
WO2019213855A1
WO2019213855A1 (PCT/CN2018/086104)
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
control system
device control
module
Prior art date
Application number
PCT/CN2018/086104
Other languages
English (en)
French (fr)
Inventor
方超 (Fang Chao)
Original Assignee
Fang Chao
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fang Chao filed Critical Fang Chao
Priority to PCT/CN2018/086104 priority Critical patent/WO2019213855A1/zh
Priority to US17/053,803 priority patent/US20210224368A1/en
Publication of WO2019213855A1 publication Critical patent/WO2019213855A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/10544 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K 7/10712 Fixed beam scanning
    • G06K 7/10722 Photodetector array or CCD scanning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K 7/1417 2D bar codes
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 20/00 Information sensed or collected by the things
    • G16Y 20/10 Information sensed or collected by the things relating to the environment, e.g. temperature; relating to location
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 40/00 IoT characterised by the purpose of the information processing
    • G16Y 40/30 Control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2221/00 Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/21 Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/2111 Location-sensitive, e.g. geographical location, GPS

Definitions

  • the present invention relates to the field of Internet of Things applications, and in particular, to a device control method and system.
  • the operating systems of smart terminals include the Android system developed by Google, Apple's iOS system, and Microsoft's Windows system. Using the applications of these systems, smart devices are controlled through an app on the smart terminal, but this kind of interaction is relatively poor and the interaction mode is monotonous.
  • a device control system for controlling devices in the Internet of Things and recognizing target images, comprising: an information collection unit and a control unit communicably connected to the information collection unit, wherein the information collection unit is configured to collect information about the user in the current environment and transmit the information to the control unit, and the control unit analyzes the user's intention to use the device according to the information and controls the corresponding device to perform a corresponding action according to that intention.
  • the information collecting unit includes
  • An environment collection module for collecting environmental information and device information
  • An identity authentication collection module configured to collect user identity information
  • a user analysis module for identifying users in the environment and identifying the user's intent features
  • a first screen module for outputting an image
  • the control unit is connected to the environment collection module, the identity authentication collection module, and the user analysis module; the control unit presets operation instructions corresponding to intent features, and is used to analyze the environment information collected by the environment collection module to locate the user's position, to compare the user's intent features with the preset intent features and send the corresponding operation instruction to the device, and to verify the user identity feature information to determine whether the user is authorized to control the device control system.
  • the information collecting unit further includes
  • a first wireless module for connecting to devices within the Internet of Things
  • the first screen module is for outputting an image.
  • the identity authentication collection module is an infrared camera, and the infrared camera is configured to collect a user finger vein image and a joint texture image.
  • the device control system further includes an output unit communicatively connecting the information collection unit and the control unit, the output unit including
  • Audio module for audio capture and audio output
  • a second screen module for outputting an image
  • An output control module is coupled to the audio module, the second screen module, and the second wireless module for generating an operation command to control the device.
  • when the information collection unit and the output unit are connected, the image presented by the first screen module is displayed on the second screen module, and the first screen module does not display an image.
  • the device control system is coupled to a server for processing data information of the device control system, and the device control system controls a device connected to the server through the server.
  • a device control method applied to a device control system includes the following steps:
  • collecting information about the user in the current environment;
  • analyzing the user's intention to use the device according to the information, and controlling the corresponding device to perform a corresponding action according to that intention.
  • the method further includes
  • the device control system responds to the user input command
  • the device control system does not respond to user input commands.
  • the step in which the control unit analyzes the user's intention to use the device according to the information, and controls the corresponding device to perform the corresponding action, further includes the following.
  • the device control system and/or the server recognizes the target image, and the software corresponding to the target image in the device control system is turned on.
  • with the device control method and system provided by the present invention, the user connects to devices through the device control system; the device control system acquires the positional relationship between the device and the user together with the device information, and recognizes the user's intent features to control the device, making interaction between the user and the device more convenient and faster.
  • FIG. 1 is a schematic diagram of a device control system according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of the information collection unit of FIG. 1;
  • Figure 3 is a schematic view of the output unit of Figure 1;
  • FIG. 4 is a diagram of a device control method according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of a device control system based on an Internet of Things according to an embodiment of the present invention, for controlling a device 400 in the Internet of Things, which may be, but is not limited to, a vehicle, a household appliance, furniture, office supplies, electronic equipment, etc.
  • the device control system includes: an information collection unit 100 and a control unit 300 communicatively coupled to the information collection unit 100, the information collection unit 100 is configured to collect information of a user in a current environment and send the information to The control unit 300 analyzes the user's intention to use the device according to the information, and controls the corresponding device 400 to perform a corresponding action according to the use intention.
  • the device control system includes an information collection unit 100, an output unit 200, and a control unit 300.
  • the information collection unit 100, the output unit 200, and the control unit 300 are communicatively coupled.
  • the device control system is coupled to a server for processing data information of the device control system.
  • the device control system is in communication with the device 400, is capable of detecting and identifying the device 400, and can control the device 400 according to predetermined trigger conditions.
  • the device control system and the device 400 may be connected through an internal closed network or through the Internet.
  • the device control system can be connected to an intelligent terminal, and the smart terminal can control the device 400 to which the device control system is connected.
  • the smart terminal includes a mobile phone, a tablet, a computer or other smart device. In one embodiment, the smart terminal is a mobile phone.
  • FIG. 2 is a schematic diagram of the information collecting unit 100 of FIG. 1.
  • the information collection unit 100 includes an environment collection module 120, an identity authentication collection module 110, a user analysis module 130, a first wireless module 140, and a first screen module 150.
  • the environment collection module 120 is configured to collect environment information and device 400 information;
  • the identity authentication collection module 110 is configured to collect user identity feature information;
  • the user analysis module 130 is configured to identify a user in the environment and identify a user's intention feature.
  • the first screen module 150 is configured to output an image;
  • the first wireless module 140 is configured to be connected to the device 400 in the Internet of Things.
  • the control unit 300 is communicatively connected to each module in the information collection unit 100, and the control unit 300 presets an operation instruction corresponding to the intent feature, and is used to analyze the environment information collected by the environment collection module 120 to locate the user location. And comparing the user intention feature to the preset intention feature to send a corresponding operation instruction to the device 400, and verifying the user identity feature information to analyze whether the user has the authority to control the device control system.
  • the identity authentication collection module 110 sends the collected user identity feature information to the control unit 300.
  • user identity feature information is preset in the control unit 300, and the user identity feature information collected by the identity authentication collection module 110 is compared with the preset information: if they do not match, the system does not respond to user input instructions; if they match, the system responds to user input instructions.
  • the identity authentication collection module 110 is an infrared camera for collecting a user finger vein image and a joint texture image. It can be understood that the identity authentication collection module 110 can also be other biometric authentication information recognition devices such as an iris recognizer, a fingerprint recognizer, and a face recognition.
  • the control unit 300 is provided with a vision processor.
  • the control unit 300 receives the device 400 information and the environment information collected by the environment collection module 120, analyzes them through the vision processor, and identifies fixed landmarks in the environment information to locate the user's position.
  • the fixed landmarks include household appliances, furniture, street signs, traffic lights, trees, buildings, or other stationary objects.
  • the control unit 300 builds a spatial rectangular coordinate system from the environment information and device 400 information collected by the environment collection module 120, in which the fixed landmarks in the device 400 information and the environment information are modeled and assigned corresponding parameters so as to locate the user's position.
  • the user has a six-degree-of-freedom pose in the spatial rectangular coordinate system; the control unit 300 analyzes the environment information, locates the relationship between the user's six-degree-of-freedom pose and the fixed landmarks, and determines the positional relationship between the device 400 and the user.
  • the control unit 300 controls the device 400 to act according to the positional relationship between the device 400 and the user.
  • the user analysis module 130 sends a user intent feature to the control unit 300, and an operation instruction corresponding to the intent feature is pre-set in the control unit 300.
  • the intent features are gestures together with spatial pose information of the wrist and/or the user's binocular focus information.
  • the control unit compares the user's gesture and wrist pose information and/or binocular focus information with the gesture and wrist pose information preset in the control unit 300: if they match, the corresponding operation instruction is sent to the device 400; if not, no operation instruction is sent to the device 400, and the user is prompted that the gesture was not recognized.
  • the intent features may also take the form of the user's binocular focus information, body movements, voice passwords, or other expressions of user intent, as long as the control unit 300 can match the recognized intent feature to a preset operation instruction.
  • the prompt may also be a vibration prompt, a voice prompt, a light prompt, a text prompt, an image prompt, or another kind of prompt.
  • the user analysis module 130 may designate a local region of the information collected by the environment collection module 120 to be submitted to the control unit for processing, reducing the amount of image-recognition computation and discarding useless images, so that the device control system recognizes devices 400 faster and more accurately.
  • FIG. 3 is a schematic diagram of the output unit 200 of FIG.
  • the output unit 200 includes: an audio module 210 for audio capture and output; a second screen module 230 for outputting images; a second wireless module 220 for connecting to the device 400; and an output control module 240 connected to the second screen module 230 and the audio module 210, for generating operation instructions to control the device 400.
  • the output unit 200 further includes: a second wireless module 220, and the second wireless module 220 is configured to be connected to the device 400 in the Internet of Things.
  • the output unit 200 further includes a second environment collection module 250, which sends the collected compensating environment information to the control unit 300; the control unit 300 analyzes the compensating environment information against the environment information collected by the information collection unit so as to locate the user more precisely.
  • the audio module 210 can also recognize the user's voice and send it to the output control module 240, in which operation instructions corresponding to voices are preset; the user's voice is compared with the voice preset in the control unit 300.
  • if the collected user identity feature information is consistent with the user identity feature information preset in the control unit 300, the device control system responds to the user input instruction; if not, the device control system does not respond to the user input instruction and prompts the user that the voice was not recognized.
  • the prompting manner can also be a vibration prompt, a voice prompt, a light prompt, a text prompt, an image prompt, or other prompting manner.
  • the audio module 210 and the user analysis module 130 in the information collection unit 100 can be used in combination to achieve the fastest input result.
  • the image information obtained by the environment collection module 120 is analyzed by the control unit 300 to obtain the user's six-degree-of-freedom pose in the spatial coordinate system, i.e. the user positioning data.
  • from this, data such as the distance and direction of the user relative to the controlled device 400 are obtained.
  • instructions controlling the device 400 are then issued automatically.
  • for example, the device control system is connected to a smart toilet.
  • when the user approaches the smart toilet, the device control system sends an open instruction to the smart toilet and the toilet lid opens; when the user finishes using the smart toilet and leaves, the device control system sends a close instruction to the smart toilet, and the toilet lid closes and the toilet flushes.
  • as another example, the device control system is connected to a smart desk lamp: when the user sits at the desk, the device control system sends an on instruction to the smart desk lamp and the lamp turns on; when the user leaves the desk, the device control system sends an off instruction to the smart desk lamp and the lamp turns off.
  • the user uses the device control system, and the user determines the controlled device 400 by means of image recognition, and the device control system connects the determined device 400 to control the device 400.
  • for example, the user indicates with a specific gesture that the smart TV is to be controlled; the device control system controls the smart TV and displays its control menu on the output unit 200, from which the user can select items with corresponding gestures and/or via the touch screen.
  • as another example, the device control system builds a virtual reality from previously collected environment information, or acquires real-time environment information through a proxy camera; the output unit 200 displays the virtual reality of another room, and the user points at a device 400. The device control system performs image recognition on the device 400 and connects to the determined device 400; the output unit 200 shows the image of the selected device 400 in the virtual reality, and when the user controls the device 400 with a specific gesture, the image of the device 400 in the virtual reality acts according to the operation instruction corresponding to that gesture.
  • the user uses the device control system to acquire a target image; the target image is recognized by the device control system and/or the server, and the control menu is displayed on the output unit 200.
  • the fixed landmarks collected by the information collection unit 100 from the environment information can make recognition of the target image more accurate. For example, if the user points the device control system at an express cabinet (parcel locker), the control unit 300 analyzes the target image as an express cabinet and, from the fixed landmarks around the cabinet, locates its position; the device control system opens the software (app) corresponding to that express cabinet and, after the information collection unit 100 submits the identity authentication information, automatically sends the pickup code and opens the cabinet door.
  • as another example, the user directs the device control system to recognize a vehicle; on success, the device control system and/or the server displays a control menu on the output unit 200 for the user to select from. The control menu includes the recognized vehicle model information, and different search information is presented for different vehicles: if a bus is recognized, the route map corresponding to that bus is opened; if a car is recognized, the taxi software (app) is opened, and further image analysis of the vehicle's components, the vehicle route, or other related vehicle information can be performed.
  • Manner 4: the user uses the device control system to acquire a target image, where the target image is software information, such as a two-dimensional code and/or its text description.
  • the control unit 300 processes the two-dimensional code and/or the text description information and, according to the processed two-dimensional code and/or text description information, opens the corresponding software.
  • the control unit 300 can also activate the corresponding software by recognizing the Chinese characters therein.
  • a two-dimensional code merchant usually prints text description information around the two-dimensional code to describe it; the control unit 300 starts the corresponding software according to the processed text description information.
  • the information collection unit 100 simultaneously collects user identity feature information, and can directly verify the user identity feature information to enter the taxi software (app) without additional operation by the user.
  • the taxi software (app) can also be other (app) that requires identity information verification.
  • the user identity feature information can be authenticated with the device control system connection device, and if the verification is passed, the user is allowed to use the device, and if the verification fails, the user is not allowed to use the device.
  • FIG. 4 is a diagram of a device control method according to an embodiment of the present invention.
  • a device control method applied to a device control system includes the following steps:
  • Step 410 Collect information about the user in the current environment.
  • Step 420 Analyze the user's intention to use the device according to the information, and control the corresponding device according to the use intention to perform a corresponding action.
  • the device control method further includes: collecting user identity feature information and verifying whether it is consistent with the user identity feature information pre-stored in the database; if consistent, the device control system responds to user input instructions; if not, the device control system does not respond to user input instructions.
  • the step of analyzing the user's intention to use the device according to the information, and controlling the corresponding device according to the use intention to perform the corresponding action step further includes: connecting the device through the device control system and/or connecting the device through the server Identifying the user's intent feature to control the device and/or determining a spatial positional relationship between the user and the device to control the device; acquiring the target image to identify the target image through the device control system and/or the server, and opening the device control system Software corresponding to the target image. If the software requires authentication, the information collection unit collects user identity information for verification. If the software does not require authentication, the software is directly opened.
  • the user identity information is verified to be consistent with the pre-stored user identity information in the database. If they are consistent, the user is allowed to use the device control system; if not, the user is prohibited.
  • the information collection unit collects information about the user in the current environment and sends it to the control unit; the control unit analyzes the user's intention to use the device according to the information, and controls the corresponding device 400 to perform the corresponding action according to that intention.
  • the information collection unit 100 collects environmental information, analyzes the environmental information, and locates the user's location. The user connects to the device 400 through the device control system, and controls the device 400 through gestures and voices.
  • the device 400 is controlled by determining the positional relationship between the user and the device 400.
  • when the information collection unit 100 determines that the distance and direction between the user and the device 400 reach preset values, the device 400 is controlled to operate.
  • the information collection unit 100 acquires a target image and, by recognizing the target image, opens the software (app) corresponding to it; the corresponding software (app) includes software on the device control system or device software controlled through the server.
  • with the device control method and system provided by the present invention, the user connects to the device 400 through the device control system; the device control system acquires the positional relationship between the device 400 and the user together with the device 400 information, and recognizes the user's intent features to control the device 400, making interaction between the user and the device 400 more convenient and faster.
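The two-step method above (collect information, then analyze the intention and act) might be sketched as follows. This is an illustrative sketch only; the class and attribute names, the sensor dictionary, and the gesture labels are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Information collected about the user in the current environment."""
    user_position: tuple      # (x, y, z) in the system's coordinate frame
    intent_feature: str       # e.g. a recognized gesture label

class DeviceControlSystem:
    """Minimal sketch of the collect/analyze/act loop (illustrative only)."""

    def __init__(self, intent_to_command):
        # preset mapping from intent features to operation instructions
        self.intent_to_command = intent_to_command

    def collect(self, sensors) -> Observation:
        # Step 410: collect information about the user in the current environment
        return Observation(sensors["position"], sensors["gesture"])

    def analyze_and_act(self, obs: Observation):
        # Step 420: match the intent feature against the preset instructions
        command = self.intent_to_command.get(obs.intent_feature)
        if command is None:
            return "prompt: gesture not recognized"
        return command

system = DeviceControlSystem({"swipe_up": "tv.volume_up", "fist": "lamp.off"})
obs = system.collect({"position": (1.0, 2.0, 0.0), "gesture": "swipe_up"})
print(system.analyze_and_act(obs))  # tv.volume_up
```

An unrecognized gesture falls through to the prompt branch, mirroring the "prompt the user that the gesture was not recognized" behavior described above.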

Abstract

A device control method and system for controlling devices (400) in the Internet of Things and recognizing target images, comprising: an information collection unit (100) and a control unit (300) communicatively connected to the information collection unit (100). The information collection unit (100) collects information about the user in the current environment and sends the information to the control unit (300); the control unit (300) analyzes, from the information, the user's intention to use a device (400), and controls the corresponding device (400) to perform the corresponding action according to that intention. The user connects to devices (400) through the device control system, which acquires the positional relationship between the device (400) and the user together with the device information, and recognizes the user's intent features to control the device (400), making interaction between the user and the device (400) more convenient and faster.

Description

Device control method and system

Technical field

The present invention relates to the field of Internet of Things applications, and in particular to a device control method and system.

Background

Today, as science and technology develop rapidly, people's living habits are quietly changing. With the popularization of smart devices, the Internet of Things is becoming ever more widely known.

With the development of mobile Internet technology, more and more applications are installed on smart terminals, whose operating systems include the Android system developed by Google, Apple's iOS system, and Microsoft's Windows system. Using the applications of these systems, smart devices are controlled through apps on the smart terminal, but this kind of interaction is relatively poor and the interaction mode is monotonous.

Summary of the invention

On this basis, it is necessary to provide a device control method and system to address the poor interactivity and single interaction mode of controlling smart devices through applications on a smart terminal.
A device control system for controlling devices in the Internet of Things and recognizing target images comprises: an information collection unit and a control unit communicatively connected to the information collection unit. The information collection unit collects information about the user in the current environment and sends the information to the control unit; the control unit analyzes, from the information, the user's intention to use a device, and controls the corresponding device to perform the corresponding action according to that intention.

In one embodiment, the information collection unit comprises:

an environment collection module for collecting environment information and device information;

an identity authentication collection module for collecting user identity feature information;

a user analysis module for recognizing users in the environment and recognizing the user's intent features; and

a first screen module for outputting images.

The control unit is connected to the environment collection module, the identity authentication collection module, and the user analysis module. The control unit presets operation instructions corresponding to intent features, and is used to analyze the environment information collected by the environment collection module to locate the user's position, to compare the user's intent features with the preset intent features and send the corresponding operation instruction to the device, and to verify the user identity feature information to determine whether the user is authorized to control the device control system.

In one embodiment, the information collection unit further comprises:

a first wireless module for connecting to devices in the Internet of Things; and

a first screen module for outputting images.

In one embodiment, the identity authentication collection module is an infrared camera used to capture images of the user's finger veins and knuckle creases.

In one embodiment, the device control system further comprises an output unit communicatively connected to the information collection unit and the control unit, the output unit comprising:

an audio module for audio capture and audio output;

a second screen module for outputting images; and

an output control module connected to the audio module, the second screen module, and the second wireless module, for generating operation instructions to control the device.

In one embodiment, when the information collection unit and the output unit are connected, the image presented by the first screen module is displayed on the second screen module, and the first screen module displays no image.

In one embodiment, the device control system is connected to a server that processes the data of the device control system; through the server, the device control system controls devices connected to the server.

A device control method applied to a device control system comprises the following steps:

collecting information about the user in the current environment; and

analyzing, from the information, the user's intention to use a device, and controlling the corresponding device to perform the corresponding action according to that intention.

In one embodiment, the method further comprises:

collecting user identity feature information and verifying whether it is consistent with the user identity feature information preset in the control unit;

if consistent, the device control system responds to user input instructions;

if not, the device control system does not respond to user input instructions.

In one embodiment, the step in which the control unit analyzes the user's intention to use the device from the information, and controls the corresponding device to perform the corresponding action, further comprises:

connecting to the device through the device control system and/or through the server, and controlling the device by recognizing the user's intent features and/or by determining the spatial positional relationship between the user and the device; and

acquiring a target image, recognizing the target image through the device control system and/or the server, and opening the software in the device control system corresponding to the target image.

With the device control method and system provided by the present invention, the user connects to devices through the device control system; the device control system acquires the positional relationship between the device and the user together with the device information, and recognizes the user's intent features to control the device, making interaction between the user and the device more convenient and faster.
Brief description of the drawings

FIG. 1 is a schematic diagram of a device control system according to an embodiment of the present invention;

FIG. 2 is a schematic diagram of the information collection unit of FIG. 1;

FIG. 3 is a schematic diagram of the output unit of FIG. 1;

FIG. 4 is a flow diagram of a device control method according to an embodiment of the present invention.

Detailed description of the embodiments

To facilitate understanding of the present invention, the invention is described more fully below with reference to the accompanying drawings, in which preferred embodiments are shown. The invention may, however, be embodied in many different forms and is not limited to the embodiments described herein; rather, these embodiments are provided so that the disclosure of the invention will be understood more thoroughly and completely.

Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this invention belongs. The terminology used in the specification is for the purpose of describing particular embodiments only and is not intended to limit the invention.
As shown in FIG. 1, FIG. 1 is a schematic diagram of an Internet-of-Things-based device control system according to an embodiment of the present invention, for controlling devices 400 in the Internet of Things. A device 400 may be, but is not limited to, a vehicle, a household appliance, furniture, office supplies, electronic equipment, and the like. The device control system comprises: an information collection unit 100 and a control unit 300 communicatively connected to the information collection unit 100. The information collection unit 100 collects information about the user in the current environment and sends the information to the control unit 300; the control unit 300 analyzes, from the information, the user's intention to use a device, and controls the corresponding device 400 to perform the corresponding action according to that intention.

The device control system comprises the information collection unit 100, an output unit 200, and the control unit 300, which are communicatively connected. The device control system is connected to a server that processes the data of the device control system.

The device control system is communicatively connected to the device 400, can detect and recognize the device 400, and can control the device 400 according to predetermined trigger conditions. The device control system and the device 400 may be connected through an internal closed network or through the Internet.

The device control system can be connected to a smart terminal, and the smart terminal can control the devices 400 to which the device control system is connected. The smart terminal includes a mobile phone, a tablet, a computer, or another smart device. In one embodiment, the smart terminal is a mobile phone.

As shown in FIG. 2, FIG. 2 is a schematic diagram of the information collection unit 100 of FIG. 1. The information collection unit 100 includes an environment collection module 120, an identity authentication collection module 110, a user analysis module 130, a first wireless module 140, and a first screen module 150. The environment collection module 120 collects environment information and device 400 information; the identity authentication collection module 110 collects user identity feature information; the user analysis module 130 recognizes users in the environment and recognizes the user's intent features; the first screen module 150 outputs images; and the first wireless module 140 connects to devices 400 in the Internet of Things.

The control unit 300 is communicatively connected to each module in the information collection unit 100. The control unit 300 presets operation instructions corresponding to intent features, and is used to analyze the environment information collected by the environment collection module 120 to locate the user's position and direction, to compare the user's intent features with the preset intent features and send the corresponding operation instruction to the device 400, and to verify the user identity feature information to determine whether the user is authorized to control the device control system.
The identity authentication collection module 110 sends the collected user identity feature information to the control unit 300. User identity feature information is preset in the control unit 300; the user identity feature information collected by the identity authentication collection module 110 is compared with the information preset in the control module, and if they are inconsistent, the system does not respond to user input instructions; if they are consistent, the system responds to user input instructions. In one embodiment, the identity authentication collection module 110 is an infrared camera used to capture images of the user's finger veins and knuckle creases. It will be appreciated that the identity authentication collection module 110 may also be another biometric recognition device such as an iris recognizer, a fingerprint recognizer, or a face recognizer.
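The verification gate described above (only respond to input when the collected identity feature matches one preset in the control unit) could be sketched as below. All names are hypothetical, and a real system would compare biometric templates with a similarity measure rather than exact string matching.

```python
class ControlUnit:
    """Sketch of the identity-verification gate; names are illustrative."""

    def __init__(self, enrolled_features):
        # user identity feature information preset in the control unit
        self.enrolled = set(enrolled_features)

    def authorize(self, collected_feature) -> bool:
        # consistent with a preset feature -> the user may control the system
        return collected_feature in self.enrolled

    def handle_input(self, collected_feature, instruction):
        if not self.authorize(collected_feature):
            return None  # the system does not respond to the input instruction
        return f"executing: {instruction}"

unit = ControlUnit({"finger_vein_template_A"})
print(unit.handle_input("finger_vein_template_A", "open door"))  # executing: open door
print(unit.handle_input("unknown_template", "open door"))        # None
```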
The control unit 300 is provided with a vision processor. The control unit 300 receives the device 400 information and environment information collected by the environment collection module 120, analyzes them through the vision processor, and identifies fixed landmarks in the environment information to locate the user's position. The fixed landmarks include household appliances, furniture, street signs, traffic lights, trees, buildings, or other stationary objects.

In one embodiment, the control unit 300 builds a spatial rectangular coordinate system from the environment information and device 400 information collected by the environment collection module 120; the fixed landmarks in the device 400 information and the environment information are modeled in this coordinate system and assigned corresponding parameters so as to locate the user's position. The user has a six-degree-of-freedom pose in the spatial rectangular coordinate system; the control unit 300 analyzes the environment information, locates the relationship between the user's six-degree-of-freedom pose and the fixed landmarks, and determines the positional relationship between the device 400 and the user. The control unit 300 controls the device 400 to act according to the positional relationship between the device 400 and the user.
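The positional-relationship trigger described above (act when the user's pose comes within a preset distance of a device) might look like the following minimal sketch. The threshold value, the open/close command names, and the helper functions are assumptions for illustration; the patent does not specify them.

```python
import math

def distance(p, q):
    """Euclidean distance between two points in the spatial coordinate system."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def proximity_command(user_pose, device_pos, threshold, on_cmd="open", off_cmd="close"):
    """Issue a command from the user/device positional relationship.

    user_pose: (x, y, z, roll, pitch, yaw) -- a six-degree-of-freedom pose;
    only the translation part is used for the distance check in this sketch.
    """
    user_pos = user_pose[:3]
    return on_cmd if distance(user_pos, device_pos) <= threshold else off_cmd

# user approaches a smart toilet at the origin; 1.5 m trigger radius (assumed)
print(proximity_command((1.0, 0.5, 0.0, 0, 0, 0), (0.0, 0.0, 0.0), 1.5))  # open
print(proximity_command((4.0, 3.0, 0.0, 0, 0, 0), (0.0, 0.0, 0.0), 1.5))  # close
```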
The user analysis module 130 sends the user's intent features to the control unit 300, in which operation commands corresponding to intent features are preset. In one embodiment, the intent features are the spatial pose information of the user's gestures and wrist and/or the focal point information of the user's eyes. The control unit analyzes the user's gesture and wrist pose information and/or eye focus information and compares it with the gesture and wrist pose information preset in the control unit 300. If they match, the corresponding operation command is sent to the device 400; if not, no operation command is sent and the user is prompted that the gesture was not recognized. It can be understood that the intent features may also be the user's eye focus information, body movements, voice commands, or other expressions of user intent, as long as the control unit 300 can match the recognized intent feature to a preset operation command. It can also be understood that the prompt may be a vibration prompt, a voice prompt, a light prompt, a text prompt, an image prompt, or another form of prompt. In one embodiment, the user analysis module 130 can designate a local region of the information acquired by the environment acquisition module 120 to submit to the control unit for processing, reducing the pixel computation required for image recognition and discarding useless image data, so that the device control system recognizes the device 400 faster and more accurately.
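The gesture-matching logic above amounts to a lookup from recognized intent features to preset operation commands, with a user prompt when nothing matches. A minimal sketch, with made-up gesture labels and command names:

```python
# Hypothetical preset table: (gesture, wrist pose) -> operation command.
PRESET_COMMANDS = {
    ("swipe_up", "palm_forward"): "volume_up",
    ("swipe_down", "palm_forward"): "volume_down",
}

def match_intent(gesture, wrist_pose):
    """Return (command, prompt): the preset operation command when the
    recognized intent features match a preset entry, otherwise no
    command plus a prompt for the user."""
    command = PRESET_COMMANDS.get((gesture, wrist_pose))
    if command is None:
        # The prompt could equally be vibration, light, text, or image.
        return None, "gesture not recognised"
    return command, None
```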
As shown in FIG. 3, FIG. 3 is a schematic diagram of the output unit 200 of FIG. 1. The output unit 200 includes: an audio module 210 for audio acquisition and output; a second screen module 230 for outputting images; a second wireless module 220 for connecting to devices 400 in the Internet of Things; and an output control module 240 connected to the second screen module 230 and the audio module 210, for generating operation commands to control the device 400.
The output unit 200 further includes a second environment acquisition module 250, which sends the acquired supplementary environment information to the control unit 300; the control unit 300 compares the supplementary environment information with the environment information acquired by the information acquisition unit, allowing the user's position to be located more precisely.
When the information acquisition unit 100 and the output unit 200 are connected, the image presented by the first screen module 150 is projected onto the second screen module 230, and the first screen module 150 displays no image.
The audio module 210 can also recognize the user's voice and send it to the output control module 240, in which operation commands corresponding to voice input are preset. The user's voice is analyzed and compared with the voice commands preset in the control unit 300; if the user's identity feature information matches the identity feature information preset in the control unit 300, the device control system responds to the user's input command, and if not, the system does not respond and prompts the user that the voice was not recognized. It can be understood that the prompt may be a vibration prompt, a voice prompt, a light prompt, a text prompt, an image prompt, or another form of prompt. The audio module 210 and the user analysis module 130 of the information acquisition unit 100 can be used in combination to achieve the fastest input result.
The following are several control modes of the device control system:
Mode 1: when the user wearing the device control system approaches a device 400, the image information obtained by the environment acquisition module 120 is analyzed by the control unit 300 to obtain the user's six-degree-of-freedom pose data in the spatial coordinate system, i.e., the user's positioning data, from which the distance, direction, and other data of the user relative to the controlled device 400 are obtained, and commands are issued automatically to control the device 400. For example, the device control system is connected to a smart toilet: when the user approaches the smart toilet, the device control system sends an open command to it and the toilet lid opens; when the user finishes and leaves, the device control system sends a close command, the lid closes, and the toilet flushes. As another example, the device control system is connected to a smart desk lamp: when the user sits at the desk, the device control system sends an on command and the lamp turns on; when the user leaves the desk, the device control system sends an off command and the lamp turns off.
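Mode 1 reduces to a threshold rule on the user's located distance and direction of motion relative to the device. The sketch below is an assumed simplification: a single trigger radius stands in for the preset distance-and-direction values, and the command strings are placeholders.

```python
def proximity_command(distance_m, approaching, trigger_radius=1.0):
    """Decide the automatic command from the user's positioning data:
    open when an approaching user comes within the preset radius,
    close when a departing user has left it, otherwise do nothing."""
    if approaching and distance_m <= trigger_radius:
        return "open"   # e.g. lift the smart-toilet lid, switch the lamp on
    if not approaching and distance_m > trigger_radius:
        return "close"  # e.g. close the lid and flush, switch the lamp off
    return None
```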
Mode 2: the user determines the device 400 to be controlled through image recognition; the device control system connects to the determined device 400 and controls it. For example, the user indicates with a specific gesture that the smart TV is to be controlled; the device control system controls the smart TV and displays the TV's control menu on the output unit 200, from which the user can choose by corresponding gestures and/or by touch. As another example, the device control system builds a virtual scene from previously acquired environment information, or obtains real-time environment information through a proxy camera; the output unit 200 displays a virtual scene of another room, the user points at a device 400, the device control system performs image recognition on the device 400 and connects to the determined device 400, the output unit 200 shows the image of the selected device 400 in the virtual scene, the user controls the device 400 with specific gestures, and the image of the device 400 in the virtual scene acts according to the operation command corresponding to the user's gesture.
Mode 3: the user directs the device control system to acquire a target image; the target image is recognized by the device control system and/or the server, and a control menu is displayed on the output unit 200. During recognition, fixed landmarks in the environment information acquired by the information acquisition unit 100 can make recognition of the target image more accurate. For example, the user points the device control system at a parcel locker: the control unit 300 determines that the target image is a parcel locker and, using the fixed landmarks around it, locates that particular locker; the device control system opens the app corresponding to the locker, and after the information acquisition unit 100 submits the identity authentication information, the pickup code is sent automatically and the locker door opens. As another example, the user directs the device control system to recognize a vehicle; upon successful recognition by the device control system and/or the server, a control menu is displayed on the output unit 200, including the recognized vehicle model information and different search information for different vehicles. For instance, if the vehicle is recognized as a bus, the route map of that bus is opened; if it is recognized as a car, a ride-hailing app is opened; the vehicle's parts can also be analyzed further by image, and vehicle routes or other related vehicle information can be recognized.
Mode 4: the user directs the device control system to acquire a target image, where the target image is software information, for example a QR code and/or a text description. The control unit 300 processes the QR code and/or the text description and opens the corresponding software according to the processed result. An Alipay payment QR code contains the Chinese character "支", and a WeChat payment QR code contains the character "微"; the control unit 300 recognizes these characters and opens the corresponding software. As another example, merchants usually print a text description around a QR code to explain it; the control unit 300 opens the corresponding software according to the processed text description.
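The character cue in Mode 4 (the "支" in an Alipay code, the "微" in a WeChat Pay code) is essentially a marker-to-app lookup over the text recognized inside or around the code. A sketch; the app identifiers here are placeholders, not real package names:

```python
# Hypothetical mapping from a recognised marker character to the software to open.
TEXT_TO_APP = {
    "支": "alipay",      # character appearing in Alipay payment codes
    "微": "wechat_pay",  # character appearing in WeChat payment codes
}

def app_for_recognised_text(text):
    """Pick the software to open from text recognised inside or around
    a QR code; None when no known marker is present."""
    for marker, app in TEXT_TO_APP.items():
        if marker in text:
            return app
    return None
```

The same lookup extends naturally to the merchant-printed descriptions around a code, with longer phrases as keys.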
In using the above four modes, the information acquisition unit 100 simultaneously acquires the user's identity feature information, so the user's identity can be verified directly to enter the ride-hailing app without any extra operation by the user. It can be understood that the ride-hailing app may also be any other app requiring identity verification. It can also be understood that the user's identity feature information can be used for identity verification with the device connected to the device control system: if verification succeeds, the user is allowed to use the device; if verification fails, the user is not.
As shown in FIG. 4, FIG. 4 is a flow diagram of a device control method according to an embodiment of the present invention.
A device control method, applied to the device control system described above, includes the following steps:
Step 410: acquiring information about the user in the current environment;
Step 420: analyzing, from the information, the user's intent to use the device, and controlling the corresponding device to perform the corresponding action according to that intent.
The device control method further includes: acquiring the user's identity feature information and verifying whether it matches the user identity feature information prestored in a database; if it matches, the device control system responds to the user's input commands; if not, the device control system does not respond to the user's input commands.
The step of analyzing the user's intent to use the device from the information and controlling the corresponding device accordingly further includes: connecting to the device through the device control system and/or through the server, and controlling the device by recognizing the user's intent features and/or by determining the spatial position relationship between the user and the device; acquiring a target image, recognizing the target image through the device control system and/or the server, and opening the software in the device control system corresponding to the target image. If the software requires identity verification, the user's identity feature information is acquired by the information acquisition unit for verification; if the software does not require identity verification, the software is opened directly.
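Putting steps 410 and 420 together with the identity check and the target-image branch, the claimed method can be sketched as one decision flow. Everything below is an illustrative reduction; the flags and return values are assumptions, not interfaces defined by the patent.

```python
def control_flow(identity_ok, intent_command, target_app, app_needs_auth):
    """Sketch of the method: refuse all input unless identity is
    verified; then act on the recognised intent and/or open the app
    matched to the target image (re-verifying when the app needs it)."""
    if not identity_ok:
        return []  # system does not respond to the user's input commands
    actions = []
    if intent_command is not None:
        actions.append(("device", intent_command))
    if target_app is not None:
        suffix = ":verified" if app_needs_auth else ""
        actions.append(("app", target_app + suffix))
    return actions
```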
In the above device control method, when the user uses the device control system, the user's identity feature information is verified against the user identity feature information prestored in the database: if they match, the user is allowed to use the device control system; if not, the user is prohibited from controlling the device 400. The information acquisition unit acquires information about the user in the current environment and sends the information to the control unit, which analyzes the user's intent to use the device and controls the corresponding device to perform the corresponding action according to that intent. The information acquisition unit 100 acquires environment information and analyzes it to locate the user; the user connects to the device 400 through the device control system and controls the device 400 by gestures and voice. The device 400 can also be controlled by determining the positional relationship between the user and the device 400: when the distance and direction between the located user and the device 400 reach preset values, the device 400 is controlled to act. The information acquisition unit 100 acquires a target image and, by recognizing it, opens the app corresponding to the target image; the corresponding app includes software on the device control system or software on the server that controls the device.
In the device control method and system provided by the present invention, the user connects to the device 400 through the device control system; the device control system acquires the positional relationship between the device 400 and the user together with the device 400 information, and recognizes the user's intent features to control the device 400, making interaction between the user and the device 400 more convenient and efficient.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that those of ordinary skill in the art may make several variations and improvements without departing from the concept of the present invention, all of which fall within the scope of protection of the present invention. Therefore, the scope of protection of this patent shall be subject to the appended claims.

Claims (10)

  1. A device control system for controlling devices in the Internet of Things and recognizing target images, comprising: an information acquisition unit and a control unit communicatively connected to the information acquisition unit, wherein the information acquisition unit acquires information about a user in the current environment and sends the information to the control unit, and the control unit analyzes, from the information, the user's intent to use the device and controls the corresponding device to perform a corresponding action according to the intent.
  2. The device control system according to claim 1, wherein the information acquisition unit comprises:
    an environment acquisition module, for acquiring environment information and device information;
    an identity authentication acquisition module, for acquiring the user's identity feature information;
    a user analysis module, for identifying the user in the environment and recognizing the user's intent features;
    a first screen module, for outputting images;
    the control unit being connected to the environment acquisition module, the identity authentication acquisition module, and the user analysis module, wherein the control unit stores preset operation commands corresponding to intent features and is used to analyze the environment information acquired by the environment acquisition module to locate the user's position, to compare the user's intent features with the preset intent features and send the corresponding operation command to the device, and to verify the user's identity feature information to determine whether the user is authorized to control the device control system.
  3. The device control system according to claim 2, wherein the information acquisition unit further comprises:
    a first wireless module, for connecting to devices in the Internet of Things; and
    a first screen module, for outputting images.
  4. The device control system according to claim 2, wherein the identity authentication acquisition module is an infrared camera, and the infrared camera is used to capture images of the user's finger veins and knuckle creases.
  5. The device control system according to any one of claims 1 to 4, further comprising an output unit communicatively connected to the information acquisition unit and the control unit, the output unit comprising:
    an audio module, for audio acquisition and audio output;
    a second screen module, for outputting images; and
    an output control module connected to the audio module, the second screen module, and the second wireless module, for generating operation commands to control the device.
  6. The device control system according to claim 5, wherein when the information acquisition unit and the output unit are connected, the image presented by the first screen module is displayed on the second screen module, and the first screen module displays no image.
  7. The device control system according to claim 1, wherein the device control system is connected to a server, the server processes the data of the device control system, and the device control system controls, through the server, the devices connected to the server.
  8. A device control method, applied to the device control system according to any one of claims 1 to 7, comprising the following steps:
    acquiring information about a user in the current environment;
    analyzing, from the information, the user's intent to use the device, and controlling the corresponding device to perform a corresponding action according to the intent.
  9. The device control method according to claim 8, further comprising:
    acquiring the user's identity feature information, and verifying whether the user's identity feature information matches the user identity feature information preset in the control unit;
    if it matches, the device control system responds to the user's input commands;
    if it does not match, the device control system does not respond to the user's input commands.
  10. The device control method according to claim 8, wherein the step of the control unit analyzing the user's intent to use the device from the information and controlling the corresponding device to perform a corresponding action according to the intent further comprises:
    connecting to the device through the device control system and/or through the server, and controlling the device by recognizing the user's intent features and/or by determining the spatial position relationship between the user and the device;
    acquiring a target image, recognizing the target image through the device control system and/or the server, and opening the software in the device control system corresponding to the target image.
PCT/CN2018/086104 2018-05-09 2018-05-09 Device control method and system WO2019213855A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/086104 WO2019213855A1 (zh) 2018-05-09 2018-05-09 Device control method and system
US17/053,803 US20210224368A1 (en) 2018-05-09 2018-05-09 Device control method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/086104 WO2019213855A1 (zh) 2018-05-09 2018-05-09 Device control method and system

Publications (1)

Publication Number Publication Date
WO2019213855A1 true WO2019213855A1 (zh) 2019-11-14

Family

ID=68467210

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/086104 WO2019213855A1 (zh) 2018-05-09 2018-05-09 Device control method and system

Country Status (2)

Country Link
US (1) US20210224368A1 (zh)
WO (1) WO2019213855A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1924052A2 (en) * 2003-10-30 2008-05-21 Frontera Azul Systems, S.L. System and procedure for communication based on virtual reality
CN104410883A (zh) * 2014-11-29 2015-03-11 South China University of Technology Mobile wearable contactless interaction system and method
CN105045140A (zh) * 2015-05-26 2015-11-11 Shenzhen Skyworth-RGB Electronic Co., Ltd. Method and device for intelligently controlling a controlled device
CN105867626A (zh) * 2016-04-12 2016-08-17 BOE Technology Group Co., Ltd. Head-mounted virtual reality device, control method thereof, and virtual reality system
CN105872685A (zh) * 2016-03-24 2016-08-17 Shenzhen Guohua Identification Technology Development Co., Ltd. Smart terminal control method and system, and smart terminal

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9909777B2 (en) * 2015-08-26 2018-03-06 Google Llc Thermostat with multiple sensing systems including presence detection systems integrated therein
TWI599966B (zh) * 2016-05-10 2017-09-21 H P B Optoelectronic Co Ltd Gesture control modular system
JP6685397B2 (ja) * 2016-07-12 2020-04-22 Mitsubishi Electric Corporation Device control system
GB201613138D0 (en) * 2016-07-29 2016-09-14 Unifai Holdings Ltd Computer vision systems
CN106453568B (zh) * 2016-10-18 2019-07-02 Beijing Xiaomi Mobile Software Co., Ltd. Operation execution method, apparatus, and system
US10402647B2 (en) * 2017-05-17 2019-09-03 Microsoft Technology Licensing, Llc Adapted user interface for surfacing contextual analysis of content
US11551195B2 (en) * 2017-07-18 2023-01-10 Tata Consultancy Services Limited Systems and methods for providing services to smart devices connected in an IoT platform
US10909979B1 (en) * 2017-11-21 2021-02-02 Ewig Industries Macao Commercial Offshore Limited Voice controlled remote thermometer
US10672243B2 (en) * 2018-04-03 2020-06-02 Chengfu Yu Smart tracker IP camera device and method
WO2020222539A1 (en) * 2019-05-02 2020-11-05 Samsung Electronics Co., Ltd. Hub device, multi-device system including the hub device and plurality of devices, and method of operating the same


Also Published As

Publication number Publication date
US20210224368A1 (en) 2021-07-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18917611

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18917611

Country of ref document: EP

Kind code of ref document: A1