WO2023000808A1 - Method and apparatus for controlling a smart home appliance, and smart glasses - Google Patents

Method and apparatus for controlling a smart home appliance, and smart glasses Download PDF

Info

Publication number
WO2023000808A1
WO2023000808A1 PCT/CN2022/094941 CN2022094941W WO2023000808A1 WO 2023000808 A1 WO2023000808 A1 WO 2023000808A1 CN 2022094941 W CN2022094941 W CN 2022094941W WO 2023000808 A1 WO2023000808 A1 WO 2023000808A1
Authority
WO
WIPO (PCT)
Prior art keywords
smart home
home appliance
user
iris
smart
Prior art date
Application number
PCT/CN2022/094941
Other languages
English (en)
French (fr)
Inventor
杨智程
耿宝寒
孙晓
李明光
Original Assignee
青岛海尔空调器有限总公司
青岛海尔空调电子有限公司
海尔智家股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛海尔空调器有限总公司, 青岛海尔空调电子有限公司, 海尔智家股份有限公司 filed Critical 青岛海尔空调器有限总公司
Publication of WO2023000808A1 publication Critical patent/WO2023000808A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00Non-optical adjuncts; Attachment thereof
    • G02C11/10Electronic devices other than hearing aids
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Definitions

  • the present application relates to the technical field of smart devices, for example, to a method, device and smart glasses for controlling smart home appliances.
  • the screen systems in smart home appliances mostly use touch control to operate modes, the in-screen store, brightness, sound, and so on.
  • the touch operation has the following disadvantages: fingers may falsely trigger the screen; touch may fail when the fingers are wet; and when the screen of the smart home appliance is mounted high, user operation is inconvenient.
  • An existing technology projects infrared rays toward the eyeball when the eyeball recognition operation is activated, tracks the eyeball movement after the infrared projection, collects an iris image of the eyeball, and processes the collected iris image to obtain eyeball movement information;
  • according to control instructions preset for the eyeball movement information, the corresponding control instruction is sent to the control system of the terminal device to control the terminal device.
  • Embodiments of the present disclosure provide a method, device, and smart glasses for controlling smart home appliances, which can realize remote and precise control of smart home appliances.
  • the method includes:
  • when the iris control mode is entered, the operation interface image of the smart home appliance is projected onto the lens, and the gaze point of the user's eyes is determined;
  • if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds a dwell duration threshold, the control command corresponding to the control button is sent to the smart home appliance.
  • the device includes: a processor and a memory storing program instructions, and the processor is configured to execute the aforementioned method for controlling a smart home appliance when executing the program instructions.
  • the smart glasses include:
  • the frame body is provided with lenses
  • the projection part is arranged on the frame body and is configured to project an image to the lens under control;
  • the iris information acquiring unit is arranged on the lens and is configured to collect iris information
  • the communication part is arranged on the frame body and is configured to perform wireless communication with smart home appliances;
  • the aforementioned device for controlling smart home appliances is arranged on the frame body and is electrically connected to the projection unit, the iris information acquisition unit and the communication unit respectively.
  • the method, device, and smart glasses for controlling smart home appliances provided by the embodiments of the present disclosure can achieve the following technical effects:
  • the smart glasses and the smart home appliance are linked for control: the operation interface image of the smart home appliance is processed and projected onto the smart glasses, and then the gaze point of the user's eyes is captured and its positional relationship with the projected image is judged, which determines the subsequent control of the smart home appliance.
  • the embodiment of the present application exploits the fact that a projected image can be imaged clearly and projects the operation interface image of the distant smart home appliance onto the nearby smart glasses, which greatly shortens the human-computer interaction distance; on the one hand, this improves the recognition accuracy of user control commands by the smart home appliance, and on the other hand, it makes the user's control of the smart home appliance more flexible and convenient, finally realizing long-distance and precise control of the smart home appliance by the user.
  • FIG. 1 is a schematic diagram of smart glasses provided by an embodiment of the present disclosure
  • Fig. 2 is a schematic diagram of a method for controlling smart home appliances provided by an embodiment of the present disclosure
  • Fig. 3 is a schematic diagram of another method for controlling smart home appliances provided by an embodiment of the present disclosure.
  • Fig. 4 is a schematic diagram of another method for controlling smart home appliances provided by an embodiment of the present disclosure.
  • Fig. 5 is a schematic diagram of another method for controlling smart home appliances provided by an embodiment of the present disclosure.
  • Fig. 6 is a schematic diagram of another method for controlling smart home appliances provided by an embodiment of the present disclosure.
  • Fig. 7 is a schematic diagram of an apparatus for controlling a smart home appliance provided by an embodiment of the present disclosure.
  • A/B means: A or B.
  • A and/or B means three possible relationships: A alone, B alone, or both A and B.
  • correspondence may refer to an association relationship or a binding relationship, and the correspondence between A and B means that there is an association relationship or a binding relationship between A and B.
  • smart home appliances refer to home appliances formed by introducing microprocessors, sensor technology, and network communication technology into home appliances. They have the characteristics of intelligent control, intelligent perception, and intelligent application, and their operation often relies on modern technologies such as the Internet of Things, the Internet, and electronic chips. For example, a smart home appliance can be connected to an electronic device so that the user can remotely control and manage it.
  • a terminal device refers to an electronic device with a wireless connection function.
  • the terminal device can communicate with the above smart home appliance by connecting to the Internet, or directly communicate with the above smart home appliance through Bluetooth, wifi, etc. .
  • the terminal device is, for example, a mobile device, a computer, or a vehicle-mounted device built into a hover vehicle, or any combination thereof.
  • the mobile device may include, for example, a mobile phone, a smart home device, a wearable device, a smart mobile device, a virtual reality device, etc., or any combination thereof, wherein the wearable device includes, for example, a smart watch, a smart bracelet, a pedometer, and the like.
  • the screen systems in smart home appliances mostly use touch control to operate modes, the in-screen store, brightness, sound, and so on.
  • the touch operation has the following disadvantages: fingers may falsely trigger the screen; touch may fail when the fingers are wet; and when the screen of the smart home appliance is mounted high, user operation is inconvenient.
  • An existing technology projects infrared rays toward the eyeball when the eyeball recognition operation is activated, tracks the eyeball movement after the infrared projection, collects an iris image of the eyeball, and processes the collected iris image to obtain eyeball movement information;
  • according to control instructions preset for the eyeball movement information, the corresponding control instruction is sent to the control system of the terminal device to control the terminal device.
  • however, with this method, when the distance is too great, both the quality of the collected iris image and the accuracy of the eyeball movements are affected.
  • an embodiment of the present disclosure provides smart glasses, including a frame body 11 , a projection unit 12 , an iris information acquisition unit 13 , a communication unit 14 and a control device (not shown in the figure).
  • the frame body 11 is provided with lenses 111 .
  • the projection unit 12 is disposed on the frame body 11 and is configured to project an image to the lens 111 under control.
  • the iris information acquisition unit 13 is disposed on the lens 111 and configured to acquire iris information.
  • the communication unit 14 is disposed on the frame body 11 and is configured to perform wireless communication with smart home appliances.
  • the control device is arranged on the frame body 11 and is electrically connected to the projection unit 12 , the iris information acquisition unit 13 and the communication unit 14 respectively.
  • the projection unit 12 can project the processed smart home appliance operation interface image onto the lens 111 to form a projected image
  • the iris information acquisition unit 13 acquires the user's iris information to determine the position of the user's gaze point
  • the control device determines whether the communication unit 14 sends a relevant control command to the smart home appliance by judging the positional relationship between the gaze point of the user and the projected image.
  • the remote smart home appliance operation interface image is projected onto the short-distance smart glasses, which greatly shortens the human-computer interaction distance and improves the imaging quality.
  • on the one hand, this can improve the recognition accuracy of user control commands by the smart home appliance; on the other hand, it also makes the user's control of the smart home appliance more flexible and convenient, finally realizing long-distance and precise control of the smart home appliance by the user.
  • the lens 111 is an electronic display lens.
  • the electronic display lens is an Organic Light-Emitting Diode (OLED for short) microdisplay.
  • the OLED microdisplay is thin and highly bendable, suitable for small devices such as smart glasses, and has a fast response speed, which is conducive to the subsequent control of smart home appliances.
  • the projection part 12 is a miniature LED projection lamp.
  • to improve the imaging quality of the projected image, a plurality of micro LED projection lamps can be arranged on the frame body 11 .
  • the iris information acquisition unit 13 includes a base plate, micro infrared LEDs and a micro optical sensing element
  • the base plate is arranged on the lens 111 and is made of a transparent, high-transmittance optical material.
  • the micro infrared LEDs are arranged on the base plate and are configured to emit infrared beams toward the user's eyeballs; to ensure the accuracy of the determined gaze point position, multiple micro infrared LEDs can be arranged at different positions on the base plate to emit infrared beams from different angles.
  • the miniature optical sensing element is configured to collect the image of the user's eyeball, and the image includes the user's iris reflection information and the user's iris imaging information under the irradiation of the infrared beam.
  • the communication unit 14 may be a Bluetooth module or a WiFi module configured to communicate wirelessly with smart home appliances.
  • the communication unit 14 may be an Internet communication module configured to communicate with smart home appliances over the Internet.
  • an embodiment of the present disclosure provides a method for controlling a smart home appliance, including:
  • the projection unit projects the operation interface image of the smart home appliance onto the lens.
  • the smart glasses determine the gaze point of the user's eyes.
  • if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
  • the smart glasses and the smart home appliance are linked for control: the operation interface image of the smart home appliance is processed and projected onto the smart glasses, and then the gaze point of the user's eyes is captured and its positional relationship with the projected image is judged to determine the subsequent control of the smart home appliance.
  • the embodiment of the present application exploits the fact that a projected image can be imaged clearly and projects the operation interface image of the distant smart home appliance onto the nearby smart glasses, which greatly shortens the human-computer interaction distance; on the one hand, this improves the recognition accuracy of user control commands by the smart home appliance, and on the other hand, it also makes the user's control of the smart home appliance more flexible and convenient, finally realizing long-distance and precise control of the smart home appliance by the user.
  • projecting the operation interface image of the smart home appliance onto the lens by the projection unit includes: the iris information acquisition unit acquires the iris imaging information of the user; the smart glasses analyze and extract the operation interface image of the smart home appliance from the user's iris imaging information; and the projection unit magnifies the operation interface image at a preset ratio and projects it onto the lens.
  • the smart glasses can obtain the image of the operation interface of the smart home appliance even when it is offline, and it is not susceptible to signal interference, has strong reliability and confidentiality, and also ensures real-time control of the smart home appliance.
  • the preset ratio can be adjusted according to the actual needs of users.
  • the preset ratio can be set to 50:1, that is, the projection unit can magnify the operation interface image of the smart home appliance extracted from the user's iris imaging by 50 times and project it onto the lens.
  • the preset ratio of 50:1 can be adjusted according to the specific situation, and can also be set to any other value such as 10:1 or 100:1.
  • projecting the operation interface image of the smart home appliance onto the lens by the projection unit includes: the communication unit receives the operation interface image sent by the smart home appliance; and the projection unit projects the operation interface image onto the lens.
  • smart glasses can be linked with smart home appliances for linkage control, and the operation interface images of smart home appliances can be sent to smart glasses in real time, processed and then projected onto the lenses.
  • the embodiments of the present disclosure not only have higher imaging quality and faster response speed, but also do not rely on user iris imaging information, and are suitable for scenes where images of smart home appliance operation interfaces are obtained from a relatively long distance, which is beneficial to long-distance precise control of smart home appliances.
  • the smart glasses determining the gaze point of the user's eyes includes: the iris information acquisition unit acquires iris reflection information of the user under the irradiation of the infrared beam; the smart glasses determine the gaze point of the user's eyes according to the iris reflection information.
  • the embodiment of the present disclosure can capture the gaze point of the user's eyes in real time, infer the user's operation intention, and determine the actual control of the smart home appliance by identifying its positional relationship with the projected image, which is beneficial to the precise control of the smart home appliance.
  • the residence time threshold can be adjusted according to actual needs.
  • the dwell time threshold can be preset as 3s, that is, when the gaze point of the eyeball stays at the position of the control button in the projected image for more than 3s, the smart glasses immediately send the control command corresponding to the control button to the smart home appliance.
  • the duration of 3s can be adjusted according to the specific situation, and can also be set to other arbitrary values such as 1s or 5s.
  • the projected image includes: a status display area and control buttons.
  • the state display area is configured to display the current state of the smart home appliance, and the current state includes actual values of parameters of the smart home appliance such as brightness, power, and mode.
  • the control buttons correspond to buttons or icons on the operation interface of the smart home appliance, and are configured to control the smart home appliance to perform corresponding actions.
  • the projected image may be an augmented reality (Augmented Reality, AR for short) image or a virtual reality (Virtual Reality, VR for short) image, which is specifically selected according to the usage of the smart glasses.
  • the smart glasses analyze and extract the operation interface image of the smart home appliance from the user's iris imaging information and then project it.
  • the projected image can be an AR image.
  • the smart glasses receive the operation interface image sent by the smart home appliance and then project it.
  • the projected image can be a VR image.
  • in addition to the iris control mode, the smart glasses also provide a daily mode, a call mode, a navigation mode and a music mode.
  • when entering the daily mode, the smart glasses can perform myopia correction according to the different myopia degrees of different users by adjusting the lens thickness, so as to ensure that users with different degrees of myopia can use the smart glasses effectively.
  • the smart glasses in the case of entering the call mode, can realize voice calls with relatives and friends by invoking the address book information and using the built-in miniature microphone and speaker.
  • when entering the navigation mode, the smart glasses can turn on the GPS system and project the planned route to the user's destination onto the lens to complete route navigation.
  • when entering the music mode, the smart glasses can play music through bone conduction via the module of the frame body located at the portion behind the ear.
  • an embodiment of the present disclosure provides another method for controlling smart home appliances, including:
  • the projection unit projects the operation interface image of the smart home appliance onto the lens.
  • the smart glasses determine the gaze point of the user's eyes.
  • if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
  • if no control command has been sent to the smart home appliance within the preset refresh duration, the projection unit projects the operation interface image of the smart home appliance onto the lens again.
  • in this way, when unclear imaging of the projected image makes accurate control difficult for the user, the smart glasses can automatically refresh the projected image, and the refreshed high-quality projected image ensures the recognition accuracy of the user's control commands, which is beneficial to the user's long-distance precise control of the smart home appliance.
  • the preset refresh duration is [20s, 30s].
  • the preset refresh duration can be adjusted according to the actual needs of the user. Specifically, the preset refresh duration can be 20s, 25s or 30s.
  • an embodiment of the present disclosure provides another method for controlling smart home appliances, including:
  • the projection unit projects the operation interface image of the smart home appliance onto the lens.
  • the smart glasses determine the gaze point of the user's eyes.
  • if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
  • if the gaze point of the user's eyes is located outside the projected image and the continuous deviation duration exceeds the deviation duration threshold, the smart glasses exit the iris control mode.
  • the user does not need additional control operations, and can exit the iris control mode only by moving the gaze point of the eyes, which is very convenient and fast.
  • the embodiments of the present disclosure can avoid mistakenly exiting the iris control mode when the gaze point of the user's eyes deviates from the projected image for a short time, thereby ensuring the accuracy of the control.
  • the deviation duration threshold can be adjusted according to actual needs.
  • the deviation duration threshold can be preset as 5s, that is, when the gaze point of the eyes remains outside the projected image for more than 5s, the smart glasses automatically exit the iris control mode.
  • the duration of 5s can be adjusted according to the specific situation, and can also be set to any other value such as 3s or 10s.
  • an embodiment of the present disclosure provides another method for controlling a smart home appliance, including:
  • the iris information acquiring unit acquires iris information of the user.
  • the smart glasses perform feature matching on the acquired iris information and one or more pre-stored iris information.
  • if the matching succeeds, the smart glasses enable the smart home appliance to be turned on.
  • the projection unit projects the operation interface image of the smart home appliance onto the lens.
  • the smart glasses determine the gaze point of the user's eyes.
  • if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
  • the smart glasses collect the iris information of the current user and perform feature matching against the pre-stored iris information; when pre-stored iris information with consistent features is matched, the smart home appliance is turned on.
  • the embodiments of the present disclosure introduce the iris recognition technology into the control of smart home appliances, and use the uniqueness of iris features to effectively identify whether the current user is a legitimate user, greatly improving the reliability and security of the linkage control between smart glasses and smart home appliances.
  • enabling the smart home appliance to be turned on by the smart glasses includes: if the pre-stored iris information consistent with the obtained iris information is matched, then the smart glasses send an activation command to the smart home appliance.
  • the embodiment of the present disclosure can enable the smart home appliance to start automatically after the current user is determined to be a legitimate user, which is beneficial to the subsequent control of the smart home appliance.
  • the features of the iris information include: spots, filaments, crowns, crypts, pits, rays, wrinkles, and stripes. These features determine the uniqueness of the iris features, as well as the uniqueness of identification, thus ensuring the reliability and safety of the linkage control between smart glasses and smart home appliances.
  • an embodiment of the present disclosure provides another method for controlling a smart home appliance, including:
  • the iris information acquiring unit acquires iris information of the user.
  • the smart glasses perform feature matching on the acquired iris information and one or more pre-stored iris information.
  • if the matching fails, the smart glasses exit the iris control mode; if the matching succeeds, the smart glasses enable the smart home appliance to be turned on.
  • the projection unit projects the image of the operation interface of the smart home appliance onto the lens.
  • the smart glasses determine the gaze point of the user's eyes.
  • if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
  • the smart glasses collect the iris information of the current user and perform feature matching against the pre-stored iris information; when no pre-stored iris information with consistent features is matched, the smart glasses automatically exit the iris control mode.
  • the embodiment of the present disclosure utilizes the uniqueness of iris features to effectively identify whether the current user is a legitimate user, and exits the control in time when an illegitimate user is detected, so as to avoid leakage of the user information in the smart home appliance, which helps ensure the reliability and security of the linkage control between the smart glasses and the smart home appliance.
  • exiting the iris control mode for the smart glasses includes: exiting the iris control mode for the smart glasses if there is no pre-stored iris information matched with the characteristics of the acquired iris information. In this way, it can effectively prevent illegal users from using the smart glasses to steal user information in the smart home appliances, and can also effectively prevent the smart home appliances from malfunctioning when infants play with the smart glasses.
  • an embodiment of the present disclosure provides an apparatus for controlling a smart home appliance, including a processor (processor) 701 and a memory (memory) 702 .
  • the device may also include a communication interface (Communication Interface) 703 and a bus 704.
  • the processor 701 , the communication interface 703 , and the memory 702 can communicate with each other through the bus 704 .
  • the communication interface 703 can be used for information transmission.
  • the processor 701 can invoke logic instructions in the memory 702 to execute the method for controlling a smart home appliance in the above-mentioned embodiments.
  • logic instructions in the memory 702 may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as an independent product.
  • the memory 702 can be used to store software programs and computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure.
  • the processor 701 executes the program instructions/modules stored in the memory 702 to execute functional applications and data processing, that is, to implement the method for controlling smart home appliances in the above-mentioned embodiments.
  • the memory 702 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function; the data storage area may store data created according to the use of the terminal device, and the like.
  • the memory 702 may include a high-speed random access memory, and may also include a non-volatile memory.
  • An embodiment of the present disclosure provides a readable storage medium storing computer-executable instructions, and the computer-executable instructions are configured to execute the above-mentioned method for controlling a smart home appliance.
  • the above-mentioned storage medium may be a transitory computer-readable storage medium, or a non-transitory computer-readable storage medium.
  • the technical solutions of the embodiments of the present disclosure can be embodied in the form of a software product, which is stored in a storage medium and includes one or more instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or some of the steps of the method described in the embodiments of the present disclosure.
  • the aforementioned storage medium can be a non-transitory storage medium, including: U disk, mobile hard disk, read-only memory (ROM, Read-Only Memory), random access memory (RAM, Random Access Memory), magnetic disk or optical disc, etc.
  • the term “and/or” as used in this application is meant to include any and all possible combinations of one or more of the associated listed ones.
  • the term “comprise” and its variants “comprises” and/or “comprising” refer to the presence of the stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groupings of these.
  • an element defined by the statement “comprising a …” does not exclude the presence of additional identical elements in the process, method or apparatus comprising said element.
  • the disclosed methods and products can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units may only be a logical function division.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • each functional unit in the embodiments of the present disclosure may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • each block in a flowchart or block diagram may represent a module, program segment, or part of code that includes one or more executable instructions.
  • the functions noted in the block may occur out of the order noted in the figures.
  • two blocks in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • the operations or steps corresponding to different blocks may also occur in a different order than that disclosed in the description, and sometimes there is no specific agreement between different operations or steps.
  • each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified function or action, or by a combination of dedicated hardware and computer instructions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Otolaryngology (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for controlling a smart home appliance, an apparatus for controlling a smart home appliance, and smart glasses, relating to the technical field of smart devices. The method includes: when an iris control mode is entered, projecting an operation interface image of the smart home appliance onto a lens (111); determining a gaze point of the user's eyes; and, if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds a dwell duration threshold, sending the control command corresponding to the control button to the smart home appliance. By exploiting the fact that a projected image can be imaged clearly, the operation interface image of the distant smart home appliance is projected onto the nearby smart glasses, which greatly shortens the human-computer interaction distance; on the one hand, this improves the recognition accuracy of user control commands by the smart home appliance, and on the other hand, it makes the user's control of the smart home appliance more flexible and convenient, finally realizing long-distance and precise control of the smart home appliance by the user.

Description

Method and apparatus for controlling a smart home appliance, and smart glasses
This application is based on, and claims priority to, Chinese patent application No. 202110827140.6 filed on July 21, 2021, the entire contents of which are incorporated herein by reference.
Technical field
The present application relates to the technical field of smart devices, for example, to a method, an apparatus and smart glasses for controlling a smart home appliance.
Background
At present, the screen systems in smart home appliances mostly use touch control to operate modes, the in-screen store, brightness, sound, and so on. However, touch operation has the following disadvantages: fingers may falsely trigger the screen; touch may fail when the fingers are wet; and when the screen of the smart home appliance is mounted high, user operation is inconvenient. An existing technology projects infrared rays toward the eyeball when the eyeball recognition operation is activated, tracks the eyeball movement after the infrared projection, collects an iris image of the eyeball, processes the collected iris image to obtain eyeball movement information, and, according to control instructions preset for the eyeball movement information, sends the corresponding control instruction to the control system of the terminal device so as to control the terminal device.
In the process of implementing the embodiments of the present disclosure, at least the following problems were found in the related art:
This method relies heavily on the accuracy of the eyeball movements, and the human-computer interaction distance has a very large influence on the control precision. When the distance is too great, both the quality of the collected iris image and the accuracy of the eyeball movements are affected.
Summary
In order to provide a basic understanding of some aspects of the disclosed embodiments, a brief summary is given below. This summary is not an extensive overview, is not intended to identify key or critical elements or to delimit the scope of protection of these embodiments, and serves only as a preface to the detailed description that follows.
Embodiments of the present disclosure provide a method, an apparatus and smart glasses for controlling a smart home appliance, which can realize long-distance and precise control of the smart home appliance.
In some embodiments, the method includes:
when an iris control mode is entered, projecting an operation interface image of the smart home appliance onto a lens;
determining a gaze point of the user's eyes; and
if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds a dwell duration threshold, sending the control command corresponding to the control button to the smart home appliance.
In some embodiments, the apparatus includes a processor and a memory storing program instructions, and the processor is configured to execute, when running the program instructions, the aforementioned method for controlling a smart home appliance.
In some embodiments, the smart glasses include:
a frame body provided with a lens;
a projection unit arranged on the frame body and configured to project an image onto the lens under control;
an iris information acquisition unit arranged on the lens and configured to collect iris information;
a communication unit arranged on the frame body and configured to perform wireless communication with the smart home appliance; and
the aforementioned apparatus for controlling a smart home appliance, arranged on the frame body and electrically connected to the projection unit, the iris information acquisition unit and the communication unit, respectively.
The method, apparatus and smart glasses for controlling a smart home appliance provided by the embodiments of the present disclosure can achieve the following technical effects:
In the embodiments of the present application, the smart glasses and the smart home appliance are linked for control: the operation interface image of the smart home appliance is processed and projected onto the smart glasses, and then the gaze point of the user's eyes is captured and its positional relationship with the projected image is judged, which determines the subsequent control of the smart home appliance. The embodiments of the present application exploit the fact that a projected image can be imaged clearly and project the operation interface image of the distant smart home appliance onto the nearby smart glasses, which greatly shortens the human-computer interaction distance; on the one hand, this improves the recognition accuracy of user control commands by the smart home appliance, and on the other hand, it makes the user's control of the smart home appliance more flexible and convenient, finally realizing long-distance and precise control of the smart home appliance by the user.
The above general description and the following description are merely exemplary and explanatory and are not intended to limit the present application.
Brief description of the drawings
One or more embodiments are exemplarily illustrated by the corresponding drawings. These exemplary illustrations and the drawings do not constitute a limitation on the embodiments. Elements bearing the same reference numerals in the drawings denote similar elements. The drawings are not drawn to scale, and in the drawings:
Fig. 1 is a schematic diagram of smart glasses provided by an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a method for controlling a smart home appliance provided by an embodiment of the present disclosure;
Fig. 3 is a schematic diagram of another method for controlling a smart home appliance provided by an embodiment of the present disclosure;
Fig. 4 is a schematic diagram of another method for controlling a smart home appliance provided by an embodiment of the present disclosure;
Fig. 5 is a schematic diagram of another method for controlling a smart home appliance provided by an embodiment of the present disclosure;
Fig. 6 is a schematic diagram of another method for controlling a smart home appliance provided by an embodiment of the present disclosure;
Fig. 7 is a schematic diagram of an apparatus for controlling a smart home appliance provided by an embodiment of the present disclosure.
Detailed description
In order to understand the features and technical content of the embodiments of the present disclosure in more detail, the implementation of the embodiments of the present disclosure is described in detail below with reference to the accompanying drawings. The drawings are for reference and illustration only and are not intended to limit the embodiments of the present disclosure. In the following technical description, numerous details are provided for ease of explanation in order to give a full understanding of the disclosed embodiments; however, one or more embodiments can still be implemented without these details. In other cases, well-known structures and devices may be shown in simplified form to simplify the drawings.
The terms "first", "second" and the like in the specification and claims of the embodiments of the present disclosure and in the above drawings are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present disclosure described herein can be implemented. In addition, the terms "include" and "have", and any variations thereof, are intended to cover non-exclusive inclusion.
Unless otherwise specified, the term "plurality" means two or more.
In the embodiments of the present disclosure, the character "/" indicates an "or" relationship between the preceding and following objects. For example, A/B means: A or B.
The term "and/or" describes an association relationship between objects and indicates that three relationships may exist. For example, A and/or B means three possible relationships: A alone, B alone, or both A and B.
The term "correspond" may refer to an association or binding relationship; A corresponding to B means that there is an association or binding relationship between A and B.
In the embodiments of the present disclosure, a smart home appliance refers to a home appliance formed by introducing microprocessors, sensor technology and network communication technology into home appliances. It has the characteristics of intelligent control, intelligent perception and intelligent application, and its operation often relies on modern technologies such as the Internet of Things, the Internet and electronic chips. For example, a smart home appliance can be connected to an electronic device so that the user can remotely control and manage the smart home appliance.
In the embodiments of the disclosure, a terminal device refers to an electronic device with a wireless connection function. The terminal device can communicate with the above smart home appliance by connecting to the Internet, or directly communicate with the above smart home appliance via Bluetooth, wifi and the like. In some embodiments, the terminal device is, for example, a mobile device, a computer, or a vehicle-mounted device built into a hover vehicle, or any combination thereof. The mobile device may include, for example, a mobile phone, a smart home device, a wearable device, a smart mobile device, a virtual reality device or the like, or any combination thereof, where the wearable device includes, for example, a smart watch, a smart bracelet, a pedometer and the like.
At present, the screen systems in smart home appliances mostly use touch control to operate modes, the in-screen store, brightness, sound, and so on. However, touch operation has the following disadvantages: fingers may falsely trigger the screen; touch may fail when the fingers are wet; and when the screen of the smart home appliance is mounted high, user operation is inconvenient. An existing technology projects infrared rays toward the eyeball when the eyeball recognition operation is activated, tracks the eyeball movement after the infrared projection, collects an iris image of the eyeball, processes the collected iris image to obtain eyeball movement information, and, according to control instructions preset for the eyeball movement information, sends the corresponding control instruction to the control system of the terminal device so as to control the terminal device. However, with this method, when the distance is too great, both the quality of the collected iris image and the accuracy of the eyeball movements are affected.
With reference to Fig. 1, an embodiment of the present disclosure provides smart glasses, including a frame body 11, a projection unit 12, an iris information acquisition unit 13, a communication unit 14 and a control apparatus (not shown in the figure). The frame body 11 is provided with a lens 111. The projection unit 12 is arranged on the frame body 11 and is configured to project an image onto the lens 111 under control. The iris information acquisition unit 13 is arranged on the lens 111 and is configured to collect iris information. The communication unit 14 is arranged on the frame body 11 and is configured to perform wireless communication with a smart home appliance. The control apparatus is arranged on the frame body 11 and is electrically connected to the projection unit 12, the iris information acquisition unit 13 and the communication unit 14, respectively.
With the smart glasses provided by the embodiment of the present disclosure, the projection unit 12 can project the processed operation interface image of the smart home appliance onto the lens 111 to form a projected image, the iris information acquisition unit 13 acquires the user's iris information to determine the position of the gaze point of the user's eyes, and finally the control apparatus judges the positional relationship between the gaze point and the projected image to decide whether the communication unit 14 sends the relevant control command to the smart home appliance. The embodiment of the present disclosure projects the operation interface image of the distant smart home appliance onto the nearby smart glasses, which greatly shortens the human-computer interaction distance and improves the imaging quality; on the one hand, this improves the recognition accuracy of user control commands by the smart home appliance, and on the other hand, it makes the user's control of the smart home appliance more flexible and convenient, finally realizing long-distance and precise control of the smart home appliance by the user.
Optionally, the lens 111 is an electronic display lens. Preferably, the electronic display lens is an Organic Light-Emitting Diode (OLED) microdisplay. The OLED microdisplay is thin and highly bendable, is suitable for a small device such as smart glasses, and has a fast response speed, which is conducive to the subsequent control of the smart home appliance.
Optionally, the projection unit 12 is a micro LED projection lamp. To improve the imaging quality of the projected image, a plurality of micro LED projection lamps can be arranged on the frame body 11.
Optionally, the iris information acquisition unit 13 includes a base plate, micro infrared LEDs and a micro optical sensing element. The base plate is arranged on the lens 111 and is made of a transparent, high-transmittance optical material. The micro infrared LEDs are arranged on the base plate and are configured to emit infrared beams toward the user's eyeballs; to ensure the accuracy of the determined gaze point position, multiple micro infrared LEDs can be arranged at different positions on the base plate to emit infrared beams from different angles. The micro optical sensing element is configured to collect an image of the user's eyeball, and the image includes the user's iris reflection information under the irradiation of the infrared beams and the user's iris imaging information.
Optionally, the communication unit 14 may be a Bluetooth module or a WiFi module configured to communicate wirelessly with the smart home appliance.
Optionally, the communication unit 14 may be an Internet communication module configured to communicate with the smart home appliance over the Internet.
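The patent text does not fix a wire protocol for the communication unit 14. Purely as an illustrative sketch, the following Python example assumes a hypothetical JSON-over-TCP link on the local Wi-Fi network (the host, port and message fields are invented for the example) to show how the control command bound to a control button might be handed to the smart home appliance.

```python
import json
import socket

def send_control_command(appliance_host: str, button_id: str,
                         port: int = 5800, timeout: float = 2.0) -> bool:
    """Send the control command bound to a control button to the appliance.

    The JSON schema and port number are illustrative assumptions, not part of
    the patent; a real communication unit could equally use Bluetooth, MQTT,
    or a vendor cloud API.
    """
    command = {"type": "control", "button": button_id}
    try:
        with socket.create_connection((appliance_host, port), timeout=timeout) as sock:
            sock.sendall(json.dumps(command).encode("utf-8") + b"\n")
            ack = sock.recv(64)          # wait for a short acknowledgement
            return ack.startswith(b"OK")
    except OSError:
        return False                     # appliance unreachable or link dropped

# Example: a gaze handler could call this once a dwell is confirmed.
# send_control_command("192.168.1.50", "power_on")
```

A Bluetooth module or a vendor cloud API could replace the socket call without changing the surrounding control logic.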
With reference to Fig. 2, an embodiment of the present disclosure provides a method for controlling a smart home appliance, including:
S201: when the iris control mode is entered, the projection unit projects the operation interface image of the smart home appliance onto the lens.
S202: the smart glasses determine the gaze point of the user's eyes.
S203: if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
With the method for controlling a smart home appliance provided by the embodiment of the present disclosure, the smart glasses and the smart home appliance are linked for control: the operation interface image of the smart home appliance is processed and projected onto the smart glasses, and then the gaze point of the user's eyes is captured and its positional relationship with the projected image is judged, which determines the subsequent control of the smart home appliance. The embodiment of the present application exploits the fact that a projected image can be imaged clearly and projects the operation interface image of the distant smart home appliance onto the nearby smart glasses, which greatly shortens the human-computer interaction distance; on the one hand, this improves the recognition accuracy of user control commands by the smart home appliance, and on the other hand, it makes the user's control of the smart home appliance more flexible and convenient, finally realizing long-distance and precise control of the smart home appliance by the user.
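To make the S201-S203 flow concrete, the following minimal Python sketch shows the projection-gaze-dwell loop. The four callables (`project_interface`, `estimate_gaze`, `hit_test`, `send_command`), the polling rate and the 3 s threshold are assumptions used only to illustrate the logic described above, not the patent's implementation.

```python
import time

DWELL_THRESHOLD_S = 3.0  # dwell duration threshold; the description suggests 3 s

def iris_control_loop(project_interface, estimate_gaze, hit_test, send_command,
                      poll_interval_s=0.05):
    """S201-S203: project the interface, watch the gaze point, and send the
    control command once the gaze has dwelt on one control button long enough.
    All four callables are assumed to be supplied by the glasses firmware."""
    buttons = project_interface()              # S201: projected button layout on the lens
    dwell_button, dwell_start = None, time.monotonic()

    while True:                                # runs while the iris control mode is active
        gaze = estimate_gaze()                 # S202: (x, y) in lens coordinates, or None
        button = hit_test(buttons, gaze) if gaze is not None else None

        if button != dwell_button:
            dwell_button, dwell_start = button, time.monotonic()   # gaze moved: restart timer
        elif button is not None and time.monotonic() - dwell_start > DWELL_THRESHOLD_S:
            send_command(button)               # S203: send the button's control command
            dwell_button = None                # require the gaze to leave and return
        time.sleep(poll_interval_s)
```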
Optionally, projecting the operation interface image of the smart home appliance onto the lens by the projection unit includes: the iris information acquisition unit acquires the iris imaging information of the user; the smart glasses analyze and extract the operation interface image of the smart home appliance from the user's iris imaging information; and the projection unit magnifies the operation interface image at a preset ratio and projects it onto the lens. In this way, the smart glasses can obtain the operation interface image of the smart home appliance even when offline; the approach is not susceptible to signal interference, offers strong reliability and confidentiality, and also ensures real-time control of the smart home appliance.
Optionally, the preset ratio can be adjusted according to the actual needs of the user. Specifically, the preset ratio can be set to 50:1, that is, the projection unit can magnify the operation interface image of the smart home appliance extracted from the user's iris imaging by 50 times and project it onto the lens. This guarantees the imaging quality, thereby improving the recognition accuracy of user control commands and facilitating the user's long-distance precise control of the smart home appliance. The preset ratio of 50:1 can be adjusted according to the specific situation and can also be set to any other value such as 10:1 or 100:1.
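As an assumed illustration of the preset magnification, the sketch below enlarges an interface region extracted from the iris image and fits it to the lens resolution using Pillow; the library choice, the lens resolution and the default 50:1 ratio are example values, not requirements of the method.

```python
from PIL import Image   # Pillow is assumed to be available on the glasses' processor

LENS_RESOLUTION = (640, 400)    # hypothetical resolution of the OLED microdisplay

def magnify_interface(extracted: Image.Image, ratio: int = 50) -> Image.Image:
    """Magnify the operation-interface region extracted from the iris image by
    `ratio` (e.g. 50:1), then fit it within the lens resolution for projection."""
    enlarged = extracted.resize((extracted.width * ratio, extracted.height * ratio))
    enlarged.thumbnail(LENS_RESOLUTION)     # keep aspect ratio, cap at the lens size
    return enlarged
```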
Optionally, projecting the operation interface image of the smart home appliance onto the lens by the projection unit includes: the communication unit receives the operation interface image sent by the smart home appliance; and the projection unit projects the operation interface image onto the lens. In this way, the smart glasses can be linked with the smart home appliance for linkage control, and the operation interface image of the smart home appliance can be sent to the smart glasses in real time, processed and then projected onto the lens. This embodiment of the present disclosure not only provides higher imaging quality and a faster response speed, but also does not rely on the user's iris imaging information; it is suitable for scenarios in which the operation interface image of the smart home appliance is obtained from a relatively long distance, which is beneficial to long-distance precise control of the smart home appliance.
Optionally, determining the gaze point of the user's eyes by the smart glasses includes: the iris information acquisition unit acquires the user's iris reflection information under the irradiation of the infrared beams; and the smart glasses determine the gaze point of the user's eyes according to the iris reflection information. This embodiment of the present disclosure can capture the gaze point of the user's eyes in real time, infer the user's operation intention from it, and decide the actual control of the smart home appliance by identifying the positional relationship between the gaze point and the projected image, which is beneficial to precise control of the smart home appliance.
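The description does not state how the iris reflection information is converted into a lens coordinate. One common approach in pupil-centre/corneal-reflection eye tracking is to regress lens coordinates from the pupil-to-glint offset after a short calibration; the sketch below shows that idea with a least-squares affine fit and is only an assumed stand-in for this step, not the patent's algorithm.

```python
import numpy as np

class GazeMapper:
    """Map the pupil-centre-minus-glint offset (from the IR-illuminated eye
    image) to a point on the lens, using a short user calibration.
    The affine model and calibration procedure are illustrative assumptions."""

    def __init__(self):
        self.coeffs = None   # 3x2 affine coefficients, fitted at calibration time

    def calibrate(self, pupil_glint_vectors, lens_points):
        # pupil_glint_vectors: N x 2 offsets; lens_points: N x 2 known gaze targets
        v = np.asarray(pupil_glint_vectors, dtype=float)
        features = np.hstack([v, np.ones((v.shape[0], 1))])          # [dx, dy, 1]
        self.coeffs, *_ = np.linalg.lstsq(
            features, np.asarray(lens_points, dtype=float), rcond=None)

    def gaze_point(self, pupil_center, glint_center):
        dx, dy = np.asarray(pupil_center, float) - np.asarray(glint_center, float)
        return tuple(np.array([dx, dy, 1.0]) @ self.coeffs)          # (x, y) on the lens
```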
Optionally, the dwell duration threshold can be adjusted according to actual needs. Specifically, the dwell duration threshold can be preset to 3s, that is, when the gaze point of the eyes stays at the position of a control button in the projected image for more than 3s, the smart glasses immediately send the control command corresponding to that control button to the smart home appliance. The duration of 3s can be adjusted according to the specific situation and can also be set to any other value such as 1s or 5s.
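A minimal sketch of the dwell check follows; the `ControlButton` rectangle layout and the timer handling are assumptions that simply illustrate "the gaze stays on one control button longer than the dwell duration threshold".

```python
import time
from dataclasses import dataclass

@dataclass
class ControlButton:
    command: str        # control command sent to the appliance when triggered
    x: float            # button rectangle in lens coordinates
    y: float
    w: float
    h: float

    def contains(self, point) -> bool:
        px, py = point
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class DwellDetector:
    """Return a button's command only after the gaze has stayed on that
    button longer than the dwell duration threshold (3 s by default here)."""

    def __init__(self, threshold_s: float = 3.0):
        self.threshold_s = threshold_s
        self._button = None
        self._since = time.monotonic()

    def update(self, buttons, gaze):
        hit = next((b for b in buttons if gaze is not None and b.contains(gaze)), None)
        now = time.monotonic()
        if hit != self._button:
            self._button, self._since = hit, now          # gaze moved: restart the timer
            return None
        if hit is not None and now - self._since > self.threshold_s:
            self._button, self._since = None, now         # fire once, then re-arm
            return hit.command
        return None
```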
Optionally, the projected image includes a status display area and control buttons. The status display area is configured to display the current state of the smart home appliance, and the current state includes the actual values of parameters of the smart home appliance such as brightness, battery level and mode. The control buttons correspond to buttons or icons on the operation interface of the smart home appliance and are configured to control the smart home appliance to perform the corresponding actions.
Optionally, the projected image may be an augmented reality (AR) image or a virtual reality (VR) image, selected according to how the smart glasses are used. In some embodiments, the smart glasses analyze and extract the operation interface image of the smart home appliance from the user's iris imaging information and then project it; in this case the projected image may be an AR image. In other embodiments, the smart glasses receive the operation interface image sent by the smart home appliance and then project it; in this case the projected image may be a VR image.
Optionally, in addition to the iris control mode, the smart glasses also provide a daily mode, a call mode, a navigation mode and a music mode. In some embodiments, when the daily mode is entered, the smart glasses can perform myopia correction according to the different myopia degrees of different users by adjusting the lens thickness, so as to ensure that users with different degrees of myopia can use the smart glasses effectively. In other embodiments, when the call mode is entered, the smart glasses can make voice calls with family and friends by invoking the address book information and using the built-in miniature microphone and speaker. In other embodiments, when the navigation mode is entered, the smart glasses can turn on the GPS system and project the planned route to the user's destination onto the lens to complete route navigation. In other embodiments, when the music mode is entered, the smart glasses can play music through bone conduction via the module of the frame body located at the portion behind the ear.
With reference to Fig. 3, an embodiment of the present disclosure provides another method for controlling a smart home appliance, including:
S301: when the iris control mode is entered, the projection unit projects the operation interface image of the smart home appliance onto the lens.
S302: the smart glasses determine the gaze point of the user's eyes.
S303: if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
S304: if no control command has been sent to the smart home appliance within the preset refresh duration, the projection unit projects the operation interface image of the smart home appliance onto the lens again.
With the method for controlling a smart home appliance provided by this embodiment of the present disclosure, when no control command has been sent for longer than the preset refresh duration, the operation interface image of the smart home appliance is projected onto the lens again, completing a refresh of the projected image. In this way, when unclear imaging of the projected image makes accurate control difficult for the user, the smart glasses can automatically refresh the projected image, and the refreshed high-quality projected image ensures the recognition accuracy of the user's control commands, which is beneficial to the user's long-distance precise control of the smart home appliance.
Optionally, the preset refresh duration is [20s, 30s]. The preset refresh duration can be adjusted according to the actual needs of the user; specifically, the preset refresh duration can be 20s, 25s or 30s.
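The refresh rule of S304 can be tracked with a single timestamp; the helper below is an assumed sketch that uses a 25 s interval from the [20s, 30s] range given above and would be reset whenever S303 sends a command.

```python
import time

class RefreshTimer:
    """Re-project the operation interface if no control command has been sent
    within the preset refresh duration (20-30 s per the description above)."""

    def __init__(self, refresh_s: float = 25.0):
        self.refresh_s = refresh_s
        self._last_command = time.monotonic()

    def command_sent(self):
        self._last_command = time.monotonic()      # reset on every S303 send

    def should_refresh(self) -> bool:
        if time.monotonic() - self._last_command > self.refresh_s:
            self._last_command = time.monotonic()  # avoid refreshing on every poll
            return True
        return False
```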
With reference to Fig. 4, an embodiment of the present disclosure provides another method for controlling a smart home appliance, including:
S401: when the iris control mode is entered, the projection unit projects the operation interface image of the smart home appliance onto the lens.
S402: the smart glasses determine the gaze point of the user's eyes.
S403: if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
S404: if the gaze point of the user's eyes is located outside the projected image and the continuous deviation duration exceeds the deviation duration threshold, the smart glasses exit the iris control mode.
With the method for controlling a smart home appliance provided by this embodiment of the present disclosure, the user does not need additional control operations and can exit the iris control mode simply by moving the gaze point of the eyes, which is very convenient and fast. Moreover, by presetting the deviation duration threshold, this embodiment of the present disclosure also avoids mistakenly exiting the iris control mode when the gaze point of the user's eyes deviates from the projected image only briefly, thereby ensuring the accuracy of the control.
Optionally, the deviation duration threshold can be adjusted according to actual needs. Specifically, the deviation duration threshold can be preset to 5s, that is, when the gaze point of the eyes remains outside the projected image for more than 5s, the smart glasses automatically exit the iris control mode. The duration of 5s can be adjusted according to the specific situation and can also be set to any other value such as 3s or 10s.
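Exit-on-deviation (S404) mirrors the dwell check: the assumed sketch below measures how long the gaze has stayed outside the projected image and signals an exit once the 5 s deviation duration threshold is exceeded; the class and field names are illustrative.

```python
import time

class DeviationExit:
    """Signal leaving the iris control mode when the gaze point stays outside
    the projected image longer than the deviation duration threshold."""

    def __init__(self, threshold_s: float = 5.0):
        self.threshold_s = threshold_s
        self._outside_since = None        # None means the gaze is inside the image

    def update(self, gaze_inside_projection: bool) -> bool:
        now = time.monotonic()
        if gaze_inside_projection:
            self._outside_since = None
            return False
        if self._outside_since is None:
            self._outside_since = now     # gaze just left the projected image
        return now - self._outside_since > self.threshold_s
```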
With reference to Fig. 5, an embodiment of the present disclosure provides another method for controlling a smart home appliance, including:
S501: when the iris control mode is entered, the iris information acquisition unit acquires the iris information of the user.
S502: the smart glasses perform feature matching between the acquired iris information and one or more pieces of pre-stored iris information.
S503: if the matching succeeds, the smart glasses enable the smart home appliance to be turned on.
S504: the projection unit projects the operation interface image of the smart home appliance onto the lens.
S505: the smart glasses determine the gaze point of the user's eyes.
S506: if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
With the method for controlling a smart home appliance provided by this embodiment of the present disclosure, the smart glasses collect the iris information of the current user and perform feature matching against the pre-stored iris information; when pre-stored iris information with consistent features is matched, the smart home appliance is turned on. This embodiment of the present disclosure introduces iris recognition technology into the control of smart home appliances and uses the uniqueness of iris features to effectively identify whether the current user is a legitimate user, greatly improving the reliability and security of the linkage control between the smart glasses and the smart home appliance.
Specifically, if the matching succeeds, enabling the smart home appliance to be turned on by the smart glasses includes: if pre-stored iris information whose features are consistent with the acquired iris information is matched, the smart glasses send a startup command to the smart home appliance. In this way, this embodiment of the present disclosure can make the smart home appliance start automatically after the current user is determined to be a legitimate user, which is beneficial to the subsequent control of the smart home appliance.
Optionally, the features of the iris information include spots, filaments, coronas, crypts, pits, rays, wrinkles, stripes and the like. These features determine the uniqueness of the iris features and therefore the uniqueness of identity recognition, thereby ensuring the reliability and security of the linkage control between the smart glasses and the smart home appliance.
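The patent names the iris features but not a matching algorithm. A widely used scheme (Daugman-style binary iris codes compared by normalized Hamming distance) is sketched below on the assumption that feature codes and occlusion masks have already been extracted; it is an illustrative stand-in for the feature matching of S502/S602, and the 0.32 threshold is an example value.

```python
import numpy as np

HAMMING_THRESHOLD = 0.32   # assumed decision threshold; tuned per deployment

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray,
                     mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Normalized Hamming distance between two binary iris codes, ignoring
    bits occluded by eyelids or reflections (the masks)."""
    valid = mask_a & mask_b
    if not valid.any():
        return 1.0
    disagree = np.logical_xor(code_a, code_b) & valid
    return float(np.count_nonzero(disagree)) / float(np.count_nonzero(valid))

def match_user(probe, enrolled) -> bool:
    """Return True if the probe matches any pre-stored template.
    `probe` and each entry of `enrolled` are (code, mask) boolean arrays."""
    code_p, mask_p = probe
    return any(
        hamming_distance(code_p, code_e, mask_p, mask_e) < HAMMING_THRESHOLD
        for code_e, mask_e in enrolled
    )
```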
With reference to Fig. 6, an embodiment of the present disclosure provides another method for controlling a smart home appliance, including:
S601: when the iris control mode is entered, the iris information acquisition unit acquires the iris information of the user.
S602: the smart glasses perform feature matching between the acquired iris information and one or more pieces of pre-stored iris information.
S603: if the matching fails, the smart glasses exit the iris control mode.
S604: if the matching succeeds, the smart glasses enable the smart home appliance to be turned on.
S605: the projection unit projects the operation interface image of the smart home appliance onto the lens.
S606: the smart glasses determine the gaze point of the user's eyes.
S607: if the gaze point of the user's eyes is located at the position of a control button in the projected image and the continuous dwell duration exceeds the dwell duration threshold, the communication unit sends the control command corresponding to the control button to the smart home appliance.
With the method for controlling a smart home appliance provided by this embodiment of the present disclosure, the smart glasses collect the iris information of the current user and perform feature matching against the pre-stored iris information; when no pre-stored iris information with consistent features is matched, the smart glasses automatically exit the iris control mode. This embodiment of the present disclosure utilizes the uniqueness of iris features to effectively identify whether the current user is a legitimate user and exits the control in time when an illegitimate user is detected, so as to avoid leakage of the user information in the smart home appliance, which helps ensure the reliability and security of the linkage control between the smart glasses and the smart home appliance.
Specifically, if the matching fails, exiting the iris control mode by the smart glasses includes: if no pre-stored iris information whose features are consistent with the acquired iris information is matched, the smart glasses exit the iris control mode. In this way, illegitimate users can be effectively prevented from using the smart glasses to steal the user information in the smart home appliance, and the smart home appliance can also be effectively prevented from malfunctioning when infants play with the smart glasses.
With reference to Fig. 7, an embodiment of the present disclosure provides an apparatus for controlling a smart home appliance, including a processor 701 and a memory 702. Optionally, the apparatus may also include a communication interface 703 and a bus 704. The processor 701, the communication interface 703 and the memory 702 can communicate with one another through the bus 704. The communication interface 703 can be used for information transmission. The processor 701 can invoke logic instructions in the memory 702 to execute the method for controlling a smart home appliance of the above embodiments.
In addition, when the above logic instructions in the memory 702 are implemented in the form of software functional units and sold or used as an independent product, they can be stored in a computer-readable storage medium.
As a computer-readable storage medium, the memory 702 can be used to store software programs and computer-executable programs, such as the program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 701 runs the program instructions/modules stored in the memory 702 to execute functional applications and data processing, that is, to implement the method for controlling a smart home appliance in the above embodiments.
The memory 702 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the terminal device, and the like. In addition, the memory 702 may include a high-speed random access memory and may also include a non-volatile memory.
An embodiment of the present disclosure provides a readable storage medium storing computer-executable instructions, and the computer-executable instructions are configured to execute the above method for controlling a smart home appliance.
The above storage medium may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The technical solutions of the embodiments of the present disclosure can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes one or more instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or some of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium may be a non-transitory storage medium, including a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc or any other medium that can store program code, and may also be a transitory storage medium.
The above description and drawings sufficiently illustrate the embodiments of the present disclosure to enable those skilled in the art to practice them. Other embodiments may include structural, logical, electrical, process and other changes. The embodiments represent only possible variations. Unless explicitly required, individual components and functions are optional, and the order of operations may vary. Portions and features of some embodiments may be included in or substituted for those of other embodiments. Moreover, the words used in this application are only for describing the embodiments and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application refers to and encompasses any and all possible combinations of one or more of the associated listed items. In addition, when used in this application, the term "comprise" and its variants "comprises" and/or "comprising" refer to the presence of the stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. Without further limitation, an element defined by the statement "comprising a ..." does not exclude the presence of additional identical elements in the process, method or device comprising said element. Herein, each embodiment may focus on its differences from other embodiments, and the same or similar parts of the embodiments may be referred to one another. For the methods, products and the like disclosed in the embodiments, if they correspond to the method part disclosed in the embodiments, reference may be made to the description of the method part for relevant details.
Those skilled in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware or in a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods for each particular application to implement the described functions, but such implementation should not be considered to be beyond the scope of the embodiments of the present disclosure. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the embodiments disclosed herein, the disclosed methods and products (including but not limited to apparatuses and devices) can be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for example, the division of the units may be only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of apparatuses or units may be electrical, mechanical or in other forms. The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to realize the present embodiment. In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, each unit may exist physically separately, or two or more units may be integrated into one unit.
The flowcharts and block diagrams in the drawings show the possible architectures, functions and operations of systems, methods and computer program products according to the embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, program segment or part of code that contains one or more executable instructions for implementing the specified logical function. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the drawings. For example, two consecutive blocks may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. In the descriptions corresponding to the flowcharts and block diagrams in the drawings, the operations or steps corresponding to different blocks may also occur in an order different from that disclosed in the description, and sometimes there is no specific order between different operations or steps. For example, two consecutive operations or steps may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. Each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.

Claims (10)

  1. A method for controlling a smart home appliance, characterized in that it is applied to smart glasses, the method comprising:
    when an iris control mode is entered, projecting an operation interface image of the smart home appliance onto a lens;
    determining a gaze point of a user's eyes; and
    if the gaze point of the user's eyes is located at a position of a control button in the projected image and a continuous dwell duration exceeds a dwell duration threshold, sending a control command corresponding to the control button to the smart home appliance.
  2. The method according to claim 1, characterized in that projecting the operation interface image of the smart home appliance onto the lens comprises:
    acquiring iris imaging information of the user;
    analyzing and extracting the operation interface image of the smart home appliance from the iris imaging information of the user; and
    magnifying the operation interface image at a preset ratio and projecting it onto the lens.
  3. The method according to claim 1, characterized in that projecting the operation interface image of the smart home appliance onto the lens comprises:
    receiving the operation interface image sent by the smart home appliance; and
    projecting the operation interface image onto the lens.
  4. The method according to claim 1, characterized in that determining the gaze point of the user's eyes comprises:
    acquiring iris reflection information of the user under irradiation of an infrared beam; and
    determining the gaze point of the user's eyes according to the iris reflection information.
  5. The method according to any one of claims 1 to 4, characterized by further comprising:
    if no control command has been sent to the smart home appliance within a preset refresh duration, projecting the operation interface image of the smart home appliance onto the lens again.
  6. The method according to any one of claims 1 to 4, characterized by further comprising:
    if the gaze point of the user's eyes is located outside the projected image and a continuous deviation duration exceeds a deviation duration threshold, exiting the iris control mode.
  7. The method according to any one of claims 1 to 4, characterized in that, when the iris control mode is entered, before projecting the operation interface image of the smart home appliance onto the lens, the method further comprises:
    acquiring iris information of the user;
    performing feature matching between the acquired iris information and one or more pieces of pre-stored iris information; and
    if the matching succeeds, enabling the smart home appliance to be in an on state.
  8. The method according to claim 7, characterized by further comprising:
    if the matching fails, exiting the iris control mode.
  9. An apparatus for controlling a smart home appliance, comprising a processor and a memory storing program instructions, characterized in that the processor is configured to execute, when running the program instructions, the method for controlling a smart home appliance according to any one of claims 1 to 8.
  10. Smart glasses, characterized by comprising:
    a frame body provided with a lens;
    a projection unit arranged on the frame body and configured to project an image onto the lens under control;
    an iris information acquisition unit arranged on the lens and configured to collect iris information;
    a communication unit arranged on the frame body and configured to perform wireless communication with a smart home appliance; and
    the apparatus for controlling a smart home appliance according to claim 9, arranged on the frame body and electrically connected to the projection unit, the iris information acquisition unit and the communication unit, respectively.
PCT/CN2022/094941 2021-07-21 2022-05-25 用于控制智能家电的方法及装置、智能眼镜 WO2023000808A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110827140.6 2021-07-21
CN202110827140.6A CN113655638B (zh) 2021-07-21 2021-07-21 用于控制智能家电的方法及装置、智能眼镜

Publications (1)

Publication Number Publication Date
WO2023000808A1 true WO2023000808A1 (zh) 2023-01-26

Family

ID=78489679

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/094941 WO2023000808A1 (zh) 2021-07-21 2022-05-25 用于控制智能家电的方法及装置、智能眼镜

Country Status (2)

Country Link
CN (1) CN113655638B (zh)
WO (1) WO2023000808A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118011893A (zh) * 2024-01-09 2024-05-10 西乔科技南京有限公司 一种基于人工智能的人机交互系统

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113655638B (zh) * 2021-07-21 2024-04-30 青岛海尔空调器有限总公司 用于控制智能家电的方法及装置、智能眼镜
CN115291734A (zh) * 2022-10-08 2022-11-04 深圳市天趣星空科技有限公司 基于智能眼镜的智能设备控制方法和系统

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110138285A1 (en) * 2009-12-09 2011-06-09 Industrial Technology Research Institute Portable virtual human-machine interaction device and operation method thereof
CN105138118A (zh) * 2015-07-31 2015-12-09 努比亚技术有限公司 实现人机交互的智能眼镜、方法以及移动终端
CN107528873A (zh) * 2016-06-22 2017-12-29 佛山市顺德区美的电热电器制造有限公司 智能家电的控制系统及虚拟现实投影装置
CN108762213A (zh) * 2018-06-04 2018-11-06 李宇轩 智能家居的控制方法、装置及智能眼镜
CN108983441A (zh) * 2018-07-12 2018-12-11 黄华新 一种智能眼镜系统及其工作方法
CN109086726A (zh) * 2018-08-10 2018-12-25 陈涛 一种基于ar智能眼镜的局部图像识别方法及系统
CN112445328A (zh) * 2019-09-03 2021-03-05 北京七鑫易维信息技术有限公司 映射控制方法及装置
CN113655638A (zh) * 2021-07-21 2021-11-16 青岛海尔空调器有限总公司 用于控制智能家电的方法及装置、智能眼镜

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6740613B2 (ja) * 2015-12-28 2020-08-19 セイコーエプソン株式会社 表示装置、表示装置の制御方法、及び、プログラム
US10948983B2 (en) * 2018-03-21 2021-03-16 Samsung Electronics Co., Ltd. System and method for utilizing gaze tracking and focal point tracking
US11533468B2 (en) * 2019-06-27 2022-12-20 Samsung Electronics Co., Ltd. System and method for generating a mixed reality experience
US10712817B1 (en) * 2019-06-27 2020-07-14 Tobii Ab Image re-projection for foveated rendering
KR20210079774A (ko) * 2019-12-20 2021-06-30 삼성전자주식회사 시선 추적 장치를 포함하는 웨어러블 디바이스 및 그 동작 방법

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110138285A1 (en) * 2009-12-09 2011-06-09 Industrial Technology Research Institute Portable virtual human-machine interaction device and operation method thereof
CN105138118A (zh) * 2015-07-31 2015-12-09 努比亚技术有限公司 实现人机交互的智能眼镜、方法以及移动终端
CN107528873A (zh) * 2016-06-22 2017-12-29 佛山市顺德区美的电热电器制造有限公司 智能家电的控制系统及虚拟现实投影装置
CN108762213A (zh) * 2018-06-04 2018-11-06 李宇轩 智能家居的控制方法、装置及智能眼镜
CN108983441A (zh) * 2018-07-12 2018-12-11 黄华新 一种智能眼镜系统及其工作方法
CN109086726A (zh) * 2018-08-10 2018-12-25 陈涛 一种基于ar智能眼镜的局部图像识别方法及系统
CN112445328A (zh) * 2019-09-03 2021-03-05 北京七鑫易维信息技术有限公司 映射控制方法及装置
CN113655638A (zh) * 2021-07-21 2021-11-16 青岛海尔空调器有限总公司 用于控制智能家电的方法及装置、智能眼镜

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118011893A (zh) * 2024-01-09 2024-05-10 西乔科技南京有限公司 一种基于人工智能的人机交互系统

Also Published As

Publication number Publication date
CN113655638A (zh) 2021-11-16
CN113655638B (zh) 2024-04-30

Similar Documents

Publication Publication Date Title
WO2023000808A1 (zh) 用于控制智能家电的方法及装置、智能眼镜
CN107390863B (zh) 设备的控制方法及装置、电子设备、存储介质
KR102056221B1 (ko) 시선인식을 이용한 장치 연결 방법 및 장치
CN109087485B (zh) 驾驶提醒方法、装置、智能眼镜及存储介质
CN111052043A (zh) 使用现实界面控制外部设备
US20170123491A1 (en) Computer-implemented gaze interaction method and apparatus
KR20160128119A (ko) 이동 단말기 및 이의 제어방법
US9696801B2 (en) Eye-controlled user interface to control an electronic device
WO2015011703A1 (en) Method and system for touchless activation of a device
CN109224432B (zh) 娱乐应用的操控方法、装置、存储介质及穿戴式设备
CN110866230B (zh) 已认证设备辅助的用户认证
CN110543233B (zh) 信息处理设备和非暂时性计算机可读介质
WO2015194017A1 (ja) ウェアラブル装置および認証方法
WO2021244145A1 (zh) 头戴显示设备的交互方法、终端设备及存储介质
WO2015093130A1 (ja) 情報処理装置、情報処理方法およびプログラム
US20200341284A1 (en) Information processing apparatus, information processing method, and recording medium
CN108369451B (zh) 信息处理装置、信息处理方法及计算机可读存储介质
CN109358744A (zh) 信息共享方法、装置、存储介质及穿戴式设备
CN112241199B (zh) 虚拟现实场景中的交互方法及装置
EP3582068A1 (en) Information processing device, information processing method, and program
US10444831B2 (en) User-input apparatus, method and program for user-input
KR101728707B1 (ko) 글라스형 웨어러블 디바이스를 이용한 실내 전자기기 제어방법 및 제어프로그램
WO2019102680A1 (ja) 情報処理装置、情報処理方法、およびプログラム
JP2017120488A (ja) 表示装置、表示システム、表示装置の制御方法、及び、プログラム
CN107450722A (zh) 一种带有噪音过滤功能的双重交互控制头戴式显示设备和交互方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22844975

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE