CN111274870A - Intelligent glasses control method, intelligent glasses control device and storage medium - Google Patents

Intelligent glasses control method, intelligent glasses control device and storage medium

Info

Publication number
CN111274870A
CN111274870A
Authority
CN
China
Prior art keywords
mode
glasses
terminal
distance
smart
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010015056.XA
Other languages
Chinese (zh)
Inventor
崔祺琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010015056.XA priority Critical patent/CN111274870A/en
Publication of CN111274870A publication Critical patent/CN111274870A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00 - Non-optical adjuncts; Attachment thereof
    • G02C11/10 - Electronic devices other than hearing aids
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a smart glasses control method, a smart glasses control device, and a storage medium. The smart glasses have at least two modes, including a first mode and a second mode, and the smart glasses control method includes: during the use of the smart glasses, detecting the distance between an object viewed by the user's eyes and the mirror surface of the smart glasses; and controlling the smart glasses to switch to the first mode or the second mode according to the distance. With the present disclosure, different glasses modes can be switched during use of the smart glasses without any manual operation to change the mode, which improves the intelligence of the smart glasses and enhances the user experience.

Description

Intelligent glasses control method, intelligent glasses control device and storage medium
Technical Field
The present disclosure relates to the field of intelligent control technologies, and in particular, to an intelligent glasses control method, an intelligent glasses control device, and a storage medium.
Background
As a wearable smart device that has emerged in recent years and shows good prospects, smart glasses are easy and convenient to use and compact in size, and it is becoming increasingly common for users to view objects through smart glasses.
At present, the functions of smart glasses are becoming more and more diversified; for example, smart glasses may provide several working modes such as a flat lens mode and a presbyopic lens mode. However, how to make these functions behave intelligently remains one of the technical problems to be solved.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a smart glasses control method, a smart glasses control apparatus, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a smart glasses control method, the smart glasses having at least two modes including a first mode and a second mode, the method including: during the use of the smart glasses, detecting the distance between an object viewed by the user's eyes and the mirror surface of the smart glasses; and controlling the smart glasses to switch to the first mode or the second mode according to the distance.
In one example, controlling the smart glasses to switch to the first mode or the second mode according to the distance includes: switching to the first mode when the distance is smaller than a specified distance threshold; and switching to the second mode when the distance is greater than the specified distance threshold.
In one example, the first mode is a presbyopic mode and the second mode is a flat mirror mode.
In one example, a camera is built into the smart glasses, and the smart glasses control method further includes: when it is determined that the object viewed by the user contains specified content, photographing the specified content contained in the object; and establishing a connection with a terminal and processing the specified content in linkage with the terminal.
In one example, the terminal is a sound box or a hearing aid, and processing the specified content in linkage with the terminal includes: sending the specified content to the sound box or the hearing aid, which then broadcasts the specified content.
In one example, processing the specified content in linkage with the terminal includes: pushing the specified content to the terminal, which then retrieves and displays push information related to the specified content.
In one example, detecting the object viewed by the user's eyes includes: determining that the user is viewing the object when the parallel angle between the mirror surface of the smart glasses and the object reaches a specified angle threshold.
According to a second aspect of the embodiments of the present disclosure, there is provided a smart glasses control device, the smart glasses having at least two modes including a first mode and a second mode, the smart glasses control device comprising: a detection unit configured to detect the distance between an object viewed by the user's eyes and the mirror surface of the smart glasses during the use of the smart glasses; and a switching unit configured to control the smart glasses to switch to the first mode or the second mode according to the distance.
In one example, the switching unit controls the smart glasses to switch to the first mode or the second mode according to the distance in the following manner: when the distance is smaller than a specified distance threshold value, switching to a first mode; and when the distance is greater than the specified distance threshold, switching to a second mode.
In one example, the first mode is a presbyopic mode and the second mode is a flat mirror mode.
In one example, a camera is built into the smart glasses, and the smart glasses control device further includes: a photographing unit configured to photograph the specified content contained in the object when it is determined that the object viewed by the user contains the specified content; and a linkage unit configured to establish a connection between the smart glasses and a terminal and to process the specified content in linkage with the terminal.
In one example, the terminal is a sound box or a hearing aid, and the linkage unit processes the specified content in linkage with the terminal as follows: sending the specified content to the sound box or the hearing aid, which then broadcasts the specified content.
In one example, the linkage unit processes the specified content in linkage with the terminal as follows: pushing the specified content to the terminal, which then retrieves and displays push information related to the specified content, where the push information includes one or more of commodity purchase information, social information, and network query information.
In one example, the detection unit detects the object viewed by the user's eyes as follows: determining that the user is viewing the object when the parallel angle between the mirror surface of the smart glasses and the object reaches a specified angle threshold.
According to a third aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer executable instructions which, when executed by a processor, perform the steps of the smart eyewear control method of the preceding first aspect or any one of the examples of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a smart glasses control apparatus including: a memory configured to store instructions; and a processor configured to invoke the instructions to perform the steps of the smart glasses control method in the foregoing first aspect or any example of the first aspect.
The technical solution provided by the embodiments of the present disclosure may have the following beneficial effects: during the use of the smart glasses, the distance between an object viewed by the user's eyes and the mirror surface of the smart glasses is detected, and the smart glasses are controlled to switch to the first mode or the second mode according to that distance. With the present disclosure, different glasses modes can be switched during use without any manual operation to change the mode, which improves the intelligence of the smart glasses and enhances the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a smart glasses control method according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating a smart glasses control method according to an exemplary embodiment.
Fig. 3 is a block diagram illustrating a smart eyewear control device in accordance with an exemplary embodiment.
Fig. 4 is a block diagram illustrating a terminal device according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The function control method provided by the embodiments of the present disclosure can be applied to application scenarios in which an object is viewed through smart glasses. The smart glasses involved in the embodiments of the present disclosure may be smart glasses capable of switching between glasses modes and of linking with a terminal to realize further function interaction. In the exemplary embodiments described below, a terminal is sometimes also referred to as an intelligent terminal device; the terminal may be a mobile terminal and may also be referred to as User Equipment (UE), a Mobile Station (MS), and the like. A terminal is a device that provides voice and/or data connectivity to a user, or a chip disposed in such a device, for example a handheld or vehicle-mounted device with a wireless connection function. Examples of terminals include: a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Mobile Internet Device (MID), a wearable device, a Virtual Reality (VR) device, an Augmented Reality (AR) device, a wireless terminal in industrial control, a wireless terminal in unmanned driving, a wireless terminal in remote operation, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, and the like.
Fig. 1 is a flowchart illustrating a smart glasses control method according to an exemplary embodiment, where, as shown in fig. 1, the smart glasses have at least two modes, and the at least two modes include a first mode and a second mode, and the smart glasses control method includes the following steps.
In step S11, during the use of the smart glasses, the distance between the object viewed by the user's eyes and the mirror surface of the smart glasses is detected.
In one embodiment, the distance between the object viewed by the user's eyes and the mirror surface of the smart glasses may be detected, for example, by a distance sensor installed in the smart glasses, or estimated from images of the viewed object captured by a camera installed in the smart glasses.
The object viewed by the user's eyes can be detected by determining the angle between the mirror surface of the smart glasses and the object.
For example, upon determining that the parallel angle between the mirror surface of the smart glasses and the object reaches a specified angle threshold, it is determined that the user is viewing the object.
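For illustration only, the detection step described above might be sketched as follows; the sensor accessors read_lens_angle() and read_distance_sensor(), as well as the concrete threshold value, are assumptions and not part of this disclosure.

    from typing import Optional

    ANGLE_THRESHOLD_DEG = 10.0   # assumed value for the "specified angle threshold"

    def read_lens_angle() -> float:
        """Placeholder: parallel angle between the mirror surface and the object, in degrees."""
        raise NotImplementedError

    def read_distance_sensor() -> float:
        """Placeholder: distance reported by the built-in distance sensor or camera, in meters."""
        raise NotImplementedError

    def get_viewing_distance() -> Optional[float]:
        """Return the distance to the viewed object, or None if the user is not viewing it."""
        # The disclosure states the user is viewing the object when the parallel
        # angle "reaches a specified angle threshold"; a <= comparison is assumed here.
        if read_lens_angle() <= ANGLE_THRESHOLD_DEG:
            return read_distance_sensor()
        return None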
In step S12, the smart glasses are controlled to switch to the first mode or the second mode according to the distance.
In the present disclosure, in order to make it convenient for the user to view both close-range and distant objects through the smart glasses, the distance between the object viewed by the user's eyes and the mirror surface of the smart glasses is detected, and the smart glasses are controlled to switch to the first mode or the second mode accordingly.
For example, for smart presbyopic glasses, the first mode may be a presbyopic mode and the second mode may be a flat mode. When it is detected that the distance between the object viewed by the user's eyes and the mirror surface of the smart glasses is smaller than a preset threshold, the smart presbyopic glasses are switched to the presbyopic mode, and when the detected distance is greater than the preset threshold, they are switched to the flat mode. In other words, when the detected distance is smaller than the preset threshold the lenses of the smart presbyopic glasses are switched to presbyopic lenses, and when the detected distance is greater than the preset threshold the lenses are switched to flat lenses.
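A minimal sketch of this distance-based switching rule is shown below; the threshold value and the set_lens_mode callback are illustrative assumptions, not details given by the disclosure.

    from enum import Enum

    class GlassesMode(Enum):
        PRESBYOPIC = "presbyopic"   # first mode, for close-range viewing
        FLAT = "flat"               # second mode, for distant viewing

    DISTANCE_THRESHOLD_M = 0.5      # assumed value for the "specified distance threshold"

    def select_mode(distance_m: float) -> GlassesMode:
        """Map the detected viewing distance to a glasses mode."""
        if distance_m < DISTANCE_THRESHOLD_M:
            return GlassesMode.PRESBYOPIC
        return GlassesMode.FLAT

    def control_glasses(distance_m: float, set_lens_mode) -> GlassesMode:
        """Switch the glasses to the selected mode via a hardware callback."""
        mode = select_mode(distance_m)
        set_lens_mode(mode)          # placeholder for the lens reconfiguration call
        return mode

For example, control_glasses(0.3, set_lens_mode=print) would select the presbyopic mode, while control_glasses(2.0, set_lens_mode=print) would select the flat mode.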
It is understood that the number of modes that the smart glasses have is not limited in this disclosure, and in possible embodiments, the smart glasses may have three modes. Also, the type of mode that the smart glasses have is not limited in this disclosure, for example, the smart glasses may have a near vision mode, a blue light prevention mode, and the like. Further, the first mode and the second mode in the present disclosure are merely examples, and do not serve as a basis for limiting the present disclosure.
In an exemplary embodiment of the present disclosure, during the use of the smart glasses, the distance between the object viewed by the user's eyes and the mirror surface of the smart glasses is detected, and the smart glasses are controlled to switch to the first mode or the second mode depending on that distance. With the present disclosure, different glasses modes can be switched during use without any manual operation to change the mode, which improves the intelligence of the smart glasses and enhances the user experience.
Fig. 2 is a flowchart illustrating a method for controlling smart glasses according to an exemplary embodiment, where as shown in fig. 2, the smart glasses have at least two modes, and the at least two modes include a first mode and a second mode, and a camera is built in the smart glasses, and the method for controlling smart glasses includes the following steps.
In step S21, during the use of the smart glasses, the distance between the object viewed by the user's eyes and the mirror surface of the smart glasses is detected.
In step S22, the smart glasses are controlled to switch to the first mode or the second mode according to the distance.
In step S23, upon determining that the specified content is contained in the object viewed by the user, the specified content contained in the object is photographed.
The specified content in the present disclosure may be text content on a newspaper or content specified in an application. The content specified in the application program may be, for example, shopping information, web search information, or the like.
In the present disclosure, the object viewed by the user is photographed using the camera, and when it is determined that the object contains the specified content, the specified content contained in the object is photographed.
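For illustration, this detect-then-capture step might look like the sketch below; capture_frame() and contains_specified_content() (for example an OCR or object-detection check) are hypothetical helpers, not APIs defined by this disclosure.

    def capture_frame():
        """Placeholder: grab a frame from the camera built into the smart glasses."""
        raise NotImplementedError

    def contains_specified_content(frame, content_type: str) -> bool:
        """Placeholder: detect text, shopping information or search content in the frame."""
        raise NotImplementedError

    def photograph_if_specified(content_type: str = "text"):
        """Photograph the viewed object only when it contains the specified content."""
        frame = capture_frame()
        if contains_specified_content(frame, content_type):
            return frame    # retained for later processing in linkage with the terminal
        return None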
In step S24, a connection is established with the terminal, and the terminal is linked to process the designated content.
According to the present disclosure, when it is determined that the object viewed by the user contains the specified content, the specified content is photographed, a connection with the terminal is triggered, and the specified content is processed in linkage with the terminal. The smart glasses may be connected with the terminal through an installed Bluetooth module or Wi-Fi module.
For example, taking the case where the terminal connected with the smart glasses is a sound box or a hearing aid, the smart glasses connect to the sound box or the hearing aid through the installed Bluetooth or wireless module, and the text content of a newspaper is processed in linkage with the sound box or the hearing aid; for example, the specified content may be sent to the sound box or the hearing aid and broadcast there.
For another example, taking the case where the terminal connected with the smart glasses is a mobile phone, when it is determined that the specified content in the object viewed by the user is shopping information or network search content, that shopping information or network search content is pushed to the mobile phone, and the mobile phone retrieves and displays push information related to it. For example, the push information retrieved and presented by the mobile phone for shopping information may be commodity purchase information, and the push information retrieved and presented for network search content may be network query information.
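The two linkage examples above might be organized as in the following sketch; the transport function and the message format are assumptions, since the disclosure only states that a Bluetooth or Wi-Fi connection is used.

    def send_over_link(terminal_id: str, payload: dict) -> None:
        """Placeholder: send data over the established Bluetooth / Wi-Fi connection."""
        raise NotImplementedError

    def process_with_terminal(terminal_id: str, terminal_type: str, specified_content: str) -> None:
        """Dispatch the specified content according to the type of linked terminal."""
        if terminal_type in ("sound_box", "hearing_aid"):
            # Sound box / hearing aid: the text content is broadcast as audio.
            send_over_link(terminal_id, {"action": "broadcast", "text": specified_content})
        elif terminal_type == "phone":
            # Mobile phone: the content is pushed; the phone retrieves and displays
            # related push information (commodity purchase, social or query results).
            send_over_link(terminal_id, {"action": "push", "content": specified_content})
        else:
            raise ValueError(f"unsupported terminal type: {terminal_type}")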
In an exemplary embodiment of the present disclosure, when it is determined that the object viewed by the user contains the specified content, the specified content contained in the object is photographed through the smart glasses, the smart glasses are connected to the terminal, and the specified content is processed in linkage with the terminal. With the present disclosure, the smart glasses can be linked with a terminal, which broadens the ways the smart glasses can be used and enhances the user experience.
Based on the same inventive concept, the disclosure also provides an intelligent glasses control device.
It is understood that, in order to implement the above functions, the smart glasses control device provided in the embodiments of the present disclosure includes hardware structures and/or software modules corresponding to the respective functions. Combined with the exemplary units and algorithm steps disclosed in the embodiments, the disclosed embodiments can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
Fig. 3 is a block diagram illustrating a smart glasses control device 100 according to an exemplary embodiment. Referring to fig. 3, the smart glasses have at least two modes including a first mode and a second mode, and the smart glasses control device includes a detection unit 101 and a switching unit 102.
The detection unit 101 is configured to detect a distance between an object viewed by the eyes of the user and the mirror surface of the smart glasses during the use of the smart glasses;
the switching unit 102 is configured to control the smart glasses to switch to the first mode or the second mode according to the distance.
In an example, the switching unit 102 controls the smart glasses to switch to the first mode or the second mode according to the distance as follows: when the distance is smaller than a specified distance threshold value, switching to a first mode; and when the distance is greater than the specified distance threshold, switching to a second mode.
In one example, the first mode is a presbyopic mode and the second mode is a flat mirror mode.
In one example, a camera is built into the smart glasses, and the smart glasses control device further includes: a photographing unit 103 configured to photograph the specified content contained in the object when it is determined that the object viewed by the user contains the specified content; and a linkage unit 104 configured to establish a connection between the smart glasses and the terminal and to process the specified content in linkage with the terminal.
In one example, the terminal is a sound box or a hearing aid; the linkage unit 104 links the terminal to process the designated content in the following manner: and sending the specified content to the sound box or the hearing aid, and broadcasting the specified content in the sound box or the hearing aid.
In one example, the linking unit 104 links the terminal to process the designated content as follows: and pushing the specified content to the terminal, and retrieving and displaying pushing information related to the specified content by the terminal.
In one example, the detection unit 101 detects an object viewed by the user's eyes as follows: and when the parallel angle between the mirror surface of the intelligent glasses and the object reaches a specified angle threshold value, determining that the user is watching the object.
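Tying the units of Fig. 3 together, one possible arrangement is sketched below; it reuses the helpers sketched earlier, and its interfaces are illustrative assumptions rather than the required structure of the device.

    class SmartGlassesController:
        """Hypothetical composition of the detection, switching, photographing and linkage units."""

        def __init__(self, set_lens_mode, terminal_id=None, terminal_type=None):
            self.set_lens_mode = set_lens_mode
            self.terminal_id = terminal_id
            self.terminal_type = terminal_type

        def tick(self) -> None:
            """One control cycle: detect, switch mode, photograph and link with the terminal."""
            distance = get_viewing_distance()                 # detection unit 101
            if distance is None:
                return
            control_glasses(distance, self.set_lens_mode)     # switching unit 102
            photo = photograph_if_specified()                 # photographing unit 103
            if photo is not None and self.terminal_id:
                # Extraction of the specified content from the photo (e.g. OCR) is
                # elided; a plain string stands in for it here.
                process_with_terminal(self.terminal_id,       # linkage unit 104
                                      self.terminal_type, specified_content="...")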
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 4 is a block diagram illustrating an apparatus 400 for a terminal in conjunction with smart glasses according to an example embodiment. For example, the apparatus 400 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 4, the apparatus 400 may include one or more of the following components: processing components 402, memory 404, power components 406, multimedia components 408, audio components 410, input/output (I/O) interfaces 412, sensor components 414, and communication components 416.
The processing component 402 generally controls overall operation of the apparatus 400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 402 may include one or more processors 420 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 402 can include one or more modules that facilitate interaction between the processing component 402 and other components. For example, the processing component 402 can include a multimedia module to facilitate interaction between the multimedia component 408 and the processing component 402.
The memory 404 is configured to store various types of data to support operations at the device 400. Examples of such data include instructions for any application or method operating on the device 400, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 404 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power supply components 406 provide power to the various components of device 400. The power components 406 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power supplies for the apparatus 400.
The multimedia component 408 includes a screen that provides an output interface between the device 400 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 408 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 400 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 410 is configured to output and/or input audio signals. For example, audio component 410 includes a Microphone (MIC) configured to receive external audio signals when apparatus 400 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 404 or transmitted via the communication component 416. In some embodiments, audio component 410 also includes a speaker for outputting audio signals.
The I/O interface 412 provides an interface between the processing component 402 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 414 includes one or more sensors for providing various aspects of status assessment for the apparatus 400. For example, the sensor component 414 can detect the open/closed state of the device 400 and the relative positioning of components, such as the display and keypad of the apparatus 400; it can also detect a change in the position of the apparatus 400 or of a component of the apparatus 400, the presence or absence of user contact with the apparatus 400, the orientation or acceleration/deceleration of the apparatus 400, and a change in the temperature of the apparatus 400. The sensor component 414 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. It may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 416 is configured to facilitate wired or wireless communication between the apparatus 400 and other devices. The apparatus 400 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 416 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 416 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 400 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 404 comprising instructions, executable by the processor 420 of the apparatus 400 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is further understood that the use of "a plurality" in this disclosure means two or more, as other terms are analogous. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. A smart eyewear control method, wherein the smart eyewear has at least two modes, including a first mode and a second mode, the method comprising:
in the process of using the intelligent glasses, detecting the distance between an object viewed by the eyes of a user and the mirror surface of the intelligent glasses;
and controlling the intelligent glasses to be switched to a first mode or a second mode according to the distance.
2. The smart glasses control method according to claim 1, wherein controlling the smart glasses to switch to a first mode or a second mode according to the distance comprises:
when the distance is smaller than a specified distance threshold value, switching to a first mode;
switching to a second mode when the distance is greater than a specified distance threshold.
3. The smart eyewear control method of claim 1 or claim 2, wherein the first mode is a presbyopic mode and the second mode is a flat mode.
4. The method of claim 1, wherein a camera is built in the smart glasses, and the smart glasses control method further comprises:
when it is determined that an object viewed by a user contains specified content, shooting the specified content contained in the object;
and establishing connection with a terminal, and processing the specified content in a linkage manner by the terminal.
5. The method of claim 4, wherein the terminal is a sound box or a hearing aid;
and processing the specified content in linkage with the terminal comprises:
and sending the specified content to the sound box or the hearing aid, and broadcasting the specified content in the sound box or the hearing aid.
6. The method of claim 4, wherein processing the specified content by the linked terminal comprises:
and pushing the specified content to the terminal, and retrieving and displaying pushing information related to the specified content by the terminal.
7. The method of claim 1, wherein detecting an object viewed by the user's eyes comprises:
determining that a user is viewing an object when it is determined that a parallel angle between a mirror surface of the smart glasses and the object reaches a specified angle threshold.
8. An apparatus for controlling smart glasses, wherein the smart glasses have at least two modes, including a first mode and a second mode, the apparatus comprising:
the detection unit is configured to detect the distance between an object viewed by the eyes of a user and the mirror surface of the intelligent glasses in the process of using the intelligent glasses;
the switching unit is configured to control the intelligent glasses to be switched to a first mode or a second mode according to the distance.
9. The smart eyewear control device of claim 8, wherein the switching unit controls the smart eyewear to switch to the first mode or the second mode depending on the distance in the following manner:
when the distance is smaller than a specified distance threshold value, switching to a first mode;
switching to a second mode when the distance is greater than a specified distance threshold.
10. The smart eyewear control device of claim 8 or 9, wherein the first mode is a presbyopic mode and the second mode is a flat mode.
11. The apparatus of claim 8, wherein the smart glasses have a camera built therein, and the smart glasses control apparatus further comprises:
a photographing unit configured to photograph a specified content contained in an object viewed by a user when it is determined that the specified content is contained in the object;
and the linkage unit is configured to establish connection between the intelligent glasses and a terminal and process the specified content in linkage with the terminal.
12. The device of claim 11, wherein the terminal is a sound box or a hearing aid;
and the linkage unit processes the specified content in linkage with the terminal as follows:
and sending the specified content to the sound box or the hearing aid, and broadcasting the specified content in the sound box or the hearing aid.
13. The apparatus according to claim 11, wherein the linkage unit is configured to process the designated content by linking the terminal in the following manner:
and pushing the specified content to the terminal, and retrieving and displaying pushing information related to the specified content by the terminal.
14. The apparatus according to claim 8, wherein the detection unit detects the object viewed by the user's eyes by:
determining that a user is viewing an object when it is determined that a parallel angle between a mirror surface of the smart glasses and the object reaches a specified angle threshold.
15. An intelligent glasses control device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the smart eyewear control method of any one of claims 1-7.
16. A non-transitory computer readable storage medium storing computer executable instructions which, when executed by a processor, perform the steps of the smart eyewear control method of any one of claims 1-7.
CN202010015056.XA 2020-01-07 2020-01-07 Intelligent glasses control method, intelligent glasses control device and storage medium Pending CN111274870A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010015056.XA CN111274870A (en) 2020-01-07 2020-01-07 Intelligent glasses control method, intelligent glasses control device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010015056.XA CN111274870A (en) 2020-01-07 2020-01-07 Intelligent glasses control method, intelligent glasses control device and storage medium

Publications (1)

Publication Number Publication Date
CN111274870A true CN111274870A (en) 2020-06-12

Family

ID=71000103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010015056.XA Pending CN111274870A (en) 2020-01-07 2020-01-07 Intelligent glasses control method, intelligent glasses control device and storage medium

Country Status (1)

Country Link
CN (1) CN111274870A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103838372A (en) * 2013-11-22 2014-06-04 北京智谷睿拓技术服务有限公司 Intelligent function start/stop method and system for intelligent glasses
CN106502422A (en) * 2016-11-23 2017-03-15 上海擎感智能科技有限公司 Intelligent glasses and its control method, control device
CN106896531A (en) * 2017-04-17 2017-06-27 云南中科物联网科技有限公司 A kind of automatic pre- myopic-preventing control method and intelligent glasses based on Internet of Things
CN107402632A (en) * 2017-07-12 2017-11-28 青岛海信移动通信技术股份有限公司 Switching shows the method and intelligent glasses of augmented reality image and virtual reality image


Similar Documents

Publication Publication Date Title
EP3096209B1 (en) Method and device for recognizing object
CN106527682B (en) Method and device for switching environment pictures
KR20150131815A (en) Mobile terminal and controlling method thereof
EP3048508A1 (en) Methods, apparatuses and devices for transmitting data
US20180139790A1 (en) Methods, apparatuses and storage medium for controlling a wireless connection
CN111611034A (en) Screen display adjusting method and device and storage medium
CN107797662B (en) Viewing angle control method and device and electronic equipment
CN112217990A (en) Task scheduling method, task scheduling device, and storage medium
EP3438924B1 (en) Method and device for processing picture
CN112331158B (en) Terminal display adjusting method, device, equipment and storage medium
CN112667074A (en) Display method, display device and storage medium
CN111225111A (en) Function control method, function control device, and storage medium
CN109255839B (en) Scene adjustment method and device
CN112423092A (en) Video recording method and video recording device
EP3599763A2 (en) Method and apparatus for controlling image display
CN107948876B (en) Method, device and medium for controlling sound box equipment
CN106951171B (en) Control method and device of virtual reality helmet
CN115576417A (en) Interaction control method, device and equipment based on image recognition
CN113315904B (en) Shooting method, shooting device and storage medium
EP3961363A1 (en) Number input method, apparatus, and storage medium
EP3905660A1 (en) Method and device for shooting image, and storage medium
CN111274870A (en) Intelligent glasses control method, intelligent glasses control device and storage medium
CN112148149A (en) Touch screen control method, touch screen control device and storage medium
CN112130787A (en) Electronic equipment, display signal transmission system, method and device
CN111314232A (en) Application acceleration method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination