CN115033097A - Control method and device of intelligent glasses and intelligent glasses - Google Patents

Control method and device of intelligent glasses and intelligent glasses

Info

Publication number
CN115033097A
CN115033097A
Authority
CN
China
Prior art keywords
user
display screen
virtual cursor
head
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210474967.8A
Other languages
Chinese (zh)
Inventor
李剑文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yingmu Technology Co ltd
Original Assignee
Shenzhen Yingmu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yingmu Technology Co ltd filed Critical Shenzhen Yingmu Technology Co ltd
Priority to CN202210474967.8A priority Critical patent/CN115033097A/en
Publication of CN115033097A publication Critical patent/CN115033097A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the field of wearable devices, and in particular to a control method and apparatus for smart glasses, and to smart glasses. A smart-glasses control method includes: when a head action of a user is detected, determining a movement distance for a virtual cursor on a display screen of the smart glasses according to the head action of the user; moving the virtual cursor within the display screen according to the movement distance; and, after determining that the virtual cursor has moved to a target position, controlling the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action. In the embodiments of the invention, a sensor carried by the smart glasses senses the user's head action, so that the user can move the virtual cursor on the display screen to a specified position by head action and then operate the cursor through customized preset actions, thereby interacting with the smart glasses.

Description

Control method and device of intelligent glasses and intelligent glasses
[ technical field ]
The application relates to the field of wearable devices, and in particular to a control method and apparatus for smart glasses, and to smart glasses.
[ background ]
As a new type of intelligent wearable terminal, smart glasses differ greatly in form from traditional smart devices. A traditional smart device has a display touch screen, so interaction with interface elements is conveniently achieved by finger touch. Smart glasses, being a head-worn device, display differently: typically an optical element forms a virtual display screen presented to the user by reflection or diffraction, which means smart glasses cannot carry the physical display touch screen that traditional smart devices have. How to interact with the interface elements on smart glasses and control their functions is therefore a major challenge for the development of smart glasses.
[ summary of the invention ]
In view of this, embodiments of the present invention provide a control method and apparatus for smart glasses, and smart glasses, so as to implement control of the smart glasses without a physical touch screen.
In a first aspect, an embodiment of the present invention provides a method for controlling smart glasses, where the method includes:
when a head action of a user is detected, determining a movement distance for a virtual cursor on a display screen of the smart glasses according to the head action of the user;
moving the virtual cursor within the display screen according to the movement distance;
and, after determining that the virtual cursor has moved to a target position, controlling the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action.
Optionally, before the determining of the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action of the user, the method includes:
determining the resolution of the display screen and the maximum angle through which the head of the user can rotate;
determining, according to the resolution of the display screen and the maximum rotation angle of the user's head, the unit pixel distance that the virtual cursor moves in the display screen when the user's head rotates by a unit angle;
wherein the unit pixel distance is used for determining the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action of the user.
Optionally, the determining of the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action of the user includes:
when the head action of the user is detected, receiving head data of the user collected by a sensor carried on the smart glasses;
determining the rotation angle of the user's head according to the head data;
determining the pixel distance that the virtual cursor needs to move on the display screen according to the rotation angle of the user's head and the pre-calculated unit pixel distance that the virtual cursor moves for the user's head action;
and determining the movement distance of the virtual cursor on the display screen of the smart glasses according to that pixel distance.
Optionally, the controlling of the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action includes:
presetting a preset action and an operation instruction corresponding to the preset action;
and when the user is recognized as making the preset action, controlling the virtual cursor to execute the operation instruction corresponding to the preset action.
Optionally, the preset action includes: gesture actions, facial actions, and head actions.
Optionally, the control method of the smart glasses further includes:
establishing a communication connection between the smart glasses and a first terminal device;
and controlling, through the first terminal device, the virtual cursor to execute a corresponding operation instruction on the content on the display screen.
In a second aspect, an embodiment of the present invention provides a control apparatus for smart glasses, including:
a determining module, configured to determine, when a head action of a user is detected, a movement distance for a virtual cursor on a display screen of the smart glasses according to the head action of the user;
a moving module, configured to move the virtual cursor within the display screen according to the movement distance;
and a control module, configured to control, after determining that the virtual cursor has moved to a target position, the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action.
Optionally, the determining module is further configured to:
determine the resolution of the display screen and the maximum angle through which the head of the user can rotate;
and determine, according to the resolution of the display screen and the maximum rotation angle of the user's head, the unit pixel distance that the virtual cursor moves in the display screen when the user's head rotates by a unit angle.
In a third aspect, an embodiment of the present invention provides a control device for smart glasses, including:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, and the processor, when invoking the program instructions, is capable of performing the method of any one of the first aspects.
In a fourth aspect, an embodiment of the present invention provides smart glasses configured to perform the control method of any one of the first aspects and including the control apparatus of the second aspect and the control device of the third aspect.
According to the method, the user's head action is sensed by a sensor carried on the smart glasses, and the user's head movement is mapped into the display screen of the smart glasses and reflected as the movement distance of the virtual cursor, so that without a touch screen the user can move the virtual cursor on the display screen to a specified position by head action and then operate the cursor through customized preset actions, thereby realizing interaction between the user and the smart glasses.
[ description of the drawings ]
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a control method for smart glasses according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a control device of smart glasses according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
[ detailed description ]
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the embodiments of the invention, the user's head action is sensed by a sensor carried on the smart glasses, the virtual cursor on the display screen is moved to a specified position according to the head action, and the cursor is operated through customized preset actions, thereby realizing interaction with the smart glasses.
An embodiment of the invention provides a control method for smart glasses, used to operate and control the smart glasses without a physical display touch screen. As shown in fig. 1, the method includes the following steps:
Step 101: determine the unit pixel distance that the virtual cursor moves in the display screen when the user's head rotates by a unit angle.
Because actual conditions differ from user to user, a unified standard cannot be set for the head actions of different users. When a user uses the smart glasses for the first time, the movement distance of the virtual cursor on the display screen therefore needs to be set individually according to the actual range of the user's head rotation.
Specifically, the resolution of the display screen presented by the smart glasses is first determined, together with the maximum angle through which the user's head can rotate in each direction, i.e., in pitch and in yaw.
The number of pixel blocks the display screen has along the x-axis and along the y-axis can be determined from its resolution. Dividing the number of pixel blocks along the x-axis and the y-axis by the maximum rotation angle of the user's head in yaw and in pitch, respectively, gives the unit pixel distance that the virtual cursor moves when the user's head rotates by a unit angle. Generally the unit angle of head rotation is 1°; in practical applications the granularity of the unit angle can be modified according to the user's actual needs, so as to change the precision of the virtual cursor's movement.
The pitch angle of the user's head rotation is the angle between the head and the horizontal plane; it measures the ability and range of up-and-down head movement and corresponds to the y-axis of the display screen's two-dimensional coordinates. The yaw angle is the angle between the head and the vertical plane; it measures the ability and range of left-and-right head movement and corresponds to the x-axis of the display screen's two-dimensional coordinates.
In a specific embodiment, the range of motion of the user's head in yaw is 100° and the x-axis of the smart glasses' display screen has 5,000,000 pixel blocks; division shows that the unit pixel distance for left-and-right head movement is 50,000. That is, each time the user's head turns by 1°, the virtual cursor on the display screen moves 50,000 pixel blocks toward the corresponding position.
In some embodiments, the resolution of the smart glasses differs between modes. When the smart glasses are set to an energy-saving mode, the resolution of the presented display screen may decrease, the number of displayed pixel blocks decreases, and the unit pixel distance moved by the virtual cursor decreases correspondingly; when the smart glasses are set to a high-performance mode, the resolution may increase, the number of displayed pixel blocks increases, and the unit pixel distance increases correspondingly.
Therefore, when determining the unit pixel distance for the user's head movement, the unit pixel distance at each resolution needs to be calculated for the different modes of the smart glasses.
In the embodiment of the invention, the unit pixel distance moved by the virtual cursor for the user's head action only needs to be calculated the first time the user uses the smart glasses; the result is stored and need not be recalculated each time the glasses are worn.
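The calibration above amounts to a resolution-over-angle division per axis and per mode. A minimal Python sketch, assuming hypothetical resolutions and head ranges (the patent fixes none of these values or names):

```python
# Sketch of the step-101 calibration; all numbers and names are illustrative.

def unit_pixel_distance(resolution_px: int, max_angle_deg: float,
                        unit_angle_deg: float = 1.0) -> float:
    """Pixel blocks the cursor moves per unit angle of head rotation."""
    return resolution_px / max_angle_deg * unit_angle_deg

# Assumed per-user head ranges, measured once on first use.
MAX_YAW_DEG, MAX_PITCH_DEG = 100.0, 60.0

# Assumed per-mode resolutions (x, y), since resolution differs by mode.
MODES = {"energy_saving": (1280, 720), "high_performance": (1920, 1080)}

# Computed once on first use, then stored and reused on later wearings.
calibration = {
    mode: (unit_pixel_distance(x, MAX_YAW_DEG),
           unit_pixel_distance(y, MAX_PITCH_DEG))
    for mode, (x, y) in MODES.items()
}
```

With the figures from the example above, unit_pixel_distance(5_000_000, 100.0) returns 50,000 pixel blocks per degree of yaw.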
Step 102: when a head action of the user is detected, determine the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action.
When a sensor carried by the smart glasses detects a head action of the user, the processing unit receives the head data collected and sent by the sensor and determines from it the direction and angle of the head rotation. The pixel distance that the virtual cursor needs to move on the display screen is then determined from the pre-calculated unit pixel distance and from the direction and angle of the head rotation.
In one embodiment, the unit pixel distance moved by the virtual cursor for head motion in yaw is 50,000. When the head data collected by the sensor show that the user has turned 20° to the right, the virtual cursor on the display screen correspondingly moves a distance of 1,000,000 pixel blocks along the positive x-axis.
The sensors carried by the smart glasses include a gyroscope sensor and a geomagnetic sensor. Fusing the data collected by the gyroscope sensor and the geomagnetic sensor yields more accurate head data.
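A sketch of this angle-to-pixels mapping under the same assumptions; the gyroscope/geomagnetic fusion is abstracted away, and the yaw and pitch deltas are assumed to come out of that fusion:

```python
# Sketch of step 102: map a head rotation (degrees) to a cursor move (pixels).

def cursor_delta(yaw_delta_deg: float, pitch_delta_deg: float,
                 px_per_deg_x: float, px_per_deg_y: float) -> tuple[int, int]:
    """Pixel displacement corresponding to a head rotation."""
    return (round(yaw_delta_deg * px_per_deg_x),
            round(pitch_delta_deg * px_per_deg_y))

def move_cursor(pos: tuple[int, int], delta: tuple[int, int],
                resolution: tuple[int, int]) -> tuple[int, int]:
    """Apply the displacement, clamping the cursor inside the screen."""
    x = min(max(pos[0] + delta[0], 0), resolution[0] - 1)
    y = min(max(pos[1] + delta[1], 0), resolution[1] - 1)
    return (x, y)

# Example from the text: 50,000 px/deg in yaw, head turns 20 deg to the right.
dx, dy = cursor_delta(20.0, 0.0, 50_000, 50_000)  # -> (1_000_000, 0)
```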
Step 103: move the virtual cursor within the display screen according to the movement distance.
Specifically, among the virtual contents shown on the display screen, only the virtual cursor follows the user's head movement; the other virtual contents remain relatively static, so that the user can steer the cursor onto them and operate on them.
Step 104: after determining that the virtual cursor has moved to the target position, control the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action.
After moving the virtual cursor to the target position, the user makes a preset action that the smart glasses can recognize, thereby issuing an operation instruction through the virtual cursor.
Specifically, the user may record in advance, through an auxiliary service provided by the system on the smart glasses, preset actions and the operation instructions corresponding to them. When the smart glasses recognize and capture a preset action of the user, they control the virtual cursor to execute the corresponding operation instruction; a minimal sketch of such a mapping follows below.
The preset actions the user can record include gesture actions, facial actions, and head actions; the operation instructions the virtual cursor can be controlled to execute include single click, double click, and slide.
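A minimal sketch of the action-to-instruction mapping, with illustrative action names and an assumed screen interface (the patent specifies neither):

```python
# Hypothetical dispatch table for step 104; keys and values are illustrative.
PRESET_ACTIONS = {
    "right_eye_triple_blink": "double_click",  # facial action
    "thumb_index_pinch":      "single_click",  # gesture action
    "quick_nod":              "slide",         # head action
}

def on_action(action: str, cursor_pos: tuple[int, int], screen) -> None:
    """Execute the operation instruction bound to a recognized preset action."""
    instruction = PRESET_ACTIONS.get(action)
    if instruction is not None:
        screen.execute(instruction, at=cursor_pos)  # assumed screen interface
```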
Hardware carried on the smart glasses, such as cameras and sensors, captures the preset actions the user makes: a front camera on the smart glasses can capture the user's facial actions, a rear camera can capture the user's gesture actions, and the sensors can capture the user's head actions.
In a specific embodiment, when the front camera of the smart glasses captures the user's right eye blinking three times in succession, the virtual cursor is controlled to execute a preset double-click operation on the content at its current position on the display screen; when the rear camera captures the user's right thumb and index finger bending at the same time, the virtual cursor can be controlled to execute a preset single-click operation on the content at its current position on the display screen.
In some embodiments, preset actions made by the user may be captured by other smart wearable devices the user wears. Specifically, because of the hardware limitations of the smart glasses, some preset actions made by the user, such as limb actions, cannot be recognized by the glasses themselves. Another smart wearable device worn by the user, such as a smart bracelet, collects limb actions such as arm swings and, after recognizing that the user has made the corresponding preset action, sends the information to the smart glasses, which control the virtual cursor to execute the corresponding operation instruction.
In the embodiment of the invention, the user's head action is sensed by the sensor carried on the smart glasses, and the user's head movement is mapped into the display screen of the smart glasses and reflected as the movement distance of the virtual cursor, so that the user can move the virtual cursor on the display screen to a specified position by head action and then operate the cursor through the customized preset actions, thereby realizing interaction with the smart glasses.
In some embodiments, the user may also control the smart glasses by binding a terminal device. Specifically, the user can download the application provided for the smart glasses onto the terminal device, establish a communication connection between the terminal device and the smart glasses over Bluetooth or a network, and issue the relevant operation instructions to the smart glasses from within the application, thereby controlling the smart glasses.
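A minimal sketch of the glasses side of such a binding, assuming a plain TCP transport and a one-line JSON command format (the patent names Bluetooth or a network but fixes no protocol; everything below is illustrative):

```python
import json
import socket

def serve_commands(host: str = "0.0.0.0", port: int = 9300):
    """Yield operation instructions sent by the bound terminal application."""
    with socket.create_server((host, port)) as srv:
        conn, _addr = srv.accept()
        with conn:
            for line in conn.makefile():
                yield json.loads(line)["op"]  # e.g. {"op": "single_click"}
```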
In some embodiments, virtual content in the display screen other than the virtual cursor can also be controlled to move. Specifically, when the user needs some content in the display screen to stay displayed, the corresponding content can be selected by controlling the virtual cursor, and through the corresponding preset action the selected content is made to move together with the user's head action. With this operation, other virtual content in the screen can, like the virtual cursor, be moved under the control of the user's head action.
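A sketch of this pinning behavior under the same assumptions (the element objects, their move_by method, and the pin action are all illustrative):

```python
# Pinned elements move with the head, like the cursor; the rest stay fixed.
head_locked: set = set()

def toggle_pin(element) -> None:
    """Preset 'pin' action: toggle whether an element follows head motion."""
    if element in head_locked:
        head_locked.discard(element)
    else:
        head_locked.add(element)

def apply_head_delta(delta, cursor, elements) -> None:
    """Move the cursor, and any pinned content, by the head-motion delta."""
    cursor.move_by(delta)        # the cursor always follows the head
    for el in elements:
        if el in head_locked:
            el.move_by(delta)    # pinned content follows the head too
```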
Corresponding to the above control method for smart glasses, an embodiment of the invention further provides a control apparatus for smart glasses. Referring to fig. 2, the control apparatus may include: a determining module 201, a moving module 202, and a control module 203.
The determining module 201 is configured to determine, when a head action of a user is detected, the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action of the user;
the moving module 202 is configured to move the virtual cursor within the display screen according to the movement distance;
and the control module 203 is configured to control, after determining that the virtual cursor has moved to the target position, the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action.
The control apparatus of the smart glasses provided in the embodiment shown in fig. 2 may be used to implement the technical solutions of the method embodiments of this specification; for its implementation principles and technical effects, refer to the related descriptions in the method embodiments.
Fig. 3 is a schematic structural diagram of an embodiment of an electronic device of this specification; the electronic device may be the control device of the smart glasses. As shown in fig. 3, the electronic device may include at least one processor and at least one memory communicatively coupled to the processor, wherein the memory stores program instructions executable by the processor, and the processor calls the program instructions to execute the control method of the smart glasses provided by the embodiments above.
The electronic device may be a device capable of intelligent interaction with the user, such as a cloud server; the embodiments of this specification do not limit its specific form. It is understood that the electronic device is the machine mentioned in the method embodiments.
FIG. 3 illustrates a block diagram of an exemplary electronic device suitable for use in implementing embodiments of the present specification. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present specification.
As shown in fig. 3, the electronic device is in the form of a general purpose computing device. Components of the electronic device may include, but are not limited to: one or more processors 310, a communication interface 320, a memory 330, and a communication bus 340 that couples various system components including the memory 330, the communication interface 320, and the processors 310.
Communication bus 340 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic devices typically include a variety of computer system readable media. Such media may be any available media that is accessible by the electronic device and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 330 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) and/or cache memory. The electronic device may further include other removable/non-removable, volatile/nonvolatile computer system storage media. Memory 330 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of this specification.
A program/utility having a set (at least one) of program modules may be stored in memory 330, such program modules including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may include an implementation of a network environment. The program modules generally perform the functions and/or methodologies of the embodiments described herein.
The processor 310 executes various functional applications and data processing by executing programs stored in the memory 330, for example, implementing a control method of the smart glasses provided by the embodiments shown in this specification.
The embodiment of the present specification provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute a method for controlling smart glasses provided by the embodiment shown in the present specification.
The non-transitory computer readable storage medium described above may take any combination of one or more computer readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a flash Memory, an optical fiber, a portable compact disc Read Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of this specification may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The embodiment of the invention further provides smart glasses. The smart glasses are used to execute the above control method, helping the user control the smart glasses without a touch screen.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of this specification, "a plurality" means at least two, e.g., two, three, etc., unless explicitly defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present description in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present description.
The word "if" as used herein may be interpreted as "at … …" or "when … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should be noted that the terminal referred to in the embodiments of this specification may include, but is not limited to, a personal computer (PC), a personal digital assistant (PDA), a wireless handheld device, a tablet computer, a mobile phone, an MP3 player, an MP4 player, and the like.
In the embodiments provided in the present specification, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present description may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the methods described in the embodiments of the present disclosure.
The above description is only a preferred embodiment of the present disclosure, and should not be taken as limiting the present disclosure, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (10)

1. A method of controlling smart glasses, the method comprising:
when a head action of a user is detected, determining a movement distance for a virtual cursor on a display screen of the smart glasses according to the head action of the user;
moving the virtual cursor within the display screen according to the movement distance;
and, after determining that the virtual cursor has moved to a target position, controlling the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action.
2. The method according to claim 1, wherein before the determining of the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action of the user, the method comprises:
determining the resolution of the display screen and the maximum angle through which the head of the user can rotate;
determining, according to the resolution of the display screen and the maximum rotation angle of the user's head, the unit pixel distance that the virtual cursor moves in the display screen when the user's head rotates by a unit angle;
wherein the unit pixel distance is used for determining the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action of the user.
3. The method according to claim 2, wherein the determining of the movement distance of the virtual cursor on the display screen of the smart glasses according to the head action of the user comprises:
when the head action of the user is detected, receiving head data of the user collected by a sensor carried on the smart glasses;
determining the rotation angle of the user's head according to the head data;
determining the pixel distance that the virtual cursor needs to move on the display screen according to the rotation angle of the user's head and the pre-calculated unit pixel distance that the virtual cursor moves for the user's head action;
and determining the movement distance of the virtual cursor on the display screen of the smart glasses according to that pixel distance.
4. The method according to claim 1, wherein the controlling of the virtual cursor to execute the corresponding operation instruction on the content on the display screen according to the preset action comprises:
presetting a preset action and an operation instruction corresponding to the preset action;
and when the user is recognized as making the preset action, controlling the virtual cursor to execute the operation instruction corresponding to the preset action.
5. The method of claim 4, wherein the preset action comprises: gesture actions, facial actions, and head actions.
6. The method according to claim 1, wherein the method of controlling the smart glasses further comprises:
establishing a communication connection between the smart glasses and a first terminal device;
and controlling, through the first terminal device, the virtual cursor to execute a corresponding operation instruction on the content on the display screen.
7. A control apparatus for smart glasses, comprising:
a determining module, configured to determine, when a head action of a user is detected, a movement distance for a virtual cursor on a display screen of the smart glasses according to the head action of the user;
a moving module, configured to move the virtual cursor within the display screen according to the movement distance;
and a control module, configured to control, after determining that the virtual cursor has moved to a target position, the virtual cursor to execute a corresponding operation instruction on the content on the display screen according to a preset action.
8. The control apparatus of smart glasses according to claim 7, wherein the determining module is further configured to:
determine the resolution of the display screen and the maximum angle through which the head of the user can rotate;
and determine, according to the resolution of the display screen and the maximum rotation angle of the user's head, the unit pixel distance that the virtual cursor moves in the display screen when the user's head rotates by a unit angle.
9. A control device of smart glasses, comprising:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, and the processor invokes the program instructions to perform the method of any one of claims 1 to 6.
10. Smart glasses, configured to perform the control method of smart glasses according to any one of claims 1 to 6, and comprising the control apparatus of smart glasses according to any one of claims 7 to 8 and the control device of smart glasses according to claim 9.
CN202210474967.8A 2022-04-29 2022-04-29 Control method and device of intelligent glasses and intelligent glasses Pending CN115033097A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210474967.8A CN115033097A (en) 2022-04-29 2022-04-29 Control method and device of intelligent glasses and intelligent glasses

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210474967.8A CN115033097A (en) 2022-04-29 2022-04-29 Control method and device of intelligent glasses and intelligent glasses

Publications (1)

Publication Number Publication Date
CN115033097A true CN115033097A (en) 2022-09-09

Family

ID=83119594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210474967.8A Pending CN115033097A (en) 2022-04-29 2022-04-29 Control method and device of intelligent glasses and intelligent glasses

Country Status (1)

Country Link
CN (1) CN115033097A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115686203A (en) * 2022-10-17 2023-02-03 北京多屏未来科技有限公司 Head-moving interaction method, device, equipment and storage medium of intelligent glasses terminal
CN115795421A (en) * 2022-10-24 2023-03-14 北京多屏未来科技有限公司 Method, device, equipment and storage medium for unlocking screen of intelligent glasses terminal


Similar Documents

Publication Publication Date Title
US11983326B2 (en) Hand gesture input for wearable system
US10289214B2 (en) Method and device of controlling virtual mouse and head-mounted displaying device
JP2022540315A (en) Virtual User Interface Using Peripheral Devices in Artificial Reality Environment
CN115033097A (en) Control method and device of intelligent glasses and intelligent glasses
US20110148755A1 (en) User interface apparatus and user interfacing method based on wearable computing environment
JP7382994B2 (en) Tracking the position and orientation of virtual controllers in virtual reality systems
KR20140100547A (en) Full 3d interaction on mobile devices
US20230093983A1 (en) Control method and device, terminal and storage medium
US20170344131A1 (en) Pointer-based gui for mobile devices
Osunkoya et al. Gesture-based human-computer-interaction using Kinect for windows mouse control and powerpoint presentation
CN106598422B (en) hybrid control method, control system and electronic equipment
CN106569716B (en) Single-hand control method and control system
WO2021004413A1 (en) Handheld input device and blanking control method and apparatus for indication icon of handheld input device
Lang et al. A multimodal smartwatch-based interaction concept for immersive environments
US10739866B2 (en) Using a wearable device to control characteristics of a digital pen
US10379639B2 (en) Single-hand, full-screen interaction on a mobile device
CN112181551A (en) Information processing method and related equipment
CN110908568B (en) Control method and device for virtual object
KR20180058097A (en) Electronic device for displaying image and method for controlling thereof
US10082936B1 (en) Handedness determinations for electronic devices
CN113457117B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN113457144B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
US10191553B2 (en) User interaction with information handling systems using physical objects
CN110162251B (en) Image scaling method and device, storage medium and electronic equipment
US20180253213A1 (en) Intelligent Interaction Method, Device, and System

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination