CN117930970A - Electronic device control method, apparatus, device and medium - Google Patents

Electronic device control method, apparatus, device and medium

Info

Publication number
CN117930970A
CN117930970A (application CN202311852082.8A)
Authority
CN
China
Prior art keywords
control
screen
instruction
eye
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311852082.8A
Other languages
Chinese (zh)
Inventor
曾凡进
叶嘉豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Bounds Inc
Original Assignee
Meta Bounds Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Bounds Inc
Priority to CN202311852082.8A
Publication of CN117930970A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application relates to the technical field of electronic devices, and provides an electronic device control method, apparatus, device, and storage medium. An operation target is determined by identifying the current instruction coordinates of the current user's eye gaze point in a screen coordinate system. The user's operation actions are identified through a motion sensor to determine the instruction type, so that the user can generate and issue operation instructions directly through the near-eye display device. This enables remote control of the controlled device, avoids multi-terminal operation of the controlled device, and improves the convenience of controlling electronic devices.

Description

Electronic device control method, apparatus, device and medium
Technical Field
The present application relates to the field of electronic devices, and in particular, to a method and apparatus for controlling an electronic device, a computer device, and a storage medium.
Background
Electronic devices currently on the market, including mobile phones, computers, tablets, and smart televisions, are generally controlled by finger touch, remote controls, mice, keyboards, gamepads, and the like. Apart from finger touch on smartphones, these control devices connect to the controlled device by wire or wirelessly; the control device then sends an instruction to the controlled device, which responds to the control instruction to complete the corresponding function.
However, existing electronic device control methods have drawbacks: some control devices (such as remote controls) are easily lost, while others (such as keyboards) are bulky and inconvenient to carry. Moreover, control of the electronic device cannot be freed from the hands, and some devices (smartphones, keyboards, and the like) occupy both hands.
Therefore, how to improve the operation convenience of the electronic device is a technical problem to be solved.
Disclosure of Invention
The application provides an electronic device control method, apparatus, and storage medium, aiming to improve the operation convenience of electronic devices.
In a first aspect, the present application provides an electronic device control method, including:
When a near-eye display device is in a wearing state and is in communication connection with a controlled device, projecting a screen display picture of the controlled device onto a projection display plane of the near-eye display device to obtain a screen coordinate system;
identifying the current instruction coordinates of the current user's eye gaze point in the screen coordinate system;
collecting the operation action of the current user and determining the instruction type;
and generating a control instruction based on the current instruction coordinates and the instruction type, and sending the control instruction to the controlled device, so as to control the controlled device to execute the operation action corresponding to the control instruction.
In a second aspect, the present application also provides an electronic device control apparatus, including:
A screen coordinate system acquisition module, configured to project a screen display picture of a controlled device onto a projection display plane of a near-eye display device when the near-eye display device is in a wearing state and is in communication connection with the controlled device, so as to obtain a screen coordinate system;
a current instruction coordinate recognition module, configured to identify the current instruction coordinates of the current user's eye gaze point in the screen coordinate system;
an instruction type determining module, configured to collect the operation action of the current user and determine the instruction type;
and a device control module, configured to generate a control instruction based on the current instruction coordinates and the instruction type, and send the control instruction to the controlled device, so as to control the controlled device to execute the operation action corresponding to the control instruction.
In a third aspect, the present application also provides a computer device comprising a processor, a memory, and a computer program stored on the memory and executable by the processor, wherein the computer program when executed by the processor implements the steps of the electronic device control method as described above.
In a fourth aspect, the present application also provides a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the electronic device control method as described above.
The application provides an electronic device control method, apparatus, device, and storage medium. The method includes: when a near-eye display device is in a wearing state and is in communication connection with a controlled device, projecting a screen display picture of the controlled device onto a projection display plane of the near-eye display device to obtain a screen coordinate system; identifying the current instruction coordinates of the current user's eye gaze point in the screen coordinate system; collecting the operation action of the current user and determining the instruction type; and generating a control instruction based on the current instruction coordinates and the instruction type, and sending the control instruction to the controlled device so as to control it to execute the corresponding operation action. By this method, the screen display picture of the controlled device is projected onto the projection display plane through the near-eye display device and a screen coordinate system is constructed, so that the user can view the display content of the controlled device directly through the near-eye display device; a picture mapping relation between the screen coordinate system and the controlled device is established, improving the operation accuracy of the eye-tracking-based control mode. By identifying the current instruction coordinates of the eye gaze point in the screen coordinate system, the operation target is determined according to the mapping relation between the screen coordinate system and the controlled device's picture, freeing the hands and improving the control convenience of the controlled device. The user's operation actions are identified through the motion sensor to determine the instruction type, so that the user can generate and issue operation instructions directly through the near-eye display device, realizing remote control of the controlled device, avoiding multi-terminal operation of the controlled device, and improving the control convenience of the electronic device.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a first embodiment of a control method of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of coordinate tracking of a human eye gaze point according to an embodiment of the present application;
Fig. 3 is a schematic flow chart of a second embodiment of a control method of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a third embodiment of a control method of an electronic device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a first embodiment of an electronic device control apparatus provided by the present application;
fig. 6 is a schematic block diagram of a computer device according to an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The flow diagrams depicted in the figures are merely illustrative and not necessarily all of the elements and operations/steps are included or performed in the order described. For example, some operations/steps may be further divided, combined, or partially combined, so that the order of actual execution may be changed according to actual situations.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a flowchart of a first embodiment of a control method for an electronic device according to an embodiment of the present application.
As shown in fig. 1, the electronic device control method includes steps S101 to S104.
S101, when a near-eye display device is in a wearing state and is in communication connection with a controlled device, projecting a screen display picture of the controlled device onto a projection display plane of the near-eye display device to obtain a screen coordinate system;
In one embodiment, a near-eye display (NED), also known as a head-mounted or wearable display, creates a virtual image in the field of view of one or both eyes. Near-eye display technology reconstructs a virtual scene in front of the eyes by rendering light field information to the eyes through a display device placed closer to the eyes than the normal clear-vision distance.
By way of example, the near-eye display device may include a wearable display device such as smart glasses (e.g., AR glasses, VR glasses, etc.), smart helmets, and the like.
In an embodiment, the controlled device may be a terminal device such as a mobile phone, a tablet computer, or a smart television.
In an embodiment, the near-eye display device and the controlled device may be connected in a communication manner by a data line, bluetooth, WIFI, or other communication technologies.
In an embodiment, the wearing state of the near-eye display device may be detected by a sensor device such as a pressure sensor and a temperature sensor, for example, the pressure sensor and/or the temperature sensor may be disposed at a nose pad or an ear pad of the smart glasses, and when the pressure sensor detects the continuous pressure or the temperature sensor detects the temperature of the human body, it may be determined that the near-eye display device is in the wearing state.
In an embodiment, the near-eye display device may also determine whether it is in a wearing state through iris recognition: when the iris information of the current user can be identified and the user's identity authentication passes, the near-eye display device can be considered to be in a wearing state and is started.
In an embodiment, the near-eye display device may be further manually started by a current user, and the near-eye display device may be determined to be in a wearing state when started.
In an embodiment, the wearing state of the near-eye display device may also be detected and determined by other means. When the near-eye display device is detected to be in the wearing state, it may search for nearby controlled devices available for pairing (the controlled device needs to have its connection authorization function and permission enabled, e.g. Bluetooth or WIFI); when a controlled device is found, the device to pair with may be actively selected, or a connection establishment request may be sent to the controlled device, and when the request is approved, the communication connection between the near-eye display device and the controlled device is established.
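To make the flow above concrete, here is a minimal Python sketch of the wear-detection and pairing logic; the thresholds, sensor readings, and radio calls (`scan_for_devices`, `request_connection`) are illustrative assumptions, not APIs from the application.

```python
# Hedged sketch of the wear-detection and pairing flow described above.
# Thresholds and the sensor/radio interfaces are illustrative assumptions.

PRESSURE_THRESHOLD_N = 0.5          # sustained nose-pad pressure, in newtons
BODY_TEMP_RANGE_C = (30.0, 40.0)    # plausible skin-contact temperature band

def is_worn(pressure_n: float, temp_c: float) -> bool:
    """The device counts as worn if either sensor indicates skin contact."""
    return (pressure_n >= PRESSURE_THRESHOLD_N
            or BODY_TEMP_RANGE_C[0] <= temp_c <= BODY_TEMP_RANGE_C[1])

def connect_when_worn(sensors, radio):
    """Once worn, scan for authorized controlled devices and pair with one."""
    if not is_worn(sensors.pressure(), sensors.temperature()):
        return None
    for device in radio.scan_for_devices():      # Bluetooth/WIFI scan (assumed)
        if radio.request_connection(device):     # controlled device must authorize
            return device
    return None
```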
Further, the screen display picture of the controlled device is acquired; based on the near-eye display device, the screen display picture is projected onto the projection display plane to obtain a projection display picture; and the screen coordinate system is constructed based on the pixel resolution of the projection display picture.
In an embodiment, the screen display picture of the controlled device may be transmitted to the near-eye display device through a wired screen-casting interface such as HDMI (High-Definition Multimedia Interface) or DP (DisplayPort).
HDMI is a fully digital video and audio transmission interface capable of carrying uncompressed audio and video signals. It is used in set-top boxes, DVD players, personal computers, televisions, game consoles, integrated amplifiers, digital audio equipment, and other devices, and can transmit audio and video signals simultaneously.
DisplayPort (DP) is a digital video interface standard developed by an alliance of PC and chip manufacturers and standardized by the Video Electronics Standards Association (VESA). DP is a display communication port based on packetized data transmission. The DisplayPort protocol is built on small data packets called micro packets, which embed the clock signal in the data stream; this has the advantage that higher resolutions can be achieved with a smaller pin count. It can be used for both internal and external display connections, and can transmit audio and video simultaneously or either one separately.
In an embodiment, the screen display picture of the controlled device may also be transmitted to the near-eye display device by a wireless screen-projection technology such as DLNA (Digital Living Network Alliance).
In one embodiment, DLNA provides networked media sharing for audio and video devices, including computers, game consoles, digital cameras, and portable/mobile products. It enables digital media and content services to be shared between compatible devices over a wireless network. DLNA requires that the interconnected devices be on the same network segment, and because its protocols are dedicated to media playback on the local area network, network transmission efficiency is high.
In an embodiment, the screen display picture of the controlled device may also be transmitted to the near-eye display device by remote desktop technology or the like, so long as the same function is achieved.
In one embodiment, the near-eye display device projects the display content of the controlled device onto a fixed plane (i.e., the projection display plane) at a distance Z_k from the human eye, obtaining a projection display picture, as shown in fig. 2.
In an embodiment, the display content of the projection display picture is the screen display content of the controlled device, so the coordinates of the eye gaze point in the screen display picture of the controlled device can be obtained by capturing the gaze point coordinates of the current user's eye gaze point in the projection display picture.
In an embodiment, the screen coordinate system may be constructed according to the resolution of the projection display picture. For example, the first pixel at the upper-left corner of the projection display picture is taken as the coordinate origin of the screen coordinate system; the horizontal edge of the projection display picture starting from the coordinate origin is taken as the X axis; the vertical edge starting from the coordinate origin is taken as the Y axis; a straight line starting from the coordinate origin and parallel to the normal of the projection display picture (i.e., the perpendicular from the human eye to the picture) is taken as the Z axis; and the pixel resolution of the projection display picture determines the coordinate points along the X and Y axes.
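A small Python sketch of such a screen coordinate system follows; the class and field names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class ScreenCoordinateSystem:
    """Origin at the top-left pixel of the projection display picture;
    X grows rightward, Y grows downward, Z along the viewing normal."""
    width_px: int    # pixel resolution along the X axis
    height_px: int   # pixel resolution along the Y axis

    def contains(self, x: int, y: int) -> bool:
        """True if pixel (x, y) falls inside the projected picture."""
        return 0 <= x < self.width_px and 0 <= y < self.height_px

# Example: a 1920x1080 screen picture projected by the near-eye display device
screen_cs = ScreenCoordinateSystem(width_px=1920, height_px=1080)
assert screen_cs.contains(20, 30)
```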
S102, identifying current instruction coordinates of the eye gaze point of the current user in the screen coordinate system;
in one embodiment, the current user's eye gaze point may be tracked by eye tracking techniques. Eye tracking is the process of measuring eye movement. The question of greatest interest in eye tracking studies is where a human or animal is looking (the "point of gaze" or "fixation point"). More precisely, image processing performed by instrument equipment locates the pupil position, acquires its coordinates, and calculates the coordinates of the gaze or fixation point through a suitable algorithm.
By way of example, eye tracking may use a "non-invasive" technique based on video-oculography (VOG). Its basic principle is: a beam of (near-infrared) light and a camera are aimed at the subject's eyes; the gaze direction of the subject is inferred from the light reflections through back-end analysis, while the camera records the interaction process.
In one embodiment, the heart of video-based eye tracking is one or more cameras that capture a series of eye images. Eye tracking software uses image processing algorithms to identify two key locations in each image sent by the eye tracking camera: the pupil center and the corneal reflection center. The corneal reflection center is the point at which light from a fixed light source (an infrared illuminator) is reflected off the cornea.
In one embodiment, the position of the pupil center on the camera sensor changes as the eye rotates, but when the head is stable the position of the corneal reflection center remains relatively fixed on the camera sensor (because the reflection source does not move relative to the camera). If the eye were completely fixed in space and simply rotated about its own center, the gaze location could be determined by tracking only the changes of the pupil center on the camera sensor.
In an embodiment, gaze point identification in a near-eye display device may use an eye tracking technique based on pupil tracking alone, because while the near-eye display device is worn, the relationship between the camera and the eye remains relatively fixed regardless of head movement.
In an embodiment, an eye gaze point of a current user is identified through an eye movement tracking technology, point coordinates of the eye gaze point in a screen coordinate system are calculated and used as current instruction coordinates, and a control corresponding to the current instruction coordinates is used as a target control to be controlled.
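As background on how pupil tracking can yield gaze coordinates, a common approach (an assumption here; the application does not prescribe a specific algorithm) is to fit a calibrated mapping from pupil-center offsets on the camera sensor to gaze-plane coordinates:

```python
import numpy as np

def fit_gaze_mapping(pupil_xy: np.ndarray, target_xy: np.ndarray) -> np.ndarray:
    """Fit an affine map from pupil-center positions (camera sensor) to
    gaze-plane coordinates, using a short calibration session in which the
    user fixates known target points."""
    ones = np.ones((pupil_xy.shape[0], 1))
    design = np.hstack([pupil_xy, ones])             # [x, y, 1] per sample
    coef, *_ = np.linalg.lstsq(design, target_xy, rcond=None)
    return coef                                       # 3x2 affine coefficients

def predict_gaze(pupil_xy: np.ndarray, coef: np.ndarray) -> np.ndarray:
    ones = np.ones((pupil_xy.shape[0], 1))
    return np.hstack([pupil_xy, ones]) @ coef

# Calibration: user fixates four known points while pupil positions are recorded.
pupil = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
targets = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 60.0], [100.0, 60.0]])
coef = fit_gaze_mapping(pupil, targets)
gaze = predict_gaze(np.array([[0.5, 0.5]]), coef)     # approximately [50, 30]
```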
S103, collecting operation actions of the current user, and determining an instruction type;
in one embodiment, the instruction types may include move, click, long press, focus, and the like.
In an embodiment, the operation actions of the current user may be collected by a motion sensor, such as a pressure sensor or an inertial measurement unit (IMU).
In an embodiment, the pressure sensor may be disposed below a touch area of the near-eye display device, where the touch area may be a key or a touch screen.
For example, the touch area of the near-eye display device may be configured according to its structure: smart glasses may provide a key touch area at the temple or frame; a helmet-type near-eye display device may provide a key touch area or a touch-screen touch area at the side of the head.
For example, the touch area may use keys: the pressure sensor is disposed below a key, and the user presses the key corresponding to the operation action, generating a corresponding instruction signal so that the near-eye display device can recognize the instruction type.
For example, the touch area may be a touch screen with the pressure sensor disposed below it. The user triggers an operation instruction by clicking a control on the touch screen or by sliding, tapping, and the like; the corresponding operation instruction is identified from the user's action, e.g. a slide corresponds to a page-switch instruction, and a tap on a control corresponds to an instruction to enter its detail page.
In one embodiment, an inertial measurement unit (IMU) is a device that measures the three-axis attitude angles and accelerations of an object. An IMU typically includes a gyroscope and an accelerometer, and sometimes also a magnetometer. The gyroscope measures the three-axis angles/angular velocities, the accelerometer measures the three-axis accelerations, and the magnetometer provides magnetic field orientation information.
For example, the IMU may be disposed in the near-eye display device and recognize the user's head movements: deflecting left may map to move, deflecting right to click, deflecting up to long press, and deflecting down to focus. If the IMU recognizes that the current user deflects to the right by a certain angle, such as 60 degrees, it may determine that a click instruction has been triggered; alternatively, the IMU may recognize two consecutive rightward deflections and determine that a click instruction has been triggered.
The IMU sensor may also be disposed in a peripheral operating device such as a smart ring. The smart ring is communicatively connected to the near-eye display device, and an operation pointer corresponding to the position of the smart ring may be displayed in the projection display interface. The IMU sensor identifies the user's hand motions: for example, when the hand slides, the operation pointer moves synchronously in the projection display interface; when the hand performs a double-click motion, the operation pointer performs a click operation. The instruction type is thus determined by collecting the user's hand motions.
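A minimal sketch of such a gesture-to-instruction mapping is shown below; the deflection directions mirror the example above, while the threshold angle and all names are illustrative assumptions.

```python
from enum import Enum

class InstructionType(Enum):
    MOVE = "move"
    CLICK = "click"
    LONG_PRESS = "long_press"
    FOCUS = "focus"

# Hypothetical mapping from IMU deflection direction to instruction type,
# mirroring the example above (left=move, right=click, up=long press, down=focus).
DEFLECTION_MAP = {
    "left": InstructionType.MOVE,
    "right": InstructionType.CLICK,
    "up": InstructionType.LONG_PRESS,
    "down": InstructionType.FOCUS,
}

MIN_DEFLECTION_DEG = 60.0  # assumed trigger angle, per the 60-degree example

def classify(direction: str, angle_deg: float) -> InstructionType | None:
    """Return the instruction type if the deflection is large enough."""
    if angle_deg < MIN_DEFLECTION_DEG:
        return None
    return DEFLECTION_MAP.get(direction)
```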
And S104, generating a control instruction based on the current instruction coordinate and the instruction type, and sending the control instruction to the controlled equipment so as to control the controlled equipment to execute the operation action corresponding to the control instruction.
In an embodiment, the near-eye display device generates the control instruction according to the current instruction coordinates and the instruction type, and may send the control instruction to the controlled device through wireless communication such as Bluetooth, infrared communication, or WIFI.
In one embodiment, after the controlled device receives the control instruction, it parses out the current instruction coordinates and instruction type contained in it, determines the target control from the current instruction coordinates and the operation action from the instruction type, and then executes the control instruction to complete actions such as mouse clicks and cursor movement.
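The packaging and parsing step can be pictured with the following hedged sketch; the JSON wire format and function names are illustrative assumptions, since the application does not specify an encoding.

```python
import json

def build_control_instruction(x: int, y: int, instruction_type: str) -> bytes:
    """Near-eye display side: package coordinates plus type for transmission."""
    return json.dumps({"x": x, "y": y, "type": instruction_type}).encode()

def parse_control_instruction(payload: bytes) -> tuple[int, int, str]:
    """Controlled-device side: recover coordinates and instruction type."""
    msg = json.loads(payload)
    return msg["x"], msg["y"], msg["type"]

# Round trip: what the near-eye device sends is what the controlled device reads.
packet = build_control_instruction(20, 30, "click")
assert parse_control_instruction(packet) == (20, 30, "click")
```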
In this embodiment, compared with operating devices such as remote controls and keyboards, the near-eye display device is small, portable, and easy to store, and as a head-mounted operating device it frees the user's hands, improving the convenience of operating the electronic device.
This embodiment provides an electronic device control method. The screen display picture of the controlled device is projected onto the projection display plane through the near-eye display device and a screen coordinate system is constructed, so that the user can view the display content of the controlled device directly through the near-eye display device; the picture mapping relation between the screen coordinate system and the controlled device improves the operation accuracy of the eye-tracking-based control mode. By identifying the current instruction coordinates of the current user's eye gaze point in the screen coordinate system, the operation target is determined according to the mapping relation between the screen coordinate system and the controlled device's picture, freeing the hands and improving the control convenience of the controlled device. The user's operation actions are identified through the motion sensor to determine the instruction type, so that the user can generate and issue operation instructions directly through the near-eye display device, realizing remote control of the controlled device, avoiding multi-terminal operation, and improving the control convenience of the electronic device.
Referring to fig. 3, fig. 3 is a flowchart of a second embodiment of a control method for an electronic device according to an embodiment of the application.
As shown in fig. 3, based on the embodiment shown in fig. 1, step S102 specifically includes:
s201, based on an eye tracking technique, identifying the current gaze coordinates of the eye gaze point in a gaze plane coordinate system;
The gaze plane coordinate system is constructed on a gaze plane, the gaze plane coincides with the projection display plane, and the origin of coordinates of the gaze plane coordinate system coincides with the origin of coordinates of the screen coordinate system.
In an embodiment, the eye gaze point of the current user is identified by an eye tracking technique, and the current gaze coordinates of the eye gaze point in the gaze plane coordinate system are calculated. It may be assumed that the origin O(X_o, Y_o, Z_o) of the gaze plane coordinate system coincides with the origin (X_a, Y_a, Z_a) of the display device's screen coordinate system. The screen-coordinate position of the eye gaze point is then its offset relative to the screen coordinate origin (the first pixel at the upper-left corner of the projection display picture). Because the projection display plane is located at a fixed distance Z_k along the centerline of the user's field of view, it lies in the same plane as the gaze plane of the human eye. The depth information (Z-axis coordinate) of the gaze point can therefore be ignored, i.e., the planar two-dimensional coordinates of the gaze point in the screen coordinate system can be recognized directly.
Further, acquiring the size parameter of the screen display picture and the screen resolution; and calculating the pixel density of the screen display picture based on the size parameter and the screen resolution.
In an embodiment, the parameters of the display device of the controlled device are acquired, for example: the dimensions width W and height H in millimeters (mm), and the screen resolution width S_w and height S_h in pixels (px).
In an embodiment, when the controlled device and the near-eye display device establish a communication connection for the first time, the near-eye display device may read the device information of the controlled device (such as device number and device name) and the parameters of its display from the controlled device, and may archive them, e.g. in a file. If it is not the first pairing connection, the near-eye display device may obtain the device information of the controlled device and look up the display parameters in the archive according to that device information.
In one embodiment, the pixel density PPM (pixels per millimetre) represents the number of pixels per millimetre. The higher the PPM value, the higher the density at which the display can render an image. The pixel density includes a pixel density in the horizontal direction and a pixel density in the vertical direction.
In an embodiment, the size parameters of the screen display include a length size and a width size of the screen display, and the screen resolution includes a length pixel value and a width pixel value of the screen display.
Further, calculating the pixel density of the screen display in a horizontal direction based on the length dimension and the length pixel value; the pixel density of the screen display in the vertical direction is calculated based on the width dimension and the width pixel value.
In one embodiment, with the size parameters of the screen display picture, width W and height H in millimeters (mm), and the screen resolution, width S_w and height S_h in pixels (px), the pixel density of the screen display picture in the horizontal direction can be calculated as W_ppm = S_w / W, and the pixel density in the vertical direction as H_ppm = S_h / H.
S202, calculating the current instruction coordinate of the eye gaze point in the screen coordinate system based on the current gaze coordinate and the pixel density of the screen display picture.
In one embodiment, the point coordinates of the eye gaze point in the gaze plane coordinate system obtained by eye tracking are K(X_k, Y_k, Z_k), in millimeters. The screen coordinates (X, Y) of the eye gaze point K in the display picture of the controlled device, in pixels (px), are calculated by the following formulas:
X = X_k × W_ppm
Y = Y_k × H_ppm
In an embodiment, because the projection display picture is obtained by projecting the display picture of the controlled device, the parameters (resolution and pixel density) of the controlled device's screen display picture can be used as the picture parameters of the projection display picture, so that the target screen coordinates are obtained directly.
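Putting S201 and S202 together, a minimal Python sketch of the millimetre-to-pixel conversion might look as follows; the function names and the example panel dimensions are assumptions, while the formulas follow the definitions above:

```python
def pixel_density(width_mm: float, height_mm: float,
                  width_px: int, height_px: int) -> tuple[float, float]:
    """W_ppm = S_w / W and H_ppm = S_h / H, in pixels per millimetre."""
    return width_px / width_mm, height_px / height_mm

def gaze_to_screen(x_k_mm: float, y_k_mm: float,
                   w_ppm: float, h_ppm: float) -> tuple[int, int]:
    """Map gaze-plane coordinates (mm) to screen pixels: X = X_k * W_ppm,
    Y = Y_k * H_ppm. Depth Z_k is ignored because the gaze plane coincides
    with the projection display plane."""
    return round(x_k_mm * w_ppm), round(y_k_mm * h_ppm)

# Example with assumed panel dimensions: 100 mm x 200 mm at 1080 x 2160 px
w_ppm, h_ppm = pixel_density(100.0, 200.0, 1080, 2160)   # 10.8 px/mm each
x, y = gaze_to_screen(10.0, 20.0, w_ppm, h_ppm)          # -> (108, 216)
```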
In this embodiment, the gaze point coordinates of the human eye are obtained using eye tracking, and the screen display content of the controlled device is output to the near-eye display device using screen-projection technology. The gaze point coordinates are converted into pixel coordinates of the controlled device's display screen, yielding the screen coordinates of the eye gaze point relative to the display device and achieving a hover effect similar to a mouse pointer or finger touch. This removes the dependence on hand operation: the device can be controlled with the eye gaze point alone, improving the operation convenience of the electronic device.
Referring to fig. 4, fig. 4 is a flowchart of a third embodiment of a control method for an electronic device according to an embodiment of the present application.
As shown in fig. 4, based on the embodiment shown in fig. 1, the step S104 specifically further includes:
s301, determining a target control based on a preset control information table and the current instruction coordinates;
In an embodiment, after the current instruction coordinates of the current user's eye gaze point in the screen coordinate system are identified by the eye tracking technique, the control coordinate area corresponding to the current instruction coordinates may be looked up in the control information table. For example, if the current instruction coordinates are (20, 30), it can be determined that they lie within the coordinate area [(10, 10), (40, 40)], so the target control pointed to by the current instruction coordinates is control A.
Further, before the step S301, the method further includes: acquiring control information of at least one control in the screen display picture; determining a control coordinate area corresponding to each control based on the screen coordinate system; and associating the control information with the control coordinate area to construct a control information table of the controlled equipment.
In an embodiment, the controlled device may be an electronic device such as a mobile phone, computer, or tablet, in which at least one application control exists, such as a chat software control or a game application control, and the application control information in the controlled device may be sent to the near-eye display device. An application control refers to the encapsulation of an application program's data and operation methods; it can serve as an interactive button of the application program, in the near-eye display device and the controlled device, for inputting, editing, or displaying data.
In an embodiment, the control information may include information of a control name, a control type, a control function, and the like.
In an embodiment, the position and size of each control in the screen display picture can be used to calculate the control's coordinate area in the screen coordinate system. Because the coordinate axes of the screen coordinate system take the pixel resolution of the screen display picture as their coordinate points, and a control usually occupies a region of definite size in the picture (i.e., comprises multiple pixels), the control coordinates corresponding to a control form a coordinate area containing multiple pixel coordinate points. For example, control A [(10, 10), (40, 40)] indicates that the control coordinate area of control A is the region enclosed by the coordinate points (10, 10), (10, 40), (40, 10), and (40, 40).
In an embodiment, the control information is associated with the control coordinate area corresponding to the control, i.e., a one-to-one correspondence between each control and its coordinate area is established, and a control information table is created, so that the coordinate area matching the eye gaze point coordinates can be found and the target control looked up.
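A hedged Python sketch of such a control information table and the coordinate-area lookup it enables follows; the names and the rectangle representation are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ControlEntry:
    name: str
    x_min: int
    y_min: int
    x_max: int
    y_max: int

    def contains(self, x: int, y: int) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Control information table: each control associated with its coordinate area.
CONTROL_TABLE = [
    ControlEntry("control A", 10, 10, 40, 40),
    ControlEntry("control B", 50, 10, 90, 40),
]

def find_target_control(x: int, y: int) -> str | None:
    """Match the current instruction coordinates to a control coordinate area."""
    for entry in CONTROL_TABLE:
        if entry.contains(x, y):
            return entry.name
    return None

assert find_target_control(20, 30) == "control A"   # the example from the text
```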
S302, determining the operation action corresponding to the control instruction based on the instruction type;
In an embodiment, the instruction type may include a control operation manner such as moving, clicking, long pressing, focusing, and the like, and an operation action that the current user wants to perform on the target control may be determined according to the instruction type, so as to control the controlled device to perform the operation action on the target control.
And S303, controlling the controlled equipment to execute the operation action on the target control based on the control instruction.
In an embodiment, the near-eye display device may package the current instruction coordinate and the instruction type to generate a control instruction, send the control instruction to the controlled device, analyze the control instruction by the controlled device, determine a target control corresponding to the current instruction coordinate and an operation corresponding to the instruction type, and then execute the operation on the target control.
In an embodiment, the near-eye display device may also pre-configure a control information table of the controlled device, match a target control in the control information table according to the current instruction coordinate, package the target control and the instruction type to generate a control instruction, send the control instruction to the controlled device, analyze the control instruction by the controlled device, and execute an operation action corresponding to the instruction type on the target control.
In this embodiment, a control information table of the controlled device is constructed, the target control corresponding to the current instruction coordinates is looked up in the control information table, and the control instruction is then generated by combining the target control and the instruction type, so that the controlled device is controlled to execute the corresponding operation action on the target control, improving the control efficiency and accuracy of the controlled device.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a first embodiment of an electronic device control apparatus according to the present application, where the electronic device control apparatus is configured to execute the foregoing electronic device control method.
As shown in fig. 5, the electronic device control apparatus 400 includes: a screen coordinate system acquisition module 401, a current instruction coordinate recognition module 402, an instruction type determination module 403, and a device control module 404.
A screen coordinate system obtaining module 401, configured to, when a near-eye display device is in a wearing state and the near-eye display device is in communication connection with a controlled device, project a screen display screen of the controlled device onto a projection display plane of the near-eye display device, and obtain a screen coordinate system;
a current instruction coordinate recognition module 402, configured to recognize a current instruction coordinate of a gaze point of a human eye of the current user in the screen coordinate system;
an instruction type determining module 403, configured to collect an operation action of the current user, and determine an instruction type;
And the device control module 404 is configured to generate a control instruction based on the current instruction coordinate and the instruction type, and send the control instruction to the controlled device, so as to control the controlled device to execute an operation action corresponding to the control instruction.
In an embodiment, the screen coordinate system obtaining module 401 includes:
A screen display screen acquisition unit configured to acquire the screen display screen of the controlled device;
A projection display screen obtaining unit configured to project the screen display screen into the projection display plane based on the near-eye display device, obtaining a projection display screen;
and the screen coordinate system construction unit is used for constructing the screen coordinate system based on the pixel resolution of the projection display picture.
In one embodiment, the current instruction coordinate recognition module 402 includes:
A current gaze coordinate identifying unit, configured to identify the current gaze coordinates of the eye gaze point in a gaze plane coordinate system based on an eye tracking technique;
A current instruction coordinate calculation unit, configured to calculate, based on the current gaze coordinate and a pixel density of the screen display, the current instruction coordinate of the eye gaze point in the screen coordinate system;
The gaze plane coordinate system is constructed on a gaze plane, the gaze plane coincides with the projection display plane, and the origin of coordinates of the gaze plane coordinate system coincides with the origin of coordinates of the screen coordinate system.
In an embodiment, the current instruction coordinate recognition module 402 further includes:
a screen parameter obtaining unit, configured to obtain a size parameter and a screen resolution of the screen display;
and the pixel density calculating unit is used for calculating the pixel density of the screen display picture based on the size parameter and the screen resolution.
In an embodiment, the size parameters of the screen display include a length size and a width size of the screen display, and the screen resolution includes a length pixel value and a width pixel value of the screen display;
the pixel density calculating unit includes:
a horizontal pixel density calculating subunit configured to calculate the pixel density of the screen display in a horizontal direction based on the length dimension and the length pixel value;
And a vertical pixel density calculating subunit for calculating the pixel density of the screen display in the vertical direction based on the width dimension and the width pixel value.
In one embodiment, the device control module 404 includes:
the target control determining unit is used for determining a target control based on a preset control information table and the current instruction coordinates;
an operation action determining unit, configured to determine the operation action corresponding to the control instruction based on the instruction type;
And a device control unit, configured to control the controlled device to execute the operation action on the target control based on the control instruction.
In an embodiment, the electronic device control apparatus 400 further includes:
The control information acquisition unit is used for acquiring control information of at least one control in the screen display picture;
the control coordinate area determining unit is used for determining control coordinate areas corresponding to the controls based on the screen coordinate system;
And a control information table construction unit, configured to associate the control information with the control coordinate areas to construct a control information table of the controlled device.
It should be noted that, for convenience and brevity of description, those skilled in the art can clearly understand that for the specific working process of the above-described apparatus and modules, reference may be made to the corresponding process in the foregoing embodiments of the electronic device control method, which is not repeated here.
The apparatus provided by the above embodiments may be implemented in the form of a computer program which may be run on a computer device as shown in fig. 6.
Referring to fig. 6, fig. 6 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device may be a server.
With reference to FIG. 6, the computer device includes a processor, memory, and a network interface connected by a system bus, where the memory may include a non-volatile storage medium and an internal memory.
The non-volatile storage medium may store an operating system and a computer program. The computer program comprises program instructions that, when executed, cause the processor to perform any of a number of electronic device control methods.
The processor is used to provide computing and control capabilities to support the operation of the entire computer device.
The internal memory provides an environment for the execution of a computer program in a non-volatile storage medium that, when executed by a processor, causes the processor to perform any of a number of electronic device control methods.
The network interface is used for network communication such as transmitting assigned tasks and the like. It will be appreciated by those skilled in the art that the structure shown in FIG. 6 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
It should be appreciated that the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
Wherein in one embodiment the processor is configured to run a computer program stored in the memory to implement the steps of:
When a near-eye display device is in a wearing state and is in communication connection with a controlled device, projecting a screen display picture of the controlled device onto a projection display plane of the near-eye display device to obtain a screen coordinate system;
identifying the current instruction coordinates of the current user's eye gaze point in the screen coordinate system;
collecting the operation action of the current user and determining the instruction type;
and generating a control instruction based on the current instruction coordinates and the instruction type, and sending the control instruction to the controlled device, so as to control the controlled device to execute the operation action corresponding to the control instruction.
In an embodiment, when implementing the projecting of the screen display picture of the controlled device onto the projection display plane of the near-eye display device to obtain a screen coordinate system, the processor is configured to implement:
acquiring the screen display picture of the controlled device;
Based on the near-eye display device, projecting the screen display picture into the projection display plane to obtain a projection display picture;
and constructing the screen coordinate system based on the pixel resolution of the projection display picture.
In an embodiment, the processor, when implementing the current instruction coordinates that identify the eye gaze point of the current user in the screen coordinate system, is configured to implement:
based on an eye tracking technology, identifying current gaze coordinates of the eye gaze point in a gaze plane coordinate system;
calculating the current instruction coordinates of the eye gaze point in the screen coordinate system based on the current gaze coordinates and the pixel density of the screen display picture;
The gaze plane coordinate system is constructed on a gaze plane, the gaze plane coincides with the projection display plane, and the origin of coordinates of the gaze plane coordinate system coincides with the origin of coordinates of the screen coordinate system.
In an embodiment, before implementing the calculating the current instruction coordinates of the eye gaze point in the screen coordinate system based on the current gaze coordinates and the pixel density of the screen display, the processor is further configured to implement:
acquiring the size parameter and the screen resolution of the screen display picture;
and calculating the pixel density of the screen display picture based on the size parameter and the screen resolution.
In an embodiment, the size parameters of the screen display include a length size and a width size of the screen display, and the screen resolution includes a length pixel value and a width pixel value of the screen display;
The processor is configured to, when implementing the calculating the pixel density of the screen display based on the size parameter and the screen resolution, implement:
calculating the pixel density of the screen display in a horizontal direction based on the length dimension and the length pixel value;
the pixel density of the screen display in the vertical direction is calculated based on the width dimension and the width pixel value.
In an embodiment, when implementing the generating a control instruction based on the current instruction coordinate and the instruction type, the processor is configured to implement:
determining a target control based on a preset control information table and the current instruction coordinates;
determining the operation action corresponding to the control instruction based on the instruction type;
and controlling the controlled device to execute the operation action on the target control based on the control instruction.
In an embodiment, before implementing the determining the target control based on the preset control information table and the current instruction coordinates, the processor is further configured to implement:
acquiring control information of at least one control in the screen display picture;
determining a control coordinate area corresponding to each control based on the screen coordinate system;
And associating the control information with the control coordinate areas to construct a control information table of the controlled device.
The embodiment of the application also provides a computer-readable storage medium storing a computer program, the computer program comprising program instructions; when the processor executes the program instructions, any electronic device control method provided by the embodiments of the application is implemented.
The computer-readable storage medium may be an internal storage unit of the computer device of the foregoing embodiment, for example, a hard disk or a memory of the computer device. The computer-readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a smart media card (SMC), a Secure Digital (SD) card, or a flash card provided on the computer device.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions of equivalents may be made and equivalents will be apparent to those skilled in the art without departing from the scope of the application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (10)

1. A method of controlling an electronic device, the method comprising:
When a near-eye display device is in a wearing state and is in communication connection with a controlled device, projecting a screen display picture of the controlled device onto a projection display plane of the near-eye display device to obtain a screen coordinate system;
identifying the current instruction coordinates of the current user's eye gaze point in the screen coordinate system;
collecting the operation action of the current user and determining the instruction type;
and generating a control instruction based on the current instruction coordinates and the instruction type, and sending the control instruction to the controlled device, so as to control the controlled device to execute the operation action corresponding to the control instruction.
2. The electronic device control method according to claim 1, wherein the projecting the screen display picture of the controlled device onto the projection display plane of the near-eye display device to obtain a screen coordinate system includes:
acquiring the screen display picture of the controlled device;
based on the near-eye display device, projecting the screen display picture onto the projection display plane to obtain a projection display picture;
and constructing the screen coordinate system based on the pixel resolution of the projection display picture.
3. The electronic device control method according to claim 1, wherein the identifying the current instruction coordinates of the eye gaze point of the current user in the screen coordinate system includes:
identifying, based on an eye tracking technology, current gaze coordinates of the eye gaze point in a gaze plane coordinate system;
calculating the current instruction coordinates of the eye gaze point in the screen coordinate system based on the current gaze coordinates and the pixel density of the screen display picture;
wherein the gaze plane coordinate system is constructed on a gaze plane, the gaze plane coincides with the projection display plane, and the coordinate origin of the gaze plane coordinate system coincides with the coordinate origin of the screen coordinate system.
4. The electronic device control method according to claim 3, wherein before the calculating the current instruction coordinates of the eye gaze point in the screen coordinate system based on the current gaze coordinates and the pixel density of the screen display picture, the method further comprises:
acquiring the size parameter and the screen resolution of the screen display picture;
and calculating the pixel density of the screen display picture based on the size parameter and the screen resolution.
5. The electronic device control method according to claim 4, wherein the size parameter of the screen display picture includes a length dimension and a width dimension of the screen display picture, and the screen resolution includes a length pixel value and a width pixel value of the screen display picture;
the calculating the pixel density of the screen display picture based on the size parameter and the screen resolution includes:
calculating the pixel density of the screen display picture in the horizontal direction based on the length dimension and the length pixel value;
and calculating the pixel density of the screen display picture in the vertical direction based on the width dimension and the width pixel value.
6. The electronic device control method according to claim 1, wherein the generating a control instruction based on the current instruction coordinates and the instruction type, and sending the control instruction to the controlled device to control the controlled device to execute the operation action corresponding to the control instruction, includes:
determining a target control based on a preset control information table and the current instruction coordinates;
determining the operation action corresponding to the control instruction based on the instruction type;
and controlling the controlled device to execute the operation action on the target control based on the control instruction.
7. The electronic device control method according to claim 6, wherein before the determining the target control based on the preset control information table and the current instruction coordinates, the method further comprises:
acquiring control information of at least one control in the screen display picture;
determining a control coordinate area corresponding to each control based on the screen coordinate system;
and associating the control information with the control coordinate areas to construct the control information table of the controlled device.
8. An electronic device control apparatus, characterized by comprising:
a screen coordinate system acquisition module, configured to project a screen display picture of a controlled device onto a projection display plane of a near-eye display device when the near-eye display device is in a worn state and is in communication connection with the controlled device, so as to obtain a screen coordinate system;
a current instruction coordinate identification module, configured to identify current instruction coordinates of an eye gaze point of a current user in the screen coordinate system;
an instruction type determination module, configured to collect an operation action of the current user and determine an instruction type;
and a device control module, configured to generate a control instruction based on the current instruction coordinates and the instruction type, and send the control instruction to the controlled device so as to control the controlled device to execute the operation action corresponding to the control instruction.
9. A computer device comprising a processor, a memory, and a computer program stored on the memory and executable by the processor, wherein the computer program, when executed by the processor, implements the steps of the electronic device control method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when being executed by a processor, implements the steps of the electronic device control method according to any one of claims 1 to 7.
CN202311852082.8A 2023-12-28 2023-12-28 Electronic equipment control method, device, equipment and medium Pending CN117930970A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311852082.8A CN117930970A (en) 2023-12-28 2023-12-28 Electronic equipment control method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311852082.8A CN117930970A (en) 2023-12-28 2023-12-28 Electronic equipment control method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN117930970A (en) 2024-04-26

Family

ID=90767706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311852082.8A Pending CN117930970A (en) 2023-12-28 2023-12-28 Electronic equipment control method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117930970A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination