CN117590942A - Control method, device, equipment and storage medium of electronic equipment - Google Patents


Info

Publication number
CN117590942A
CN117590942A (application number CN202311509833.6A)
Authority
CN
China
Prior art keywords
eye display
electronic equipment
determining
user
interaction interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311509833.6A
Other languages
Chinese (zh)
Inventor
胡立天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Bounds Inc
Original Assignee
Meta Bounds Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Bounds Inc filed Critical Meta Bounds Inc
Priority to CN202311509833.6A
Publication of CN117590942A
Legal status: Pending

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The present application relates to the technical field of human-computer interaction and provides a control method, apparatus, device, and storage medium for an electronic device. The method includes: acquiring a target image obtained by a near-eye display device photographing the electronic device; determining, according to the target image and the user's eye-movement information, the position of the user's visual focus on the human-computer interaction interface of the electronic device; determining a target operation position in the human-computer interaction interface according to the visual focus position; and controlling the electronic device to perform a corresponding operation according to the target operation position, so as to improve the convenience of controlling the electronic device.

Description

Control method, device, equipment and storage medium of electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular to a control method, apparatus, device, and storage medium for an electronic device.
Background
Currently, when a user controls an electronic device such as a computer or a television, the user must rely on a separate control device: a television is controlled with a remote controller, and a computer with a mouse and keyboard. However, because some control devices (such as remote controllers) are easily lost, and others (such as keyboards and mice) are inconvenient to carry, controlling the electronic device is often inconvenient, which degrades the user's experience with it.
Disclosure of Invention
The main purpose of the present application is to provide a control method, apparatus, device, and storage medium for an electronic device, aiming to improve the convenience of controlling the electronic device.
In a first aspect, the present application provides a control method for an electronic device, applied to a near-eye display device, the control method comprising the steps of:
acquiring a target image obtained by the near-eye display device photographing the electronic device;
determining, according to the target image and the user's eye-movement information, the position of the user's visual focus on the human-computer interaction interface of the electronic device;
determining a target operation position in the human-computer interaction interface according to the visual focus position;
and controlling the electronic device to perform a corresponding operation according to the target operation position.
In a second aspect, the present application further provides a control apparatus for an electronic device, the control apparatus comprising:
an acquisition module, configured to acquire a target image obtained by the near-eye display device photographing the electronic device;
a first determining module, configured to determine, according to the target image and the user's eye-movement information, the position of the user's visual focus on the human-computer interaction interface of the electronic device;
a second determining module, configured to determine a target operation position in the human-computer interaction interface according to the visual focus position;
and a control module, configured to control the electronic device to perform a corresponding operation according to the target operation position.
In a third aspect, the present application further provides a near-eye display device comprising a memory and a processor;
the memory is configured to store a computer program;
the processor is configured to execute the computer program and, when executing it, implement the control method for an electronic device described above.
In a fourth aspect, the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the control method for an electronic device described above.
The present application provides a control method, apparatus, device, and storage medium for an electronic device. The control method includes: acquiring a target image obtained by the near-eye display device photographing the electronic device; determining, according to the target image and the user's eye-movement information, the position of the user's visual focus on the human-computer interaction interface of the electronic device; determining a target operation position in the human-computer interaction interface according to the visual focus position; and controlling the electronic device to perform a corresponding operation according to the target operation position. Because the near-eye display device is wearable, a user who controls the electronic device with it avoids the situation in which the electronic device cannot be controlled because its dedicated control device has been lost or is inconvenient to carry. Moreover, because the near-eye display device supports multiple modes of operation, the user can still operate it even when both hands are occupied, a situation that would otherwise prevent control through the electronic device's own control device. In summary, controlling the electronic device from the near-eye display device according to the control method provided by the present application improves the convenience of controlling the electronic device.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of a control method for an electronic device according to an embodiment of the present application;
fig. 2 is a diagram of an application scenario of the control method for an electronic device according to an embodiment of the present application;
fig. 3 is a schematic block diagram of a control apparatus for an electronic device according to an embodiment of the present application;
fig. 4 is a schematic block diagram of the structure of a near-eye display device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from the present disclosure without inventive effort fall within the scope of the present disclosure.
The flow diagrams depicted in the figures are merely illustrative: not all of the illustrated elements and operations/steps are required, and they need not be performed in the order described. For example, some operations/steps may be subdivided, combined, or partially combined, so the actual order of execution may vary with the actual situation.
The embodiments of the present application provide a control method, apparatus, device, and storage medium for an electronic device. The control method can be applied to a near-eye display device.
By way of example, the electronic device may include a television, a computer, a tablet, a mobile phone, and the like, without limitation. The near-eye display device may include Virtual Reality (VR) glasses, Augmented Reality (AR) glasses, VR helmets, AR helmets, and the like, without limitation.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The embodiments described below and features of the embodiments may be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a flowchart of a control method for an electronic device according to an embodiment of the present application. The control method may be applied to a near-eye display device.
In the related art, a user controls an electronic device through a control device: a computer through a mouse and/or keyboard, a television through a remote controller, a tablet computer through a stylus, and so on. The control device may also be integrated with the electronic device; for example, mobile phones and tablet computers integrate a touch panel, so the user controls them by touch.
However, when a user controls an electronic device through a control device such as a mouse, keyboard, remote controller, or stylus, convenience suffers: some control devices (such as remote controllers) are easily lost, some (such as keyboards) are difficult to carry, and some (such as remote controllers) have too few keys to input complex control commands. Moreover, touch control generally requires the user's hands; when both hands are occupied, the convenience of controlling the electronic device is reduced. For example, a user who is handling other things may be unable to control the electronic device in time.
With the control method provided by the present application, the electronic device can instead be controlled through the near-eye display device. Because the near-eye display device is worn, it is unlikely to be lost and is easy to carry. Because it supports multiple modes of operation, the user can choose a suitable mode according to the actual situation, regardless of whether both hands are occupied, thereby improving the convenience of controlling the electronic device.
As shown in fig. 1, the control method of the electronic device includes steps S101 to S104.
S101, acquiring a target image obtained by the near-eye display device photographing the electronic device.
Illustratively, the near-eye display device has a photographing function. In some embodiments, it is equipped with a camera module through which it can photograph objects within a preset range; when the electronic device is within that range, the near-eye display device can photograph it through the camera module, without limitation herein.
For example, the near-eye display device may detect whether an electronic device has established a communication connection with it; the connection may be Bluetooth, WiFi, or the like, without limitation. When such a connection exists, the near-eye display device photographs the electronic device to obtain the target image. Alternatively, the near-eye display device may photograph objects within the preset range in real time or periodically to obtain candidate images, detect whether each candidate image contains the electronic device, and take a candidate image that does as the target image, without limitation.
In this way, the target image obtained by photographing the electronic device can subsequently be used to determine the control position on the electronic device, thereby improving the convenience of controlling it.
S102, determining, according to the target image and the user's eye-movement information, the position of the user's visual focus on the human-computer interaction interface of the electronic device.
Because the target image is obtained by the near-eye display device photographing the electronic device, it may contain the electronic device's human-computer interaction interface and can therefore be used to determine the interface range corresponding to that interface on the electronic device. Furthermore, when the user controls the electronic device through the near-eye display device, the user's control intent, such as the desired operation position on the electronic device, can be determined from the eye-movement information: the user indicates the desired operation position on the human-computer interaction interface by gazing at the corresponding location within the interface range. On this basis, the position of the user's visual focus on the human-computer interaction interface can be determined by combining the target image with the user's eye-movement information, for subsequent determination of the target operation position in the interface.
In some embodiments, the interface range corresponding to the human-computer interaction interface on the electronic device is determined according to the target image, and the position of the user's visual focus within that interface range is determined according to the eye-movement information.
For example, when the user controls the electronic device through the near-eye display device, the near-eye display device must determine the range of the electronic device available for the user to control, i.e., the interface range corresponding to the human-computer interaction interface. The near-eye display device can determine this range from the acquired target image.
Taking a desktop computer as an example of the electronic device, it may comprise a host and a display. When the user operates the desktop computer with a mouse and/or keyboard, the operations generally target the human-computer interaction interface shown on the display; accordingly, the interface range corresponding to that interface can be determined from the target image of the desktop computer captured by the near-eye display device, for subsequent control of the computer.
Of course, the method is not limited to this. Taking a projector as another example, the projector projects the human-computer interaction interface onto a projection screen or wall. When the user viewing the projected interface needs to adjust parameters such as the projection position, projector volume, or projection brightness, the interface range corresponding to the projected interface can be determined from the target image of the projection captured by the near-eye display device, for subsequent control of the projector. No limitation is intended here.
For example, the user's eye-movement information may be detected by an eye-tracking technique to determine the position of the user's visual focus within the interface range. The visual focus position indicates where the user is gazing on the electronic device; once it is determined within the interface range, the near-eye display device can infer that the user intends to control the electronic device at that position. As shown in fig. 2, the visual focus position may be denoted L(X_l, Y_l, Z_l). Because the user controls the electronic device by interacting with its human-computer interaction interface through the near-eye display device, the user's target operation position in the interface can be determined from the visual focus position on the electronic device.
Determining the visual focus position from the target image and the eye-movement information in this way allows the user's target operation position in the human-computer interaction interface to be determined subsequently, improving the convenience of controlling the electronic device.
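As an illustrative sketch of this step, the visual focus position L(X_l, Y_l, Z_l) can be computed as the intersection of the gaze ray reported by eye tracking with the display plane. The function name, coordinate conventions, and numbers below are assumptions for illustration, not part of the original disclosure.

```python
import numpy as np

def gaze_focus_on_plane(origin, direction, plane_point, plane_normal):
    """Intersect a gaze ray with the display plane.

    origin, direction: gaze ray in the device's 3-D coordinate system.
    plane_point, plane_normal: any point on the display plane and its normal.
    Returns the visual focus position L(X_l, Y_l, Z_l).
    """
    d = direction / np.linalg.norm(direction)
    # Distance along the ray to the plane (assumes the ray is not parallel to it).
    s = np.dot(plane_normal, plane_point - origin) / np.dot(plane_normal, d)
    return origin + s * d

# Hypothetical example: display plane 2 m in front of the user, gaze slightly off-axis.
L = gaze_focus_on_plane(
    origin=np.zeros(3),
    direction=np.array([0.1, -0.05, 1.0]),
    plane_point=np.array([0.0, 0.0, 2.0]),
    plane_normal=np.array([0.0, 0.0, 1.0]),
)
print(L)  # ≈ [0.2, -0.1, 2.0]
```

Whether the resulting point falls inside the interface range then decides whether it is treated as a control intent.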
In some implementations, the corner positions of the display of the electronic device are determined from the target image, and the corner positions are then converted according to the shooting parameters of the near-eye display device to obtain the interface range corresponding to the human-computer interaction interface on the electronic device.
For example, the corner positions of the display indicate the region of the target image occupied by the display of the electronic device. Taking a desktop computer as an example, in the target image captured by the near-eye display device the display appears as (at least) a quadrilateral, and its four corners serve as the corner points from which the corner positions of the display are determined.
For example, once the target image is obtained, it may be input into an algorithm model that detects display corner positions, which then outputs those positions. As shown in fig. 2, the corner points of the display may include corner points A to D, whose positions may be denoted A(U_a, V_a), B(U_b, V_b), C(U_c, V_c), and D(U_d, V_d). The corner positions of the display are expressed in pixels. Of course, the method is not limited to this.
In some embodiments, because the target image is captured by the near-eye display device with particular shooting parameters, the actual operable range of the electronic device, i.e., the interface range corresponding to the human-computer interaction interface, may differ from the range delimited by the corner positions detected in the image. If the range delimited by the detected corner positions were taken directly as the actual operable range, this mismatch could leave the user unable to control the electronic device through the near-eye display device, harming the user's control experience.
Therefore, the interface range corresponding to the human-computer interaction interface, i.e., the actual operable range of the electronic device from the near-eye display device, can be determined from both the shooting parameters of the near-eye display device and the corner positions of the display: the corner positions are converted according to the shooting parameters to obtain the interface range.
For example, as shown in fig. 2, taking the corner positions of corner points A to D as an example, those positions can be converted according to the shooting parameters of the near-eye display device to obtain the converted corner positions. When the near-eye display device photographs the electronic device to obtain the target image, the shooting parameters map the electronic device in three-dimensional space onto the two-dimensional target image, so each detected corner position of A to D is a position in the image coordinate system of the target image. Accordingly, the conversion uses the shooting parameters to map the positions of A to D from the image coordinate system back to the three-dimensional coordinate system of the scene, yielding converted corner positions A(X_a, Y_a, Z_a), B(X_b, Y_b, Z_b), C(X_c, Y_c, Z_c), and D(X_d, Y_d, Z_d).
The interface range corresponding to the human-computer interaction interface, i.e., the actual operable range of the electronic device from the near-eye display device, is then determined from the converted corner positions of A to D. Of course, the method is not limited to this.
In this way, the interface range corresponding to the human-computer interaction interface can be determined by combining the target image with the shooting parameters of the near-eye display device, making the interface range easier to determine. With the interface range determined, the near-eye display device knows its actual operable range on the electronic device. This avoids the situation in which a mismatch between the corner positions detected from the target image and the true interface range prevents the user from controlling the electronic device through the near-eye display device, and thus improves the accuracy and convenience of control.
In some embodiments, the corner positions are converted according to the camera intrinsic matrix and the camera extrinsic matrix of the near-eye display device to obtain the interface range corresponding to the human-computer interaction interface on the electronic device.
For example, the near-eye display device may obtain its camera intrinsic matrix, which may be preset: the intrinsic matrix of the camera module is typically determined at the factory and can simply be read by the device. Alternatively, the intrinsic matrix may be determined by a camera calibration technique; no limitation is intended here.
For example, the near-eye display device may also obtain its camera distortion parameters, which comprise radial distortion parameters (k_1, k_2, k_3) and tangential distortion parameters (p_1, p_2). The distortion parameters may be preset, e.g., determined for the camera module at the factory and read by the device, or they may be determined by a camera calibration technique; no limitation is intended here.
For example, based on the acquired camera intrinsic matrix and camera distortion parameters, the camera extrinsic matrix of the near-eye display device can be determined. The near-eye display device may compute the extrinsic matrix from the intrinsic matrix, the distortion parameters, the corner positions of the display, and the device parameters of the display using a preset visual detection technique, such as the monocular relative pose estimation function (solvePnP) in OpenCV, among others. The device parameters of the display include its size and screen resolution and may be obtained over the communication connection established between the near-eye display device and the electronic device, without limitation. Alternatively, the extrinsic matrix may be determined by a camera calibration technique; no limitation is intended here.
For example, once the corner positions of the display are determined, they can be converted according to the camera intrinsic matrix and camera extrinsic matrix of the near-eye display device to obtain the interface range corresponding to the human-computer interaction interface on the electronic device.
The corner positions after conversion are uniformly represented as (U, V) and (X w ,Y w ,Z w ) The camera internal parameter matrix is a matrix K, and the camera external parameter matrix is a matrix M.
The matrix K can be expressed as:
wherein f x F y Focal length of camera for near-eye display device, c x C y Is the camera optical center position of the near-eye display device.
For example, the matrix M may be solved by a monocular relative pose estimation function (solvePnP). For example, by a solvePnP function, a matrix K, camera distortion parameters (K 1 ,k 2 ,p 1 ,p 2 ,k 3 ) The location of the corner points (U, V) of the display of the electronic device and the device parameters of the display of the electronic device.
The matrix M can be expressed as:
wherein R is a camera rotation matrix of the near-eye display device, and R 11 To r 33 A rotation vector of a camera rotation matrix of the near-eye display device, t is a camera translation matrix of the near-eye display device, t x 、t y T z A translation vector that is a camera translation matrix of the near-eye display device.
For example, an association between a corner position (U, V) and its converted corner position (X_w, Y_w, Z_w) can be established through the matrix K and the matrix M. The association can be expressed as:

$$Z_c \begin{bmatrix} U \\ V \\ 1 \end{bmatrix} = K M \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$

where Z_c is a scale factor corresponding to the depth of the point in the camera coordinate system.
in some embodiments, a projection matrix P may be defined, which may be expressed as:
Based on this, the association relation can be simplified as:

$$Z_c \begin{bmatrix} U \\ V \\ 1 \end{bmatrix} = P \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}$$
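The simplified association can be illustrated with a short numpy sketch that builds P = K·M and maps a world point to pixel coordinates. All numeric values (intrinsics, pose, point) are made-up examples, not taken from the patent.

```python
import numpy as np

# Made-up camera internal parameter matrix.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Made-up camera external parameters: identity rotation, translation in mm.
R = np.eye(3)
t = np.array([[-150.0], [-100.0], [800.0]])
M = np.hstack([R, t])   # camera external parameter matrix [R | t]
P = K @ M               # projection matrix

# A converted corner position (X_w, Y_w, Z_w) in homogeneous coordinates.
Xw = np.array([300.0, 200.0, 0.0, 1.0])

uvw = P @ Xw
Zc = uvw[2]                      # scale factor (depth in camera frame)
U, V = uvw[0] / Zc, uvw[1] / Zc  # pixel coordinates (U, V)
print(U, V)                      # → 470.0 340.0
```

Dividing by Z_c is what turns the homogeneous result into the pixel position, matching the scale factor in the association relation above.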
For example, as shown in fig. 2, when the corner positions of the display include the corner positions of each of corner A to corner D, such as A(U_a, V_a), B(U_b, V_b), C(U_c, V_c) and D(U_d, V_d), the corner positions of corner A to corner D can be respectively substituted into the association relation and converted to obtain the converted corner positions of corner A to corner D, such as A(X_a, Y_a, Z_a), B(X_b, Y_b, Z_b), C(X_c, Y_c, Z_c) and D(X_d, Y_d, Z_d). The interface range corresponding to the man-machine interaction interface on the electronic device is then determined based on the converted corner positions.
In this way, the interface range corresponding to the man-machine interaction interface on the electronic device is obtained from the corner positions according to the camera internal parameter matrix and the camera external parameter matrix of the near-eye display device, which improves the accuracy and convenience of determining the interface range. After the interface range is determined, when a user subsequently controls the electronic device through the near-eye display device, the situation in which the user cannot control the electronic device because the corner positions determined from the target image deviate from the interface range corresponding to the man-machine interaction interface can be avoided, improving the accuracy and convenience of controlling the electronic device.
S103, determining a target operation position in the man-machine interaction interface according to the visual focus position.
For example, after the visual focus position of the user on the electronic device is determined, the target operation position in the man-machine interaction interface may be determined according to the visual focus position of the user on the man-machine interaction interface of the electronic device, in combination with the interface range.
In some embodiments, a projection of the visual focus position on the interface range corresponding to the human-computer interaction interface can be determined, so as to determine the target operation position in the human-computer interaction interface. As shown in fig. 2, the interface range corresponding to the man-machine interaction interface may be a three-dimensional region constructed from A(X_a, Y_a, Z_a), B(X_b, Y_b, Z_b), C(X_c, Y_c, Z_c) and D(X_d, Y_d, Z_d), and the visual focus position L(X_l, Y_l, Z_l) of the user on the man-machine interaction interface of the electronic device can be projected into this three-dimensional region, so as to convert the visual focus position into the target operation position in the man-machine interaction interface. Of course, the present invention is not limited thereto.
In this way, the target operation position in the man-machine interaction interface can be determined according to the visual focus position in combination with the interface range corresponding to the man-machine interaction interface, improving the convenience and accuracy of determining the target operation position. After the target operation position in the man-machine interaction interface is determined, the near-eye display device can interact with the man-machine interaction interface of the electronic device according to the target operation position. For example, the near-eye display device may further acquire operation information of the user on the near-eye display device based on the target operation position. The operation information may include a touch operation, a key operation, a gesture change operation, an eye movement trajectory, and the like, which are not limited herein. Based on a preset association between operation information and control instructions, the corresponding control instruction is determined according to the operation information of the user so as to control the electronic device, improving the convenience of controlling the electronic device.
Illustratively, the interface range includes a plurality of reference positions.
In some embodiments, a projected location of the visual focus position at the interface range is determined from the visual focus position and the plurality of reference positions; and determining the target operation position in the human-computer interaction interface according to the projection position.
For example, when the user controls the electronic device through the near-eye display device, the position of the user may be at a certain angle with respect to the position of the electronic device. For example, the user may directly face the electronic device while wearing the near-eye display device to control the electronic device through the near-eye display device. For another example, the user may face the electronic device from the side while wearing the near-eye display device in order to control the electronic device through the near-eye display device. When the angle between the position of the user and the position of the electronic device is different, the target operation position of the user in the human-machine interaction interface may also be different. Based on this, when determining the target operation position in the man-machine interaction interface, a plurality of reference positions can be selected from the interface range corresponding to the man-machine interaction interface, so that the projection position of the visual focus position of the user on the interface range corresponding to the man-machine interaction interface is determined according to the plurality of reference positions, and the target operation position is then determined.
As shown in fig. 2, the interface range corresponding to the man-machine interaction interface may be determined by the converted corner positions of corner A to corner D of the display of the electronic device, and a plurality of the converted corner positions may then be selected as reference positions. Taking the reference positions A(X_a, Y_a, Z_a), B(X_b, Y_b, Z_b) and D(X_d, Y_d, Z_d) as an example, the X axis corresponding to the man-machine interaction interface may be determined according to A(X_a, Y_a, Z_a) and B(X_b, Y_b, Z_b), and the Y axis corresponding to the man-machine interaction interface may be determined according to A(X_a, Y_a, Z_a) and D(X_d, Y_d, Z_d), so as to respectively determine the projection position of the visual focus L on the X axis and the projection position of the visual focus L on the Y axis. For example, a vector $\vec{AB}$ pointing from corner A to corner B, a vector $\vec{AD}$ pointing from corner A to corner D, and a vector $\vec{AL}$ pointing from corner A to the visual focus L may be determined:

$$\vec{AB} = (X_b - X_a, Y_b - Y_a, Z_b - Z_a)$$
$$\vec{AD} = (X_d - X_a, Y_d - Y_a, Z_d - Z_a)$$
$$\vec{AL} = (X_l - X_a, Y_l - Y_a, Z_l - Z_a)$$

According to the vectors $\vec{AB}$, $\vec{AD}$ and $\vec{AL}$, the projection position of the visual focus L on the X axis can be determined as

$$L_x = \frac{\vec{AL} \cdot \vec{AB}}{\lvert\vec{AB}\rvert}$$

and the projection position of the visual focus L on the Y axis can be determined as

$$L_y = \frac{\vec{AL} \cdot \vec{AD}}{\lvert\vec{AD}\rvert}$$

According to the projection positions L_x and L_y, the target operation position in the man-machine interaction interface can be determined.
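The axis-projection step can be sketched with a few lines of numpy. The corner and focus coordinates below are made-up example values on a flat interface plane.

```python
import numpy as np

# Made-up converted corner positions of corners A, B, D (mm) and a
# hypothetical visual focus position L inside the interface range.
A = np.array([  0.0,   0.0, 0.0])
B = np.array([300.0,   0.0, 0.0])
D = np.array([  0.0, 200.0, 0.0])
L = np.array([120.0,  50.0, 0.0])

AB, AD, AL = B - A, D - A, L - A

# Scalar projections of the focus onto the interface's X and Y axes.
Lx = np.dot(AL, AB) / np.linalg.norm(AB)
Ly = np.dot(AL, AD) / np.linalg.norm(AD)
print(Lx, Ly)   # → 120.0 50.0
```

Because the dot product is divided only by the axis length (not squared), Lx and Ly come out in the same physical units as the corner coordinates (here, millimeters along each axis).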
In this way, the visual focus position is processed using the reference positions included in the interface range corresponding to the man-machine interaction interface to obtain the target operation position of the user in the man-machine interaction interface, so that the influence of the included angle between the position of the user and the position of the electronic device on the target operation position can be comprehensively considered, improving the accuracy of determining the target operation position of the user in the man-machine interaction interface and the convenience and accuracy of controlling the electronic device.
In some implementations, a target operational location in a human-machine interaction interface is determined based on a projection location and a pixel density of a display of an electronic device.
For example, when the pixel density of the display of the electronic device is different, the determined target operation position in the human-computer interaction interface may be different according to the same projection position. Based on this, it is necessary to obtain the pixel density of the display of the electronic device to determine the target operation position in the human-computer interaction interface according to the projection position and the pixel density of the display.
For example, the device parameters of the display of the electronic device may be acquired. The device parameters may include a size parameter of the display and a screen resolution of the display. The size parameter may include a width W of the display and a height H of the display. The screen resolution may include a screen resolution width S_w of the display and a screen resolution height S_h of the display. The unit of the size parameter may be millimeters (mm), and the unit of the screen resolution may be pixels (px), which is not limited thereto. Based on the acquired width W of the display and screen resolution width S_w of the display, the pixel density of the display in the horizontal direction (e.g., the X-axis direction) can be determined as W_ppi = S_w / W. Based on the acquired height H of the display and screen resolution height S_h of the display, the pixel density of the display in the vertical direction (e.g., the Y-axis direction) can be determined as H_ppi = S_h / H.
Taking the visual focus L as an example, as shown in fig. 2, after the projection position L_x of the visual focus L on the X axis and the projection position L_y of the visual focus L on the Y axis are determined, L_x can be multiplied by the pixel density W_ppi of the display in the X-axis direction to determine the target operation position of the visual focus L on the X axis, and L_y can be multiplied by the pixel density H_ppi of the display in the Y-axis direction to determine the target operation position of the visual focus L on the Y axis. Based on this, the target operation position in the human-computer interaction interface can be determined; for example, the target operation position may be expressed as (X_L, Y_L), where:

$$X_L = L_x \cdot W_{ppi}, \qquad Y_L = L_y \cdot H_{ppi}$$
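The millimeter-to-pixel conversion above reduces to two multiplications. The display parameters and projection positions below are made-up example values.

```python
# Hypothetical device parameters of the display.
W, H = 300.0, 200.0     # physical width / height of the display (mm)
Sw, Sh = 1920, 1080     # screen resolution width / height (px)

W_ppi = Sw / W          # horizontal pixel density (px per mm)
H_ppi = Sh / H          # vertical pixel density (px per mm)

# Projection positions of the visual focus on the X and Y axes (mm),
# e.g. as obtained from the vector projections described earlier.
Lx, Ly = 120.0, 50.0

XL = Lx * W_ppi         # target operation position on the X axis (px)
YL = Ly * H_ppi         # target operation position on the Y axis (px)
print(XL, YL)
```

Note the patent's "ppi" symbol is kept for continuity, although with millimeter inputs the densities are pixels per millimeter rather than per inch.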
of course, the target operation position is not limited to this, and for example, when the visual focus position is a position of a visual focus other than the visual focus L, a different target operation position may be obtained, which is not limited herein.
Therefore, the target operation position in the man-machine interaction interface is determined according to the projection position and the pixel density of the display of the electronic equipment, the influence of the pixel density of the display of the electronic equipment on the target operation position in the man-machine interaction interface can be comprehensively considered, and the accuracy of determining the target operation position in the man-machine interaction interface by a user is improved, so that the convenience and accuracy of subsequent control of the electronic equipment are improved.
S104, controlling the electronic equipment to execute corresponding operation according to the target operation position.
For example, the user may perform different operations at different operation positions in the human-machine interaction interface. Under the condition that the target operation position is determined, the near-to-eye display device can determine that the user needs to operate the man-machine interaction interface at the target operation position, so as to control the electronic device to execute corresponding operation according to the operation of the user on the man-machine interaction interface.
In some embodiments, the near-eye display device may further acquire eye movement information of the user based on the target operation position, thereby determining an eye movement trajectory of the user. And determining a control instruction for the electronic equipment according to the target operation position and the eye movement information of the user, so as to control the electronic equipment to execute corresponding operation according to the control instruction.
Taking a human-computer interaction interface that is a video playing interface displayed by a computer as an example, in the case that the target operation position is the left edge area of the video playing interface, a display brightness adjustment instruction is associated with the left edge area of the video playing interface. When it is determined according to the eye movement track of the user that the line of sight of the user moves upwards from the target operation position, the near-eye display device can determine that the control instruction for the computer is to increase the display brightness of the video, and the near-eye display device can then control the computer to increase the display brightness of the video being played by the computer. Accordingly, when it is determined according to the eye movement track of the user that the line of sight of the user moves downwards from the target operation position, the near-eye display device can determine that the control instruction for the computer is to reduce the display brightness of the video, and the near-eye display device can control the computer to reduce the display brightness of the video being played by the computer.
And under the condition that the target operation position is the right edge area of the video playing interface, a playing volume adjusting instruction is associated with the right edge area of the video playing interface. When the sight line of the user moves upwards from the target operation position according to the eye movement track of the user, the near-eye display device can determine that the control instruction of the computer is to turn up the playing volume of the video, and then the near-eye display device can control the computer to turn up the playing volume of the video played by the computer. Accordingly, when it is determined that the line of sight of the user moves downward from the target operation position according to the eye movement track of the user, the near-eye display device can determine that the control instruction to the computer is to turn down the playing volume of the video, and the near-eye display device can control the computer to turn down the playing volume of the video being played by the computer.
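The edge-region behavior in the two examples above amounts to a small dispatch table: the target operation position selects the control dimension, and the gaze direction selects the adjustment. This is a hypothetical sketch; all region and command names are illustrative, not from the patent.

```python
def control_command(region: str, gaze_direction: str) -> str:
    """Map a target-position region plus eye-movement direction to a command."""
    # Region → control dimension, per the video-playing-interface examples.
    dimension = {"left_edge": "brightness", "right_edge": "volume"}.get(region)
    if dimension is None:
        return "none"
    # Upward gaze increases, downward gaze decreases.
    if gaze_direction == "up":
        return f"increase_{dimension}"
    if gaze_direction == "down":
        return f"decrease_{dimension}"
    return "none"

print(control_command("left_edge", "up"))     # → increase_brightness
print(control_command("right_edge", "down"))  # → decrease_volume
```

A table-driven mapping like this keeps the region/instruction association (the "preset association relation" mentioned earlier) in data rather than branching logic.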
Of course, the method is not limited thereto. Taking a man-machine interaction interface that is a television interface displayed by a television as an example, in the case that the target operation position is the channel selection list of the television interface, the near-eye display device can control the television to select the television channel to be played according to the corresponding control instruction, which is not limited herein.
Therefore, by controlling the electronic equipment to execute corresponding operation according to the target operation position, the near-to-eye display equipment can determine the control requirement of a user on the electronic equipment according to the target operation position so as to control the electronic equipment to execute corresponding operation, thereby being beneficial to improving the convenience and accuracy of controlling the electronic equipment.
In some embodiments, determining a control instruction corresponding to the electronic device according to operation information of a user on the near-eye display device; and controlling the electronic equipment to execute the operation corresponding to the control instruction at the target operation position.
For example, the user may perform the same operation at different operation positions in the human-machine interaction interface. Taking a human-computer interaction interface that is a document interface projected by a projector as an example, in the case that the target operation position is either a text area of the document interface or a blank area of the document interface, the near-eye display device can control the projector to display a document operation list on the document interface, or control the projector to highlight corresponding text on the document interface. Based on this, the near-eye display device can acquire the operation information of the user on the near-eye display device so as to determine the control requirement for the projector, such as determining the control instruction corresponding to the projector. For example, when the operation information of the user on the near-eye display device is a click operation, it may be determined that the control instruction is for instructing the projector to display the document operation list at the target operation position of the document interface, that is, the projector may be controlled to display the document operation list at the target operation position of the document interface. When the operation information of the user on the near-eye display device is a movement operation, it may be determined that the control instruction is for instructing the projector to highlight the text within the range where the target operation position is located in the document interface, that is, the projector may be controlled to highlight the text within the range where the target operation position is located in the document interface. Of course, the present invention is not limited thereto.
Therefore, the control instruction corresponding to the electronic equipment is determined according to the operation information of the user on the near-eye display equipment, so that the electronic equipment is controlled to execute the operation corresponding to the control instruction at the target operation position, and the convenience and accuracy of controlling the electronic equipment are improved.
In some embodiments, the control instruction corresponding to the electronic device is determined according to at least one of touch operation, key operation and gesture change operation of the near-eye display device by the user.
For example, the user may operate the near-eye display device in different modes of operation.
For example, the near-eye display device may be configured with a touch module, so that the user may touch the touch module, and the near-eye display device may determine the touch operation of the user on the near-eye display device according to the acquired touch information of the user on the touch module. When it is determined according to the touch operation that the user performs a sliding touch on the near-eye display device, the control instruction corresponding to the electronic device can be determined to be a movement control instruction, so that the electronic device can subsequently be controlled to execute the operation corresponding to the movement control instruction. When it is determined according to the touch operation that the user performs a long-press touch on the near-eye display device, the control instruction corresponding to the electronic device can be determined to be a long-press control instruction, so that the electronic device can be controlled to execute the operation corresponding to the long-press control instruction. Of course, the present invention is not limited thereto.
For example, the near-eye display device may be configured with a key module, so that the user may press the key module, and the near-eye display device may determine, according to the obtained information of the user pressing the key module, the key operation of the user on the near-eye display device. When the user presses the near-to-eye display device for a single time according to the key operation, the control instruction corresponding to the electronic device can be determined to be a click control instruction, so that the subsequent control electronic device can execute the operation corresponding to the click control instruction. When the user presses the near-to-eye display device for a long time according to the key operation, the control instruction corresponding to the electronic device can be determined to be a long-press control instruction, so that the electronic device can be controlled to execute the operation corresponding to the long-press control instruction. Of course, the present invention is not limited thereto.
For example, the near-eye display device may be configured with a gesture detection module to detect a gesture change condition of the user, and the near-eye display device may determine a gesture change operation of the user on the near-eye display device according to the gesture change condition of the user. The gesture detection module may include an IMU sensor, without limitation. When the user nods according to the gesture changing operation, the control instruction corresponding to the electronic device can be determined to be a click control instruction, so that the subsequent control electronic device can execute the operation corresponding to the click control instruction. When the user is determined to watch the target operation position for a long time according to the gesture changing operation, the control instruction corresponding to the electronic device can be determined to be a focusing control instruction, so that the electronic device can be controlled to execute the operation corresponding to the focusing control instruction. Of course, the present invention is not limited thereto.
For example, when at least two of a touch operation, a key operation and a gesture change operation of the user on the near-eye display device are detected, the near-eye display device may select one of the detected operations to determine the control instruction corresponding to the electronic device, or may determine the control instruction corresponding to the electronic device by combining at least two of the detected operations. For example, when the control instruction determined according to the touch operation is different from the control instruction determined according to the key operation, the operation whose detection time is closest to the current time may be selected to determine the control instruction for controlling the electronic device. Alternatively, when the control instruction determined according to the touch operation is different from the control instruction determined according to the key operation, the electronic device can be controlled to execute the operations corresponding to the different control instructions at the same time. Of course, the present invention is not limited thereto.
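The recency rule described above (when two operations map to conflicting instructions, prefer the most recently detected one) can be sketched as follows. The data shapes and names are illustrative assumptions, not the patent's structures.

```python
def resolve(operations, now):
    """Resolve conflicting operations.

    operations: list of (detected_at, instruction) tuples, e.g. from the
    touch module and the key module. Returns the instruction to execute.
    """
    instructions = {instr for _, instr in operations}
    if len(instructions) == 1:
        # All detected operations agree: no conflict to resolve.
        return instructions.pop()
    # Conflict: choose the operation with the smallest time interval
    # between its detection time and the current time.
    return min(operations, key=lambda op: now - op[0])[1]

# A long-press key operation detected at t=12.0 wins over an older
# click touch operation detected at t=10.0 when the current time is 13.0.
print(resolve([(10.0, "click"), (12.0, "long_press")], now=13.0))
```

Executing both instructions simultaneously, as the alternative behavior describes, would simply return the full instruction set instead of a single winner.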
Therefore, the control instruction corresponding to the electronic equipment is determined according to at least one of touch operation, key operation and gesture change operation of the user on the near-eye display equipment, so that the user can flexibly select different operation modes to operate the near-eye display equipment according to the occupation condition of the hands of the user, the control instruction corresponding to the electronic equipment is determined, and the control flexibility and convenience of the electronic equipment are improved.
The control method of the electronic device provided in the above embodiment includes: acquiring a target image obtained by shooting electronic equipment by near-eye display equipment; according to the target image and eye movement information of the user, determining a visual focus position of a human-computer interaction interface of the electronic equipment by the user; determining a target operation position in the man-machine interaction interface according to the visual focus position; and controlling the electronic equipment to execute corresponding operation according to the target operation position so as to improve the convenience of controlling the electronic equipment.
Referring to fig. 3, fig. 3 is a schematic block diagram of a control device of an electronic apparatus according to an embodiment of the present application. The control device of the electronic device may be configured in a server or a near-eye display device, for executing the control method of the electronic device.
As shown in fig. 3, the control device of the electronic apparatus includes: the acquisition module 110, the first determination module 120, the second determination module 130, and the control module 140.
The acquiring module 110 is configured to acquire a target image obtained by capturing an electronic device with a near-eye display device.
The first determining module 120 is configured to determine, according to the target image and eye movement information of the user, a visual focus position of the user on a human-computer interaction interface of the electronic device.
And the second determining module 130 is configured to determine a target operation position in the human-computer interaction interface according to the visual focus position.
And the control module 140 is used for controlling the electronic equipment to execute corresponding operation according to the target operation position.
The first determination module 120, for example, includes an interface range determination sub-module and a visual focus determination sub-module.
An interface range determining sub-module, configured to determine an interface range corresponding to a man-machine interaction interface on the electronic device according to the target image;
and the visual focus determining submodule is used for determining the visual focus position of the user in the interface range according to the eye movement information.
Illustratively, the interface range determination submodule includes a corner position determination submodule and a corner position conversion submodule.
And the angular point position determining submodule is used for determining the angular point position of the display of the electronic equipment according to the target image.
And the angular point position conversion sub-module is used for converting the angular point position according to the shooting parameters of the near-eye display device to obtain an interface range corresponding to the man-machine interaction interface on the electronic device.
Illustratively, the corner position conversion submodule includes a parameter conversion submodule.
And the parameter conversion sub-module is used for carrying out conversion processing on the angular point positions according to the camera internal parameter matrix of the near-eye display equipment and the camera external parameter matrix of the near-eye display equipment to obtain an interface range corresponding to the man-machine interaction interface on the electronic equipment.
Illustratively, the interface range includes at least one reference position.
The second determination module 130 includes a projection position determination sub-module and an operation position determination sub-module.
And the projection position determining sub-module is used for determining the projection position of the visual focus position in the interface range according to the visual focus position and the plurality of reference positions.
And the operation position determining sub-module is used for determining the target operation position in the man-machine interaction interface according to the projection position.
Illustratively, the operational position determination submodule includes an operational position operator module.
And the operation position operation sub-module is used for determining a target operation position in the man-machine interaction interface according to the projection position and the pixel density of a display of the electronic equipment.
The control module 140, illustratively, includes an instruction determination sub-module and a control sub-module.
The instruction determining submodule is used for determining a control instruction corresponding to the electronic equipment according to the operation information of the user on the near-eye display equipment.
And the control sub-module is used for controlling the electronic equipment to execute the operation corresponding to the control instruction at the target operation position.
It should be noted that, for convenience and brevity of description, specific working processes of the apparatus, each module, and unit described above may refer to corresponding processes in the foregoing method embodiments, which are not repeated herein.
The methods of the present application are operational with numerous general purpose or special purpose computer system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The above-described methods and apparatus may be implemented, for example, in the form of a computer program executable on a near-eye display device. By way of example, the near-eye display device may include VR glasses, AR glasses, VR helmets, AR helmets, and the like, without limitation.
Referring to fig. 4, fig. 4 is a schematic block diagram of a near-eye display device according to an embodiment of the present application.
As shown in fig. 4, the near-eye display device includes a memory and a processor. The memory and the processor may be connected by a system bus, and the memory may include a storage medium and an internal memory.
The storage medium may store an operating system and a computer program. When executed, the computer program may cause the processor to perform any of the control methods of an electronic device described herein.
The processor is configured to provide computing and control capabilities to support operation of the entire near-eye display device.
The internal memory provides an environment for running the computer program stored in the storage medium; when executed by the processor, the computer program causes the processor to perform any of the control methods of an electronic device described herein.
Those skilled in the art will appreciate that the structure shown in fig. 4 is merely a block diagram of a portion of the structure associated with the present application and does not constitute a limitation of the near-eye display device to which the present application is applied, and that a particular near-eye display device may include more or fewer components than shown in the figures, or may combine certain components, or have a different arrangement of components.
It should be appreciated that the processor may be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
In one embodiment, the processor is configured to execute a computer program and, when the computer program is executed, implement the following steps:
acquiring a target image obtained by photographing the electronic device with the near-eye display device;
determining, according to the target image and eye movement information of the user, a visual focus position of the user on the human-computer interaction interface of the electronic device;
determining a target operation position in the human-computer interaction interface according to the visual focus position;
and controlling the electronic device to execute a corresponding operation according to the target operation position.
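The second and third steps above amount to mapping a gaze point measured in camera-image coordinates into the pixel coordinate frame of the device's display. As a minimal, hypothetical sketch (not the claimed implementation; the function names, corner ordering, and use of a four-point homography are all assumptions), this mapping can be computed from the display corners detected in the target image:

```python
import numpy as np

def solve_homography(src, dst):
    """Direct linear transform: homography mapping four src points to four dst points."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of this 8x9 system (smallest singular value).
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def focus_to_operation_position(corners_in_image, focus_in_image, screen_size):
    """Map a visual focus point (camera-image coordinates) to a display pixel.

    corners_in_image: the display's four corners detected in the target image,
                      ordered top-left, top-right, bottom-right, bottom-left.
    focus_in_image:   the user's visual focus position in the same image.
    screen_size:      (width, height) of the device display in pixels.
    """
    w, h = screen_size
    screen_corners = [(0, 0), (w, 0), (w, h), (0, h)]
    H = solve_homography(corners_in_image, screen_corners)
    x, y, s = H @ np.array([focus_in_image[0], focus_in_image[1], 1.0])
    return (x / s, y / s)

# A focus point at the centre of the detected display maps to the screen centre.
corners = [(100, 100), (500, 100), (500, 300), (100, 300)]
pos = focus_to_operation_position(corners, (300, 200), (1920, 1080))
```

With real data, `corners` would come from corner detection on the target image and the focus point from the eye tracker; the homography then absorbs both perspective distortion and the display's resolution in a single transform.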
It should be noted that, for convenience and brevity of description, the specific working process of the near-eye display device described above may refer to the corresponding process in the foregoing embodiments of the control method of an electronic device, and is not repeated here.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the processor implements the control method of an electronic device according to any of the embodiments of the present application.
The computer-readable storage medium may be an internal storage unit of the near-eye display device of the foregoing embodiments, for example, a hard disk or memory of the near-eye display device. The computer-readable storage medium may also be an external storage device of the near-eye display device, for example, a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash memory card (Flash Card), or the like provided on the near-eye display device.
It is to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and encompasses any and all possible combinations of one or more of the associated listed items. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present application are for description only and do not represent the merits of the embodiments. While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A control method of an electronic device, characterized by being applied to a near-eye display device, the control method comprising:
acquiring a target image obtained by photographing the electronic device with the near-eye display device;
determining, according to the target image and eye movement information of a user, a visual focus position of the user on a human-computer interaction interface of the electronic device;
determining a target operation position in the human-computer interaction interface according to the visual focus position;
and controlling the electronic device to execute a corresponding operation according to the target operation position.
2. The control method according to claim 1, wherein the determining, according to the target image and the eye movement information of the user, the visual focus position of the user on the human-computer interaction interface of the electronic device includes:
determining an interface range corresponding to a human-computer interaction interface on the electronic equipment according to the target image;
and determining the visual focus position of the user in the interface range according to the eye movement information.
3. The control method according to claim 2, wherein the determining, according to the target image, an interface range corresponding to a human-computer interaction interface on the electronic device includes:
determining corner positions of a display of the electronic equipment according to the target image;
and converting the corner positions according to shooting parameters of the near-eye display equipment to obtain the interface range corresponding to the human-computer interaction interface on the electronic equipment.
4. The control method according to claim 3, wherein the converting the corner positions according to the shooting parameters of the near-eye display device to obtain the interface range corresponding to the human-computer interaction interface on the electronic device includes:
converting the corner positions according to a camera intrinsic parameter matrix of the near-eye display device and a camera extrinsic parameter matrix of the near-eye display device to obtain the interface range corresponding to the human-computer interaction interface on the electronic device.
5. The control method of claim 2, wherein the interface range includes a plurality of reference positions;
the determining, according to the visual focus position, a target operation position in the man-machine interaction interface includes:
determining a projection position of the visual focus position in the interface range according to the visual focus position and the plurality of reference positions;
and determining a target operation position in the man-machine interaction interface according to the projection position.
6. The control method according to claim 5, wherein determining the target operation position in the human-computer interaction interface according to the projection position includes:
and determining a target operation position in the man-machine interaction interface according to the projection position and the pixel density of a display of the electronic equipment.
7. The control method according to any one of claims 1 to 6, characterized in that the controlling the electronic apparatus to perform the corresponding operation according to the target operation position includes:
determining a control instruction corresponding to the electronic equipment according to the operation information of the user on the near-eye display equipment;
and controlling the electronic equipment to execute the operation corresponding to the control instruction at the target operation position.
8. A control device of an electronic apparatus, characterized in that the control device comprises:
the acquisition module is used for acquiring a target image obtained by shooting the electronic equipment by the near-eye display equipment;
the first determining module is used for determining, according to the target image and the eye movement information of the user, the visual focus position of the user on the human-computer interaction interface of the electronic equipment;
the second determining module is used for determining a target operation position in the man-machine interaction interface according to the visual focus position;
and the control module is used for controlling the electronic equipment to execute corresponding operation according to the target operation position.
9. A near-eye display device, the near-eye display device comprising a memory and a processor;
the memory is used for storing a computer program;
the processor being configured to execute the computer program and to implement the control method of the electronic device according to any one of claims 1 to 7 when the computer program is executed.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, realizes the steps of the control method of an electronic device as claimed in any one of claims 1 to 7.
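Claim 6 derives the target operation position from the projection position together with the display's pixel density. As an illustrative sketch only (the claim fixes no units, so a projection position in millimetres from the display's top-left corner and a density in pixels per inch are assumptions made here):

```python
MM_PER_INCH = 25.4

def projection_to_pixel(projection_mm, ppi):
    """Convert a projected position on the display plane (assumed to be in
    millimetres from the top-left corner) into a pixel coordinate using the
    display's pixel density in pixels per inch (PPI)."""
    px_per_mm = ppi / MM_PER_INCH
    x_mm, y_mm = projection_mm
    return (round(x_mm * px_per_mm), round(y_mm * px_per_mm))

# A display with 254 PPI has exactly 10 pixels per millimetre.
pixel = projection_to_pixel((50.0, 30.0), 254)
```

The same conversion works in reverse for validating that a computed pixel coordinate falls inside the physical display area.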
CN202311509833.6A 2023-11-13 2023-11-13 Control method, device, equipment and storage medium of electronic equipment Pending CN117590942A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311509833.6A CN117590942A (en) 2023-11-13 2023-11-13 Control method, device, equipment and storage medium of electronic equipment

Publications (1)

Publication Number Publication Date
CN117590942A (en)

Family

ID=89919255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311509833.6A Pending CN117590942A (en) 2023-11-13 2023-11-13 Control method, device, equipment and storage medium of electronic equipment

Country Status (1)

Country Link
CN (1) CN117590942A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination