CN110187773B - Augmented reality glasses control method, apparatus, and computer storage medium - Google Patents

Augmented reality glasses control method, apparatus, and computer storage medium

Info

Publication number
CN110187773B
Authority
CN
China
Prior art keywords
key
current
function
mode
variable parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910481603.0A
Other languages
Chinese (zh)
Other versions
CN110187773A (en)
Inventor
李学文
刘洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongkehai Micro Beijing Technology Co ltd
Original Assignee
Zhongkehai Micro Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongkehai Micro Beijing Technology Co ltd
Priority to CN201910481603.0A
Publication of CN110187773A
Application granted
Publication of CN110187773B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to an augmented reality glasses control method, apparatus, and computer storage medium. An increase key, a decrease key and a function key are configured on the line control of the augmented reality glasses, and the method includes the following steps: when it is determined that the trigger state information of the function key satisfies a set mode switching condition, switching the current function mode of the augmented reality glasses according to the trigger state information of the function key and running the switched mode; when it is determined that the increase key is touched, jumping the currently set variable parameter value in the current function mode up according to the set gear value; and when it is determined that the decrease key is touched, jumping the currently set variable parameter value in the current function mode down according to the set gear value.

Description

Augmented reality glasses control method, apparatus, and computer storage medium
Technical Field
The present application relates to the field of smart terminal technology, and for example, to a method, an apparatus, and a computer storage medium for augmented reality glasses control.
Background
Augmented Reality (AR) technology seamlessly integrates real-world information with virtual-world information: entity information (visual information, sound, taste, touch and the like) that would otherwise be difficult to experience within a given span of time and space in the real world is simulated by computers and other technologies and then superimposed onto the real world, where the virtual information is perceived by human senses, producing a sensory experience that goes beyond reality.
Augmented reality glasses, i.e., glasses adopting augmented reality technology, have gradually entered daily life. With the development of AR technology, augmented reality glasses provide more and more functions, for example: camera capture, face recognition and license plate recognition. More functions would normally require more function keys, but the space on AR glasses is limited. How to realize the complex and varied functions of AR glasses through a small number of simple keys is therefore a problem to be solved urgently.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of the embodiments; it merely serves as a prelude to the more detailed description presented later.
The embodiments of the disclosure provide a method, an apparatus, a device, a computer program product and a computer-readable storage medium for controlling augmented reality glasses, in order to realize the complex and varied functions of AR glasses through a small number of simple keys.
In some optional embodiments, the line control of the augmented reality glasses is configured with an increase key, a decrease key and a function key, and the method includes:
when it is determined that the trigger state information of the function key satisfies a set mode switching condition, switching the current function mode of the augmented reality glasses according to the trigger state information of the function key and running the switched mode;
when it is determined that the increase key is touched, jumping the currently set variable parameter value in the current function mode up according to the set gear value;
and when it is determined that the decrease key is touched, jumping the currently set variable parameter value in the current function mode down according to the set gear value.
The embodiment of the disclosure provides a device for controlling augmented reality glasses.
In some optional embodiments, the line control of the augmented reality glasses is configured with an increase key, a decrease key and a function key, and the apparatus includes:
a function switching unit configured to switch the current function mode of the augmented reality glasses according to the trigger state information of the function key and run the switched mode, when it is determined that the trigger state information of the function key satisfies a set mode switching condition;
an increase control unit configured to jump the currently set variable parameter value in the current function mode up according to the set gear value when it is determined that the increase key is touched;
and a decrease control unit configured to jump the currently set variable parameter value in the current function mode down according to the set gear value when it is determined that the decrease key is touched.
The embodiment of the disclosure provides augmented reality glasses control equipment.
In some optional embodiments, the augmented reality glasses control apparatus comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein:
the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, cause the at least one processor to perform the augmented reality glasses control method described above.
The disclosed embodiments provide a computer program product.
In some alternative embodiments, the computer program product comprises a computer program stored on a computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the augmented reality glasses control method described above.
The disclosed embodiments provide a computer-readable storage medium.
In some alternative embodiments, the computer-readable storage medium stores computer-executable instructions configured to perform the augmented reality glasses control method described above.
Some technical solutions provided by the embodiments of the present disclosure can achieve the following technical effects:
By configuring the AR glasses with a line control carrying an increase key, a decrease key and a function key, the multiple functions of the AR glasses can be controlled. In this way, the complex and varied functions of the AR glasses are realized through a small number of simple keys, the size of the AR glasses can be further reduced, and the intelligence and user experience of the AR glasses are improved.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example, and not by way of limitation, in the accompanying drawings, in which elements having the same reference numerals designate like elements, and wherein:
fig. 1 is a schematic flowchart of a method for controlling augmented reality glasses according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an augmented reality glasses control method provided in an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of an augmented reality glasses control method provided in an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an augmented reality glasses control apparatus provided in an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
In the embodiment of the disclosure, complex and various functions of the AR glasses can be realized through a small number of simple line control keys.
Fig. 1 is a schematic flowchart of a method for controlling augmented reality glasses according to an embodiment of the present disclosure. The line control of the AR glasses is provided with an increase key, a decrease key and a function key. As shown in fig. 1, the process of augmented reality glasses control includes:
Step 101: when it is determined that the trigger state information of the function key satisfies the set mode switching condition, switch the current function mode of the augmented reality glasses according to the trigger state information of the function key and run the switched mode.
AR glasses have a variety of functions, for example: camera, photographing and video recording, face recognition, license plate recognition, identification card recognition, and picture or video browsing. In principle, each function could be controlled by its own dedicated key, but the space on AR glasses is small while the number of functions keeps growing. Therefore, in the embodiments of the disclosure, some functions of the AR glasses are controlled only by the keys on the line control of the AR glasses, and other functions are controlled jointly by a terminal bound to the AR glasses and the keys on the line control.
Therefore, when the trigger state information of the function key on the line control of the AR glasses satisfies the set mode switching condition, the key-based line control process of the AR glasses can be entered; that is, the current function mode of the augmented reality glasses is switched and run according to the trigger state information of the function key. The trigger state information may include: a point touch, a trigger lasting longer than a set time, or two or more triggers within a set time. For example: the increase key is point-touched; the function key is triggered and the trigger lasts more than 4 seconds; or the decrease key is triggered twice within 1.5 seconds. Each of these is trigger state information corresponding to the respective key, as sketched below.
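As an illustrative sketch (not part of the original disclosure), the following Python fragment shows one way a line control driver might classify a key's trigger state from press/release timestamps; the threshold values and names are assumptions chosen only to match the examples above.

```python
from dataclasses import dataclass
from typing import List

LONG_PRESS_S = 3.0          # assumed "set time" for a long press
MULTI_PRESS_WINDOW_S = 1.0  # assumed window for counting repeated presses

@dataclass
class KeyEvent:
    pressed_at: float   # seconds
    released_at: float  # seconds

def classify_trigger(events: List[KeyEvent]) -> str:
    """Return 'long_press', 'multi_press' or 'point_touch' for one key."""
    if not events:
        return "none"
    last = events[-1]
    if last.released_at - last.pressed_at >= LONG_PRESS_S:
        return "long_press"
    recent = [e for e in events
              if last.released_at - e.pressed_at <= MULTI_PRESS_WINDOW_S]
    if len(recent) >= 2:
        return "multi_press"
    return "point_touch"

# Example: two short presses 0.4 s apart are reported as a multi-press.
print(classify_trigger([KeyEvent(0.0, 0.1), KeyEvent(0.4, 0.5)]))
```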
For example: if the function key on the line control of the AR glasses is triggered and the trigger lasts longer than a set time, for example 3 seconds, it can be determined that the trigger state information of the function key satisfies the set mode switching condition. Alternatively, if the function key on the line control is triggered twice or more in succession within 1 second, it can also be determined that the trigger state information of the function key satisfies the set mode switching condition.
Thus, the key-based line control process is entered when the trigger state information of the function key satisfies the set mode switching condition. The function modes that can be controlled through the line control may be configured on the AR glasses in advance, for example a first set function mode, a second set function mode, a third set function mode, and so on; which function modes are configured can be determined according to the specific implementation of each function. The first set function mode may be the default function mode.
For example: the function modes configured for line control on the AR glasses include camera, photographing and video recording, face recognition, license plate recognition, and the like, and the camera function mode is the default function mode; that is, the camera function mode runs after the AR glasses are started.
In this way, once it is determined that the trigger state information of the function key satisfies the set mode switching condition, each touch of the function key switches cyclically among the configured line-controllable function modes. For example: after the AR glasses are started, the camera function mode runs; if the function key on the line control is long-pressed for more than three seconds, it is determined that the trigger state information of the function key satisfies the set mode switching condition. Then, if the function key is touched once and not triggered again within 3 seconds, the current function mode is switched to the photographing and video recording function mode, and that mode is run. If, within the 3 seconds, the function key is touched twice, three times or four times, the current function mode is switched to the face recognition function mode, the license plate recognition function mode or back to the camera function mode respectively; that is, the switching cycles through the configured line-controllable function modes, as sketched below.
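A minimal sketch, assuming the mode list and touch counting described above (the mode names and ordering are illustrative assumptions, not the patent's required configuration):

```python
MODES = ["camera", "photo_video", "face_recognition", "license_plate_recognition"]

def switch_mode(current_mode: str, touch_count: int) -> str:
    """Advance cyclically through the configured modes by touch_count positions."""
    i = MODES.index(current_mode)
    return MODES[(i + touch_count) % len(MODES)]

# From the example: starting from "camera", one touch selects photographing and
# video recording, three touches select license plate recognition.
assert switch_mode("camera", 1) == "photo_video"
assert switch_mode("camera", 3) == "license_plate_recognition"
```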
The AR glasses then run the switched current function mode. Once the switched function mode is running, a further switch again requires that the trigger state information of the function key satisfy the set mode switching condition. When that condition is satisfied, the current function mode can be switched again, and the mode running before the switch now serves as the default starting point. For example: the AR glasses are running the face recognition function; if the function key is triggered twice in succession within two seconds, the set mode switching condition is satisfied, with the face recognition function as the starting point; if the function key is then touched once and not triggered again within three seconds, the current function mode is switched to the license plate recognition function, and the license plate recognition function runs.
Step 102: when it is determined that the increase key is touched, jump the currently set variable parameter value in the current function mode up according to the set gear value.
Step 103: when it is determined that the decrease key is touched, jump the currently set variable parameter value in the current function mode down according to the set gear value.
The current function mode is determined through the function key on the line control of the AR glasses. At this point, if the increase key on the line control is touched, the currently set variable parameter value in the current function mode is jumped up by one gear according to the set gear value; if the decrease key on the line control is touched, the currently set variable parameter value in the current function mode is jumped down by one gear according to the set gear value.
In each function mode, one, two or more parameters may be controllable through the line control, corresponding to one or more of a first set variable parameter, a second set variable parameter, a third set variable parameter, and so on. The first set variable parameter may be taken as the default variable parameter; that is, when the function mode starts to run, the first set variable parameter is the one being controlled.
In this way, when a function mode starts running, touching the increase key on the line control raises the first set variable parameter value by one gear, and touching the decrease key lowers it by one gear. For example: the current function mode is the camera function mode and the corresponding first set variable parameter is the volume, so touching the increase key raises the volume by one gear and touching the decrease key lowers it by one gear. Or, if the current function mode is the license plate recognition function mode and the corresponding first set variable parameter is the brightness, touching the increase key raises the brightness by one gear and touching the decrease key lowers it by one gear.
Each gear corresponds to a different variable parameter value. Repeatedly triggering the increase key raises the value at most to the set maximum variable parameter value, i.e., the highest gear, and repeatedly triggering the decrease key lowers it at most to the set minimum variable parameter value, i.e., the lowest gear.
For example, when the current function mode is the photographing and video recording function mode, the first set variable parameter is a mode parameter whose gears are the photographing function and the video recording function respectively. Touching the increase key cycles upward between the photographing function and the video recording function, and touching the decrease key cycles downward between them. Both the clamped and the cyclic gear stepping are sketched below.
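The following is a minimal sketch of "jump up / jump down by one gear": continuous parameters such as volume or brightness clamp at the lowest and highest gears, while a mode parameter such as photographing-vs-recording wraps around cyclically. The gear tables are illustrative assumptions, not values taken from the patent.

```python
VOLUME_GEARS = list(range(0, 11))   # assumed gears 0..10 for a continuous parameter
MODE_GEARS = ["photo", "video"]     # cyclic gears of the mode parameter

def step(gears, current, delta, cyclic=False):
    """Move one position up (delta=+1) or down (delta=-1) in a gear table."""
    i = gears.index(current)
    if cyclic:
        return gears[(i + delta) % len(gears)]
    return gears[max(0, min(len(gears) - 1, i + delta))]

print(step(VOLUME_GEARS, 10, +1))                 # already at the highest gear -> stays 10
print(step(MODE_GEARS, "video", +1, cyclic=True))  # wraps back to "photo"
```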
When the current function mode has two or more set variable parameters, control should not be limited to the default first set variable parameter. Therefore, when the current function mode is the first set function mode, and it is determined that the increase key is triggered and its trigger state information satisfies a first set parameter switching condition, the currently set variable parameter is changed from the first set variable parameter to the second set variable parameter; or it is changed back from the second set variable parameter to the first set variable parameter.
Likewise, when the current function mode is the first set function mode, and it is determined that the decrease key is triggered and its trigger state information satisfies a second set parameter switching condition, the currently set variable parameter is changed from the first set variable parameter to a third set variable parameter; or it is changed back from the third set variable parameter to the first set variable parameter. The first set parameter switching condition and the second set parameter switching condition may be the same or different.
For example: when the current function mode is the first set function mode, i.e., the camera function, the currently set variable parameter may be the volume. If the increase key on the line control is now triggered and the trigger lasts longer than 3 seconds, it is determined that the first set parameter switching condition is satisfied, and the currently set variable parameter is switched from the volume to the brightness, the brightness being the second set variable parameter. From then on, triggering the increase key or touching the decrease key jumps the brightness value up or down according to the set gear value. Similarly, if the AR glasses are running the camera function with the brightness as the currently set variable parameter, and the increase key on the line control is triggered for more than 3 seconds, brightness adjustment is exited; that is, the currently set variable parameter is switched back from the brightness to the volume.
Or, if the AR glasses are running the camera function with the volume as the currently set variable parameter, and the decrease key on the line control is triggered twice in succession within 1 second, it is determined that the trigger state information of the decrease key satisfies the second set parameter switching condition, and the currently set variable parameter is switched from the volume to the talkback parameter, the talkback parameter being the third set variable parameter. From then on, triggering the increase key or touching the decrease key jumps the talkback volume value up or down according to the set gear value. Similarly, if the AR glasses are running the camera function with the talkback parameter as the currently set variable parameter, and the decrease key on the line control is triggered twice in succession within 1 second, it is determined that the second set parameter switching condition is satisfied, and the currently set variable parameter is switched back from the talkback parameter to the volume. This parameter switching is sketched below.
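A minimal sketch of swapping which variable parameter the increase/decrease keys currently control in the first set (camera) function mode; the parameter names and trigger conditions mirror the example above and are assumptions for illustration only.

```python
def switch_current_parameter(current: str, key: str, trigger: str) -> str:
    # Long press of the increase key toggles volume <-> brightness
    # (first set parameter switching condition in the example).
    if key == "increase" and trigger == "long_press":
        return {"volume": "brightness", "brightness": "volume"}.get(current, current)
    # Double press of the decrease key toggles volume <-> talkback
    # (second set parameter switching condition in the example).
    if key == "decrease" and trigger == "multi_press":
        return {"volume": "talkback", "talkback": "volume"}.get(current, current)
    return current

assert switch_current_parameter("volume", "increase", "long_press") == "brightness"
assert switch_current_parameter("talkback", "decrease", "multi_press") == "volume"
```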
During line control of the AR glasses, touching the increase key or the decrease key controls the currently set variable parameter value in the current function mode; touching the function key at this point is generally ineffective. This is not limiting, however: in some set function modes, touching the function key on the line control during operation performs a corresponding control. For example: when the AR glasses are running the photographing and video recording function and the currently set variable parameter is the mode parameter, triggering the increase key or touching the decrease key cycles upward or downward between the photographing function and the video recording function, while touching the function key executes the corresponding operation action. If the mode parameter is set to the photographing function, touching the function key confirms the selected picture; if the mode parameter is set to the video recording function, touching the function key starts or ends the recording. Therefore, in some embodiments of the present disclosure, when the current function mode is a second set function mode, the method may further include: determining the current operation action in the second set function mode according to the trigger state information of the function key, and executing the current operation action.
When AR glasses start up, they generally run the default function mode, i.e., the first set function mode; for example, the camera function mode runs after the AR glasses are started. When the AR glasses are running another set function mode, for example the photographing and video recording function mode or the license plate recognition function mode, the line control can be used to return directly to the first set function mode. In some embodiments of the present disclosure, when the current function mode is not the first set function mode, and it is determined that the decrease key is triggered and its trigger state information satisfies a set exit condition, the method further includes: switching the current function mode to the first set function mode. For example: when the AR glasses are running the photographing and video recording function mode or the license plate recognition function mode, if the decrease key on the line control is triggered and the trigger lasts longer than 3 seconds, the AR glasses are switched back to the camera function mode, as sketched below.
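A minimal sketch of this "exit to default" behaviour; the default mode name and the long-press condition are assumptions taken from the example above.

```python
DEFAULT_MODE = "camera"

def maybe_exit_to_default(mode: str, key: str, trigger: str) -> str:
    """Return to the first set (default) function mode on a long press of the decrease key."""
    if mode != DEFAULT_MODE and key == "decrease" and trigger == "long_press":
        return DEFAULT_MODE
    return mode

assert maybe_exit_to_default("license_plate_recognition", "decrease", "long_press") == "camera"
```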
When the switched function mode is the first set function mode, after switching the function mode of the augmented reality glasses, the method further includes: determining the first set variable parameter value as the currently set variable parameter value. That is, when the AR glasses are switched to run the first set function mode, the first set variable parameter is the one controlled through the line control.
In practical applications, when it is determined that the trigger state information of the function key satisfies the set mode switching condition, corresponding AR display content and AR display position parameters may additionally be determined, and AR display is performed based on them. Further, updated AR display content and updated AR display position parameters are determined based on the trigger state information of the increase key, the decrease key and the function key on the line control, together with the current AR display content, and the AR display is updated based on the updated AR display content and the updated AR display position parameters.
For example, if the function key on the line control of the AR glasses is triggered and the trigger lasts longer than a set time period, for example 3 seconds, it is determined that the trigger state information of the function key satisfies the set mode switching condition. At this point, AR display content may be generated, which may include at least one of: a mode list, a variable parameter value adjustment diagram, and a mode parameter list.
The display position of the AR display content may be set below the field of view, above the field of view, or at another set position. For example, the mode list may be arranged horizontally, with the current default function placed first in the list. The AR display position parameters may include the area occupied by the AR display content within the effective pixel area of the current pixel coordinate system of the AR device, where the current pixel coordinate system is determined from the frame image currently acquired by the AR device; the way the pixel coordinate system is determined differs between video see-through and optical see-through AR devices. The AR display position parameter may simply be the coordinates of the boundary points of the AR display content, which reduces the amount of information to be stored and avoids wasting storage space, as sketched below.
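An illustrative sketch (not part of the original disclosure) of storing the AR display position parameter as the boundary corner coordinates of each piece of AR content in the current pixel coordinate system; the field names and the assumed 1920x1080 effective pixel area are illustrative.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ARDisplayItem:
    content_id: str               # e.g. "mode_list"
    top_left: Tuple[int, int]     # (u, v) boundary corner in the current pixel coordinate system
    bottom_right: Tuple[int, int] # (u, v) of the opposite boundary corner

# The mode list rendered as a horizontal strip below the field of view.
mode_list_item = ARDisplayItem("mode_list", (200, 950), (1720, 1030))
print(mode_list_item)
```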
In some examples, updated AR display content and updated AR display position parameters may be determined based on the trigger state information of the increase key, the decrease key and the function key, together with the current AR display content, and the AR display may be updated based on the updated AR display content and the updated AR display position parameters.
For example, after the mode switching condition has been satisfied, if the function key is touched once and not touched again within three seconds, it is determined that the mode is to be switched. The switched function mode is determined, the selection cursor of the mode list is moved to the newly selected function mode, the AR display position parameters remain unchanged, and the AR display continues.
In some examples, if the increase key or the decrease key is touched and the trigger state information satisfies neither the first set parameter switching condition nor the second set parameter switching condition, the variable parameter value adjustment diagram corresponding to the currently set parameter is determined, the AR content corresponding to that diagram is added to the AR display content, and the selected position of the cursor on the diagram is determined from the trigger result of the increase key or the decrease key. For example, if the diagram is the volume adjustment diagram and the volume becomes 10 after the increase key or the decrease key is triggered, the cursor selects 10 in the diagram. The display position corresponding to the adjustment diagram is added to the AR display position parameters.
When the increase key or the decrease key is touched and the trigger state information satisfies the first set parameter switching condition or the second set parameter switching condition, the currently corresponding variable parameter list is determined, the AR content corresponding to that list is added to the AR display content, and the selected position of the cursor on the list is determined from the trigger result of the increase key or the decrease key. For example, the variable parameter list may include the three variable parameters brightness, talkback and volume; if the trigger result indicates that talkback should now be selected, the cursor is placed on talkback. The display position corresponding to the variable parameter list is added to the AR display position parameters.
For example, when the current function mode is the photographing and video recording function mode, the first set variable parameter is the mode parameter, whose gears are the photographing function and the video recording function respectively; touching the increase key cycles upward between them and touching the decrease key cycles downward between them. In this case, the currently corresponding mode parameter list is determined from the trigger result of the increase key or the decrease key, the AR content corresponding to the mode parameter list is added to the AR display content, and the selected position of the cursor on the list is determined from the trigger result. For example, the mode parameter list may include the photographing function and the video recording function; if the trigger result indicates that the photographing function should now be selected, the cursor is placed on the photographing function, and the display position corresponding to the list is added to the AR display position parameters. In other examples, when new AR content needs to be added to the AR display content based on the trigger state information of the increase key, the decrease key and the function key, the AR content it replaces may be deleted, and the corresponding display position removed from the AR display position parameters. For example, when the mode parameter list is displayed, the mode list is deleted so that the picture does not become too crowded and affect the display and control effect. Such a display update is sketched below.
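A minimal sketch of the display-update step: the current AR content is kept as a small dictionary, a newly required list (e.g. the mode parameter list) is added with its cursor position, and the content it replaces (e.g. the mode list) is removed so the picture does not become crowded. All names are illustrative assumptions.

```python
from typing import Optional

def update_ar_display(display: dict, new_key: str, entries: list,
                      selected: str, replaces: Optional[str] = None) -> dict:
    """Add a newly required list with its cursor and drop the content it replaces."""
    if replaces is not None and replaces in display:
        del display[replaces]                 # e.g. remove the stale mode list
    display[new_key] = {"entries": entries, "cursor": entries.index(selected)}
    return display

display = {"mode_list": {"entries": ["camera", "photo_video"], "cursor": 1}}
display = update_ar_display(display, "mode_param_list", ["photo", "video"],
                            selected="photo", replaces="mode_list")
print(display)   # only the mode parameter list remains, with the cursor on "photo"
```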
In this way, in the embodiments of the disclosure, the line control process is visualized through the AR display, making it more convenient for the user to control the functions of the AR glasses.
In addition, by providing the AR glasses with a line control carrying an increase key, a decrease key and a function key, switching among the various functions of the AR glasses, such as camera, photographing and video recording, face recognition and license plate recognition, can be achieved without control from the terminal bound to the AR glasses. In this way, the complex and varied functions of the AR glasses are realized through a small number of simple line control keys, and the intelligence and user experience of the AR glasses are improved.
Of course, the functions of the AR glasses are not limited to those controlled through the line control. With the development of intelligent technology, AR glasses offer many functions, some of which require the terminal bound to the AR glasses to set parameter data or to perform other control. In that case, the current function mode of the AR glasses is determined after the terminal sends an operation instruction to the AR glasses. For example: AR navigation is started on the terminal bound to the AR glasses and a destination is entered; the terminal generates a corresponding operation instruction and sends it to the AR glasses, so that the AR glasses run the corresponding current function mode, namely AR navigation, according to the received operation instruction. Or, photo browsing is started on the terminal bound to the AR glasses; the terminal generates a corresponding operation instruction and sends it to the AR glasses, so that the AR glasses run photo browsing according to the received instruction. Thus, when the current function mode being run is determined according to a received operation instruction, the current operation information corresponding to the current trigger state information of the increase key, the decrease key or the function key in the current function mode can be determined from the stored correspondence between function modes, key trigger state information and operation information, and the glasses operate according to that current operation information, where the operation instruction is generated by the terminal bound to the augmented reality glasses.
Since the AR glasses have many functions, the correspondence between key trigger state information and operation information in each function mode can be stored in advance. The current operation information corresponding to the current trigger state information of the increase key, the decrease key or the function key in the current function mode is then determined from the stored correspondence, and the glasses operate according to it.
Table 1 shows the correspondence between function modes, key trigger state information and operation information provided in an embodiment of the present disclosure.
TABLE 1 (provided as an image in the original publication; contents not reproduced here)
As shown in Table 1, after the current function mode and the current trigger state information of each key are determined, the corresponding current operation information can be determined according to Table 1, and the glasses then operate according to that operation information; a minimal lookup is sketched below.
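Because the table itself is only available as an image, the following sketch uses a small illustrative correspondence table assembled from the worked examples elsewhere in this description; the entries and names are assumptions, not the patent's stored table.

```python
CORRESPONDENCE = {
    # (function mode, key, trigger state) -> operation information
    ("id_card_recognition", "increase", "long_press"):  "adjust brightness",
    ("photo_browsing",      "increase", "point_touch"): "show next picture",
    ("photo_video",         "function", "point_touch"): "start or stop recording",
}

def current_operation(mode: str, key: str, trigger: str) -> str:
    return CORRESPONDENCE.get((mode, key, trigger), "no operation")

print(current_operation("id_card_recognition", "increase", "long_press"))
```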
For example: the terminal sends an operation instruction carrying the identification card recognition function mode, so the AR glasses determine the identification card recognition function mode from the received operation instruction and run it. If the increase key is then triggered and the trigger lasts longer than 3 seconds, the current operation information determined according to Table 1 is to adjust the brightness, and subsequent point touches of the increase key and the decrease key continue the corresponding control.
In this way, other functions of the AR glasses can be realized through instructions from the terminal combined with key line control. This avoids making the line control overly complex by switching every function through it, and avoids the long selection times during function switching that would harm the user experience.
The operation flows described above are now combined into specific embodiments to illustrate the control method provided by the embodiments of the present disclosure.
In one embodiment of the disclosure, an increase key (+), a decrease key (-) and a function key (▲) are configured on the line control of the AR glasses, and the correspondence between function modes, key trigger state information and operation information, for example as shown in Table 1, is stored. The function modes configured for line control include: camera, photographing and video recording, face recognition, license plate recognition, and the like, where the camera function mode is the default function mode; that is, the first set function mode is the camera function mode, and the second set function mode is the photographing and video recording function mode.
Fig. 2 is a schematic flowchart of a method for controlling augmented reality glasses according to an embodiment of the present disclosure. As shown in fig. 2, the process of augmented reality glasses control includes:
Step 201: determine whether the current function mode was determined according to a received operation instruction. If not, go to step 202; if so, go to step 214.
Step 202: determine whether the function key is triggered with a trigger time exceeding 3 seconds. If so, go to step 203; otherwise, go to step 204.
Step 203: switch the current function mode of the augmented reality glasses according to the trigger state information of the function key and run the switched mode. Proceed to step 205.
For example: the current function mode is the camera function mode; if the function key is touched once within 2 seconds, the current function mode is switched to the photographing and video recording function mode, and if the function key is touched three times within 2 seconds, the current function mode is switched directly to the license plate recognition function mode.
Step 204: determine whether the current function mode is the photographing and video recording function mode. If so, go to step 205; otherwise, go to step 206.
Step 205: according to the trigger state information of the function key, confirm the selected picture or start/stop the video recording. Proceed to step 206.
Step 206: determine whether the increase key is touched. If so, go to step 207; otherwise, go to step 208.
Step 207: jump the currently set variable parameter value in the current function mode up according to the set gear value, and go to step 210.
Step 208: determine whether the increase key is triggered in the camera function mode with a trigger time exceeding 3 seconds. If so, go to step 209; otherwise, go to step 210.
Step 209: switch the currently set variable parameter between the first set variable parameter and the second set variable parameter. Proceed to step 210.
Step 210: determine whether the decrease key is touched. If so, go to step 211; otherwise, go to step 212.
Step 211: jump the currently set variable parameter value in the current function mode down according to the set gear value, and end the process.
Step 212: determine whether the decrease key is triggered in the camera function mode with a trigger time exceeding 3 seconds. If so, go to step 213; otherwise, the process ends.
Step 213: switch the currently set variable parameter between the first set variable parameter and the third set variable parameter. The process ends.
Step 214: determine the current operation information corresponding to the current trigger state information of the increase key, the decrease key or the function key in the current function mode according to the stored correspondence between function modes, key trigger state information and operation information, and operate according to the current operation information.
That is, the current operation information corresponding to the current trigger state information of the increase key, the decrease key or the function key is determined according to Table 1, and the glasses operate according to it; the overall dispatch is sketched below.
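A minimal sketch, loosely following the Fig. 2 flow, of how a single key event could be dispatched: modes set by a terminal operation instruction go through the stored correspondence table (step 214), while locally controlled modes handle the function, increase and decrease keys directly (steps 202-213). The mode list, the tiny correspondence table and the simplified conditions are illustrative assumptions.

```python
MODES = ["camera", "photo_video", "face_recognition", "license_plate_recognition"]
TABLE = {("photo_browsing", "increase", "point_touch"): "show next picture"}

def dispatch(state: dict, key: str, trigger: str, touches: int = 1) -> str:
    if state.get("from_terminal"):                                   # step 201 -> 214
        return TABLE.get((state["mode"], key, trigger), "no operation")
    if key == "function" and trigger == "long_press":                # steps 202-203
        i = MODES.index(state["mode"])
        state["mode"] = MODES[(i + touches) % len(MODES)]
        return f"switched to {state['mode']}"
    if key == "increase" and trigger == "point_touch":               # steps 206-207
        return "jump current parameter up one gear"
    if key == "decrease" and trigger == "point_touch":               # steps 210-211
        return "jump current parameter down one gear"
    return "no operation"

print(dispatch({"mode": "camera", "from_terminal": False}, "function", "long_press"))
```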
Therefore, in the embodiment, by configuring the line control with the additional keys, the reduction keys and the function keys for the AR glasses, various functions of the AR glasses can be realized, the size of the AR glasses is further reduced, and the intelligence and the user experience of the AR glasses are improved.
In one embodiment of the disclosure, the line control of the AR glasses is provided with an increase key (+), a decrease key (-) and a function key (▲). The function modes configured for line control include: camera, photographing and video recording, face recognition, license plate recognition, and the like, where the camera function mode is the default function mode. The terminal bound to the AR glasses can generate an operation instruction carrying a function mode; that is, the current function mode of the AR glasses can also be determined according to an operation instruction generated by the terminal.
In this embodiment, whether the current function mode is set through the keys on the line control or determined according to an operation instruction generated by the terminal, the correspondence between function modes, key trigger state information and operation information is stored.
Table 2 shows the correspondence between function modes, key trigger state information and operation information provided in an embodiment of the present disclosure.
TABLE 2 (provided as an image in the original publication; contents not reproduced here)
Fig. 3 is a schematic flowchart of a method for controlling augmented reality glasses according to an embodiment of the present disclosure. As shown in fig. 3, the process of augmented reality glasses control includes:
Step 301: determine the current function mode and the current key trigger state information of the keys.
The keys on the line control include the increase key, the decrease key and the function key. The key trigger state information includes: a point touch, a trigger lasting more than 3 seconds, or two triggers within 1 second. The current function mode may be a preset line-controllable function mode, or may be determined according to an operation instruction transmitted from the terminal.
Step 302: determine the current operation information corresponding to the current function mode and the current key trigger state information according to the stored correspondence between function modes, key trigger state information and operation information.
As shown in Table 2, if the current function mode is photo browsing and the current trigger state information is a point touch of the increase key, the current operation information is determined to be displaying the next picture; if the current function mode is video recording in the photographing and video recording mode and the current trigger state information is a point touch of the function key, the current operation information is determined to be starting the recording.
Step 303: operate according to the current operation information.
For example: the AR glasses may display the next picture, or start recording, or stop recording, etc.
It can be seen that, in the embodiments of the present disclosure, the correspondence between function modes, key trigger state information and operation information is stored, so that whether the function mode of the AR glasses is set through the keys on the line control or determined according to a received operation instruction generated by the terminal, the current operation information corresponding to the current function mode and the current key trigger state information can be looked up and executed. Function control of the AR glasses is thus realized quickly and simply, the various functions of the AR glasses can be provided, the size of the AR glasses is further reduced, and the intelligence and user experience of the AR glasses are improved.
An apparatus for augmented reality glasses control may be constructed according to a method for augmented reality glasses control.
Fig. 4 is a schematic structural diagram of an augmented reality glasses control device according to an embodiment of the present disclosure. The line control of the augmented reality glasses is configured with an increase key, a decrease key and a function key. As shown in fig. 4, the augmented reality glasses control device may include: a function switching unit 100, an increase control unit 200, and a decrease control unit 300.
The function switching unit 100 is configured to switch the current function mode of the augmented reality glasses according to the trigger state information of the function key and run the switched mode, when it is determined that the trigger state information of the function key satisfies the set mode switching condition.
The increase control unit 200 is configured to jump the currently set variable parameter value in the current function mode up according to the set gear value when it is determined that the increase key is touched.
The decrease control unit 300 is configured to jump the currently set variable parameter value in the current function mode down according to the set gear value when it is determined that the decrease key is touched.
In some embodiments of the present disclosure, when the current function mode is the first set function mode and it is determined that the increase key is triggered and its trigger state information satisfies the first set parameter switching condition, the increase control unit 200 is further configured to change the currently set variable parameter from the first set variable parameter to the second set variable parameter, or from the second set variable parameter back to the first set variable parameter.
In some embodiments of the present disclosure, when the current function mode is the first set function mode and it is determined that the decrease key is triggered and its trigger state information satisfies the second set parameter switching condition, the decrease control unit 300 is further configured to change the currently set variable parameter from the first set variable parameter to the third set variable parameter, or from the third set variable parameter back to the first set variable parameter.
In some embodiments of the present disclosure, when the current function mode is the first set function mode, the function switching unit 100 is further configured to determine the first set variable parameter as the currently set variable parameter.
In some embodiments of the disclosure, the apparatus further comprises an AR display unit configured to: determine the corresponding augmented reality (AR) display content and AR display position parameters when it is determined that the trigger state information of the function key satisfies the set mode switching condition, and perform AR display based on them;
and determine updated AR display content and updated AR display position parameters based on the trigger state information of the increase key, the decrease key and the function key together with the current AR display content, and update the AR display based on the updated AR display content and the updated AR display position parameters.
In some embodiments of the disclosure, the apparatus further comprises a function execution unit configured to, when the current function mode is the second set function mode, determine the current operation action in the second set function mode according to the trigger state information of the function key, and execute the current operation action.
In some embodiments of the present disclosure, the apparatus further comprises a function return unit configured to, when the current function mode is not the first set function mode and it is determined that the decrease key is triggered and its trigger state information satisfies the set exit condition, switch the current function mode to the first set function mode.
In some embodiments of the present disclosure, the apparatus further comprises a matching operation unit configured to, when the current function mode being run is determined according to a received operation instruction, determine the current operation information corresponding to the current trigger state information of the increase key, the decrease key or the function key in the current function mode according to the stored correspondence between function modes, key trigger state information and operation information, and operate according to the current operation information, where the operation instruction is generated by the terminal bound to the augmented reality glasses.
It can be seen that, in this embodiment, by configuring the AR glasses with a line control carrying an increase key, a decrease key and a function key, the various functions of the AR glasses can be realized. In this way, the complex and varied functions of the AR glasses are realized through a small number of simple line control keys, and the intelligence and user experience of the AR glasses are improved. In addition, the line control process is visualized through the AR display, making it more convenient for the user to control the functions of the AR glasses.
The embodiment of the present disclosure also provides a computer-readable storage medium storing computer-executable instructions configured to execute the above augmented reality glasses control method.
The disclosed embodiments also provide a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, cause the computer to perform the above-described augmented reality glasses control method.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
An embodiment of the present disclosure further provides an electronic device, a structure of which is shown in fig. 5, including:
at least one processor 1000 (one processor 1000 is illustrated in fig. 5) and a memory 1001, and may further include a communication interface 1002 and a bus 1003. The processor 1000, the communication interface 1002 and the memory 1001 may communicate with each other via the bus 1003. The communication interface 1002 may be used for information transfer. The processor 1000 may call logic instructions in the memory 1001 to perform the augmented reality glasses control method of the above embodiments.
In addition, the logic instructions in the memory 1001 may be implemented in the form of software functional units and, when sold or used as an independent product, stored in a computer-readable storage medium.
The memory 1001 is a computer readable storage medium and can be used for storing software programs, computer executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 1000 executes functional applications and data processing by running software programs, instructions and modules stored in the memory 1001, that is, implements the augmented reality glasses control method in the above method embodiment.
The memory 1001 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. Further, the memory 1001 may include a high-speed random access memory and may also include a nonvolatile memory.
The technical solution of the embodiments of the present disclosure may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes one or more instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present disclosure. The aforementioned storage medium may be a non-transitory storage medium, including a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code, and may also be a transitory storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims.

Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but may not be the same element. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method, or apparatus that comprises the element.

In this document, each embodiment may be described with emphasis on its differences from other embodiments, and the same and similar parts of the respective embodiments may be referred to one another. For the methods, products, etc. disclosed in the embodiments, if they correspond to the method section disclosed in the embodiments, reference may be made to the description of the method section for the relevant parts.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be merely a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (9)

1. A method for controlling augmented reality glasses, characterized in that an increase key, a decrease key, and a function key are configured on a wired controller of the augmented reality glasses, and the method comprises the following steps:
under the condition that the trigger state information of the function key meets a set mode switching condition, switching the current function mode of the augmented reality glasses and operating according to the trigger state information of the function key;
under the condition that the increase key is determined to be touched, jumping the current set variable parameter value in the current function mode up by a set gear value;
under the condition that the decrease key is determined to be touched, jumping the current set variable parameter value in the current function mode down by the set gear value;
determining corresponding augmented reality (AR) display content and AR display position parameters under the condition that the trigger state information of the function key is determined to meet the set mode switching condition, and performing AR display based on the AR display content and the AR display position parameters; and
determining updated AR display content and updated AR display position parameters based on the trigger state information of the increase key, the decrease key, and the function key and the current AR display content, and updating the AR display based on the updated AR display content and the updated AR display position parameters.
2. The method according to claim 1, wherein when the current function mode is the first set function mode, in a case that it is determined that the increase key is triggered and the trigger state information satisfies a first set parameter switching condition, the method further comprises:
replacing the current set variable parameter from a first set variable parameter with a second set variable parameter; or
replacing the current set variable parameter from the second set variable parameter with the first set variable parameter.
3. The method according to claim 1, wherein when the current function mode is the first set function mode, in a case that it is determined that the decrease key is triggered and the trigger state information satisfies a second set parameter switching condition, the method further comprises:
replacing the current set variable parameter from the first set variable parameter with a third set variable parameter; or
replacing the current set variable parameter from the third set variable parameter with the first set variable parameter.
4. The method of claim 1, wherein when the current function mode is a second set function mode, the method further comprises:
determining the current operation action in the second set function mode according to the trigger state information of the function key, and executing the current operation action.
5. The method of claim 1, wherein when the current function mode is not the first set function mode, and it is determined that the decrease key is triggered and the trigger state information satisfies a set exit condition, the method further comprises:
switching the current function mode to the first set function mode.
6. The method of claim 1, wherein in the event that the current function mode of operation is determined from a received operation instruction, the method further comprises:
determining current operation work information corresponding to the current trigger state information of the increase key, the decrease key, or the function key in the current function mode according to the stored correspondence among function modes, key trigger state information, and operation work information, and operating according to the current operation work information, wherein the operation instruction is generated by a terminal bound to the augmented reality glasses.
7. A control device for augmented reality glasses, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, cause the at least one processor to perform the method of any one of claims 1-6.
8. An electronic device comprising computer program instructions which, when executed by a processor, implement the steps of the method of claims 1-6.
9. A computer-readable storage medium having computer-executable instructions stored thereon, the computer-executable instructions configured to perform the method of any one of claims 1-6.
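Purely as a non-authoritative illustration of the key-handling flow recited in claims 1 and 5 above, the following Python sketch models the stepwise (gear-value) increase and decrease of a current set variable parameter value, the function-key mode switching, and the decrease-key exit back to the first set function mode; the mode names, gear value, parameter range, and trigger conditions are assumptions made for the sketch and are not taken from the claims.

GEAR_VALUE = 10          # assumed "set gear value" for stepwise jumps
PARAM_MIN, PARAM_MAX = 0, 100   # assumed range of the set variable parameter value

class GlassesController:
    def __init__(self):
        self.mode = "first_set_mode"   # assumed name of the first set function mode
        self.param = 50                # current set variable parameter value

    def on_function_key(self, trigger_state):
        # Claim 1: if the function-key trigger state meets the set mode
        # switching condition, switch the current function mode.
        if trigger_state == "long_press":          # assumed switching condition
            self.mode = ("second_set_mode"
                         if self.mode == "first_set_mode" else "first_set_mode")

    def on_increase_key(self):
        # Claim 1: jump the current set variable parameter value up by one gear.
        self.param = min(self.param + GEAR_VALUE, PARAM_MAX)

    def on_decrease_key(self, trigger_state="short_press"):
        # Claim 5: outside the first set mode, a decrease-key trigger that meets
        # the set exit condition returns to the first set function mode.
        if self.mode != "first_set_mode" and trigger_state == "long_press":
            self.mode = "first_set_mode"
            return
        # Claim 1: otherwise jump the parameter value down by one gear.
        self.param = max(self.param - GEAR_VALUE, PARAM_MIN)

controller = GlassesController()
controller.on_increase_key()                # parameter 50 -> 60
controller.on_function_key("long_press")    # switch to the second set mode
controller.on_decrease_key("long_press")    # exit back to the first set mode
print(controller.mode, controller.param)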
CN201910481603.0A 2019-06-04 2019-06-04 Augmented reality glasses control method, apparatus, and computer storage medium Active CN110187773B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910481603.0A CN110187773B (en) 2019-06-04 2019-06-04 Augmented reality glasses control method, apparatus, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910481603.0A CN110187773B (en) 2019-06-04 2019-06-04 Augmented reality glasses control method, apparatus, and computer storage medium

Publications (2)

Publication Number Publication Date
CN110187773A CN110187773A (en) 2019-08-30
CN110187773B true CN110187773B (en) 2022-07-29

Family

ID=67720203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910481603.0A Active CN110187773B (en) 2019-06-04 2019-06-04 Augmented reality glasses control method, apparatus, and computer storage medium

Country Status (1)

Country Link
CN (1) CN110187773B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085223A (en) * 2020-08-04 2020-12-15 深圳市新辉煌智能科技有限责任公司 Guidance system and method for mechanical maintenance

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2901452Y (en) * 2006-02-10 2007-05-16 南京联慧通信技术有限公司 Radio controller with radio earphone mounted on automobile steering wheel
US20110246871A1 (en) * 2010-03-31 2011-10-06 Lenovo (Singapore) Pte.Ltd. Optimized reading experience on clamshell computer
KR20170112497A (en) * 2016-03-31 2017-10-12 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN107438804B (en) * 2016-10-19 2019-07-12 深圳市大疆创新科技有限公司 It is a kind of for controlling the wearable device and UAV system of unmanned plane
CN106791008A (en) * 2016-11-29 2017-05-31 珠海格力电器股份有限公司 Shortcut use pattern changing method, device and mobile phone for mobile phone
CN108734939B (en) * 2018-05-31 2020-09-01 广东美的制冷设备有限公司 Electronic equipment control method and device, electronic equipment and readable storage medium
CN109177682A (en) * 2018-07-26 2019-01-11 奇瑞新能源汽车技术有限公司 A kind of automobile DVD button multiplexing system and method

Also Published As

Publication number Publication date
CN110187773A (en) 2019-08-30

Similar Documents

Publication Publication Date Title
JP6985339B2 (en) Scene-based vibration feedback method and mobile device
CN107930122B (en) Information processing method, device and storage medium
CN108536353B (en) Interface display control method, device and storage medium
CN112135181B (en) Video preview method and device and electronic equipment
EP3144041A1 (en) Video game processing program, video game processing system and video game processing method
CN108897881B (en) Interactive image display method, device, equipment and readable storage medium
CN112684970B (en) Adaptive display method and device of virtual scene, electronic equipment and storage medium
CN111228810A (en) Control method and device of virtual rocker, electronic equipment and storage medium
CN108345484A (en) The system and method for customizing personalized user interface for using face recognition
US11385791B2 (en) Method and device for setting layout of icon of system interface of mobile terminal, and medium
EP2939411B1 (en) Image capture
CN110187773B (en) Augmented reality glasses control method, apparatus, and computer storage medium
CN112363658A (en) Interaction method and device for video call
CN113778304B (en) Method and device for displaying layer, electronic equipment and computer readable storage medium
CN113262476B (en) Position adjusting method and device of operation control, terminal and storage medium
CN113786607A (en) Interface display method, device, terminal and storage medium
CN114816692A (en) Screen projection display method and device, mobile terminal and storage medium
WO2013103968A2 (en) Touchscreen controller
CN114095611B (en) Processing method and device of caller identification interface, electronic equipment and storage medium
CN103019419A (en) Electronic equipment and method for regulating touch area thereof
CN105892897A (en) Terminal operation method and equipment
CN115237323A (en) Interface display method and device, electronic equipment and storage medium
CN114341774A (en) Dynamic eye tracking camera alignment using eye tracking maps
CN104951378B (en) A kind of control method and terminal
CN113318429A (en) Control method and device for quitting game, processor and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200420

Address after: Room 1146, 11th floor, research complex building, Institute of computing technology, Chinese Academy of Sciences, No. 6, South Road, Haidian District, Beijing 100000

Applicant after: Wang Yi

Address before: Room 16B101, Room 813, Changlin, Xisanqi, Haidian District, Beijing

Applicant before: BEIJING SEENGENE TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20210316

Address after: Room 1146, 11 / F, research complex building, Institute of computing technology, Chinese Academy of Sciences, No. 6, South Road, Haidian District, Beijing

Applicant after: Zhongkehai micro (Beijing) Technology Co.,Ltd.

Address before: Room 1146, 11 / F, research complex building, Institute of computing technology, Chinese Academy of Sciences, No. 6, South Road, Haidian District, Beijing

Applicant before: Wang Yi

GR01 Patent grant