CN112732078A - Electronic equipment operation method and device and electronic equipment

Electronic equipment operation method and device and electronic equipment

Info

Publication number
CN112732078A
Authority
CN
China
Prior art keywords
plane
target
operation object
finger
input
Prior art date
Legal status
Pending
Application number
CN202011613325.9A
Other languages
Chinese (zh)
Inventor
王建辉
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202011613325.9A
Publication of CN112732078A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/12 Fingerprints or palmprints
    • G06V 40/1347 Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an electronic equipment operation method and device, and electronic equipment, and belongs to the technical field of communication. The method comprises the following steps: acquiring information of an operation object under the condition that the electronic equipment is in an air operation mode; receiving a first input of the operation object under the condition that the acquired information of the operation object meets a preset condition; and in response to the first input, determining that the space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to the plane where the screen of the electronic equipment is located, or the included angle between the target space operation plane and the plane where the screen is located is smaller than a preset angle. In this way, the user can operate the electronic equipment in space without contact, which reduces the risk of disease infection caused by contact; moreover, the target space operation plane can be locked for the user based on the information of the operation object acquired in the air operation mode and the user input, which improves the user's air operation experience.

Description

Electronic equipment operation method and device and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to an electronic equipment operation method and device, and electronic equipment.
Background
At present, electronic equipment is mostly operated through a touch screen, which requires the user's finger to contact the screen. However, pathogenic organisms such as viruses and bacteria easily remain on the screen of the electronic equipment, so contact operation carries a risk of disease infection.
Disclosure of Invention
An object of the embodiments of the present application is to provide an electronic device, an operating method and an operating device thereof, which can overcome the problem of risk of disease infection in the existing contact operation.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an operation method of an electronic device, where the method includes:
under the condition that the electronic equipment is in an air-separating operation mode, acquiring information of an operation object;
receiving a first input of the operation object under the condition that the acquired information of the operation object meets a preset condition;
and responding to the first input, and determining that a space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to a plane where the screen of the electronic equipment is located, or an included angle between the target space operation plane and the plane where the screen of the electronic equipment is located is smaller than a preset angle.
In a second aspect, an embodiment of the present application provides an operating device for an electronic device, including:
the first acquisition module is used for acquiring the information of an operation object under the condition that the electronic equipment is in an air-separating operation mode;
the first receiving module is used for receiving first input of the operation object under the condition that the acquired information of the operation object meets preset conditions;
and the execution module is used for responding to the first input and determining that the space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to the plane where the screen of the electronic equipment is located, or an included angle between the target space operation plane and the plane where the screen of the electronic equipment is located is smaller than a preset angle.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or an instruction stored on the memory and executable on the processor, and when executed by the processor, the program or the instruction implements the steps of the operation method of the electronic device according to the first aspect.
In a fourth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the operating method of the electronic device according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, information of an operation object is acquired under the condition that the electronic equipment is in the air operation mode; a first input of the operation object is received under the condition that the acquired information of the operation object meets a preset condition; and in response to the first input, the space plane where the operation object is located is determined to be the target space operation plane, wherein the target space operation plane is parallel to the plane where the screen of the electronic equipment is located, or the included angle between the two planes is smaller than a preset angle. Therefore, the user can operate the electronic equipment in space without contact, which reduces the risk of disease infection caused by contact; moreover, the target space operation plane can be locked for the user based on the information of the operation object acquired in the air operation mode and the user input, which improves the user's air operation experience.
Drawings
Fig. 1 is a flowchart of an operation method of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a movement track of a mobile camera in a process of collecting finger information according to an embodiment of the present disclosure;
FIG. 3a is a schematic diagram of an operation process of the operation plane of the locking space provided by the embodiment of the present application;
FIG. 3b is a schematic view of an operation interface of the operation plane of the locking space provided in the embodiment of the present application;
fig. 4 is a schematic structural diagram of an operating device of an electronic device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of another electronic device provided in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that embodiments of the application may be practiced in sequences other than those illustrated or described herein, and that the terms "first," "second," and the like are generally used herein in a generic sense and do not limit the number of terms, e.g., the first term can be one or more than one. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
The following describes in detail an operation method of an electronic device according to an embodiment of the present application with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flowchart of an operation method of an electronic device according to an embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
step 101, acquiring information of an operation object under the condition that the electronic equipment is in the air-separating operation mode.
In the embodiment of the present application, the operation object may include an object that can be used for space operation, such as a finger and a stylus. Of course, in order to ensure safety in use and prevent misoperation, the operation object may be a specific operation object that is specified in advance and is available for space operation, such as a finger, or a specified finger with a fingerprint registered in advance. For convenience of understanding, the specific embodiments of the present application will be described below mainly with a finger as an operation object, but the operation object is not limited to only a finger.
The air operation mode may be a mode configured on the electronic equipment in which air (contactless) operation can be performed. The air operation mode may be activated by the electronic equipment based on a trigger operation of the user, or may be activated based on a specific event.
The information of the operation object may include an image, a biological characteristic, and the like of the operation object, and may specifically be specific information that needs to be used, such as a finger image, fingerprint information, and the like.
The above-mentioned information of obtaining the operation object may be information of obtaining an operation object currently collected by the electronic device, and specifically, the electronic device may be configured with a photoelectric module for collecting information of the operation object, and the photoelectric module may be a module capable of collecting information of the operation object or capable of following information of collecting the operation object, such as a front camera, a mobile camera under a screen, and the like.
And 102, receiving a first input of the operation object under the condition that the acquired information of the operation object meets a preset condition.
That is to say, in order to determine the target space operation plane desired by the user, it may first be determined whether information of the operation object, such as finger information, has been collected, and whether the currently collected information of the operation object satisfies the preset condition for locking the space operation plane. Only when the preset condition is satisfied is the first input executed by the user through the operation object received and responded to, and the space plane where the operation object is located locked.
Specifically, after the information of the operation object is acquired, whether the currently acquired information meets the preset condition may be determined by analysing whether the information reflects that the user's finger is placed within the effective collection range of the electronic equipment, that is, whether the electronic equipment can currently collect complete information of the operation object; or by further matching a biological characteristic of the current operation object, such as the finger's fingerprint, against the fingerprint registered for air operation, so as to determine whether the finger currently used by the user is the registered air operation finger.
That is, the preset condition may include any one of:
the acquired information of the operation object is a complete operation object image;
the acquired information of the operation object is complete biological characteristics;
the acquired information of the operation object is a biological characteristic which is complete and is matched with a preset biological characteristic.
Specifically, when the preset condition includes that the acquired information of the operation object is a complete operation object, the information of the operation object may include an operation object image, and it may be determined whether the preset condition is currently satisfied by analyzing whether the acquired operation object image is complete, for example, taking a finger as an example, and determining whether the finger image includes a complete finger tip image.
When the preset condition includes that the acquired information of the operation object is a complete biological feature, the information of the operation object may include the biological feature of the operation object, and whether the preset condition is currently met may be determined by analyzing whether the acquired biological feature of the operation object is complete, for example, by taking a finger as an example, and by judging whether the acquired fingerprint of the finger is complete.
When the preset condition includes that the acquired information of the operation object is a complete biological feature that matches a preset biological feature, the information of the operation object may include the biological feature of the operation object, and whether the preset condition is currently satisfied may be determined by analysing whether the acquired biological feature is complete and whether it matches the preset biological feature, for example, taking a finger as an example, by judging whether the acquired fingerprint is complete and matches the pre-registered fingerprint.
Thus, the preset condition may include any one of the three conditions, and under different preset conditions, whether the information of the operation object meets the preset condition or not may be correspondingly identified in different manners, and the method is applicable to different user requirement scenarios, and ensures the effectiveness of the user in the air separation operation. In addition, through the judgment of the preset conditions, the target space operation plane which meets the user's expectation can be determined to meet the actual operation requirements of the user.
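For illustration only, the check of the three alternative preset conditions described above can be sketched as follows; the patent does not prescribe an implementation, so the data structure, coverage thresholds and fingerprint matcher used here are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperationObjectInfo:
    image_coverage: float            # fraction of the fingertip visible in the captured image, 0..1
    fingerprint_coverage: float      # fraction of the fingerprint captured, 0..1
    fingerprint_template: Optional[bytes] = None  # extracted biometric, if available

def matches_enrolled(template: bytes, enrolled: bytes) -> bool:
    # Placeholder matcher; a real device would call its own fingerprint-matching engine here.
    return template == enrolled

def meets_preset_condition(info: OperationObjectInfo, condition: str,
                           enrolled: Optional[bytes] = None,
                           complete_threshold: float = 0.99) -> bool:
    """Check one of the three alternative preset conditions listed above."""
    if condition == "complete_image":
        return info.image_coverage >= complete_threshold
    if condition == "complete_biometric":
        return info.fingerprint_coverage >= complete_threshold
    if condition == "complete_matching_biometric":
        return (info.fingerprint_coverage >= complete_threshold
                and info.fingerprint_template is not None
                and enrolled is not None
                and matches_enrolled(info.fingerprint_template, enrolled))
    return False
```

Which of the three branches applies would be decided by the configuration of the electronic equipment for the current scenario.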
In this step, when the acquired information of the operation object meets a preset condition, it indicates that the current operation object is in an effective operation position, or the information of the operation object is effective operation object information, so that a user may execute a first input for triggering locking of a spatial plane where the operation object is located, and accordingly, the electronic device may receive the first input of the operation object.
The first input may be a user input for locking the current space operation plane, for example a specific gesture, a specific voice instruction, or an input of moving the finger to the spatial position corresponding to a lock key; it may also be an input in which the user's finger stays at the current position for more than a preset time period. For example, when the user moves the finger to a spatial position suitable for operation and keeps it there for 3 seconds, locking of the space operation plane where the current finger position is located may be triggered.
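A minimal sketch of the dwell-based first input described above is given below; the 3-second duration comes from the example in the text, while the jitter tolerance, polling interval and position callback are assumptions.

```python
import time

DWELL_SECONDS = 3.0          # the 3-second example given above
POSITION_TOLERANCE_MM = 5.0  # allowed jitter while "staying"; an assumed value

def detect_dwell(sample_position, now=time.monotonic, poll_s=0.03):
    """Return True once the tracked finger stays near one spot for DWELL_SECONDS.

    `sample_position()` is a hypothetical callback returning the finger's current
    (x, y, z) position in millimetres from the photoelectric module, or None when
    the finger leaves the collection range.
    """
    anchor = sample_position()
    if anchor is None:
        return False
    start = now()
    while True:
        time.sleep(poll_s)
        pos = sample_position()
        if pos is None:
            return False
        drift = sum((a - b) ** 2 for a, b in zip(pos, anchor)) ** 0.5
        if drift > POSITION_TOLERANCE_MM:
            anchor, start = pos, now()   # finger moved: restart timing at the new spot
        elif now() - start >= DWELL_SECONDS:
            return True
```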
Optionally, after the step 101 and before the step 102, the method further includes:
and under the condition that the acquired information of the operation object does not meet the preset condition, controlling the photoelectric module of the electronic equipment to move to a target position, wherein the acquired information of the operation object at the target position meets the preset condition.
In an alternative embodiment, the electronic device may be configured with a movable photoelectric module, for example a movable camera, an under-screen movable camera, or another module capable of following and collecting information of the operation object. Thus, in this embodiment, the convenience of air operation can be further improved by making use of the movable characteristic of the photoelectric module.
Specifically, taking the operation object as a finger as an example, when the electronic equipment is in the air operation mode, the photoelectric module may first collect the user's finger information and judge whether the collected finger information meets the preset condition, that is, whether the collected finger information is valid. The finger information may be a finger image or a fingerprint image, and judging whether it is valid may be done by judging whether a complete finger image or fingerprint image is acquired; whether the finger image or fingerprint image is complete helps determine whether the user's finger is completely within the collection range of the photoelectric module. If the image is complete, valid finger information is acquired, that is, the user's finger is completely within the collection range; otherwise, the user's finger is not completely within the collection range.
In this embodiment, when valid finger information is not collected, that is, the information of the operation object does not satisfy the preset condition, the movement of the optoelectronic module may be controlled by using the movable characteristic of the optoelectronic module without the need of moving the finger position by a user, so that the optoelectronic module moves to a target position where the finger information satisfying the preset condition can be obtained, that is, when the optoelectronic module is located at the target position, the complete and valid finger information can be collected.
The specific way of controlling the movement of the photoelectric module may be to determine the position offset of the photoelectric module relative to the user's finger and control the module to move toward the centre of the finger, so that it can collect more complete finger information, and to stop the movement once valid finger information, that is, a complete finger image or a complete fingerprint image, is collected. More specifically, the offset of the photoelectric module toward the finger may be determined based on the finger information currently collected, from which the direction in which the module needs to move is determined. After each movement, the newly collected finger information may be compared in real time with the finger information collected before the movement, to judge whether more complete finger information has been obtained: if so, the module may continue to move in that direction until complete finger information is collected; if the collected finger information instead becomes smaller, the module may immediately move in the opposite direction until complete finger information is collected.
For example, if only the right half of the fingerprint is collected at the beginning, the photoelectric module currently lies to the right of the finger and needs to be controlled to move to the left. During the movement, the size of the fingerprint image is compared in real time to judge whether the currently collected fingerprint image is more complete; if so, the movement continues and stops once a complete fingerprint image is collected. If the collected fingerprint image becomes smaller during the movement, the module moves in the opposite direction until it reaches a position where the complete fingerprint image can be collected.
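The movement strategy described in the last two paragraphs can be summarised in the following sketch; the camera interface, step size and completeness threshold are assumed names and values rather than the device's actual API.

```python
def center_camera_on_finger(camera, step_mm=1.0, max_steps=200, complete=0.99):
    """Move the under-screen camera until a complete fingerprint is collected.

    `camera.capture_coverage()` is a hypothetical call returning the fraction (0..1)
    of the fingerprint currently visible, `camera.offset_toward_finger()` a hypothetical
    call returning the signed rail direction from the camera toward the centre of the
    partially seen print, and `camera.move(dx_mm)` a hypothetical call that shifts the
    camera along its slide rail.
    """
    coverage = camera.capture_coverage()
    direction = camera.offset_toward_finger()   # e.g. only the right half is seen -> move left
    for _ in range(max_steps):
        if coverage >= complete:                 # complete print collected: stop moving
            return True
        camera.move(direction * step_mm)
        new_coverage = camera.capture_coverage()
        if new_coverage < coverage:              # the print got smaller: overshot, reverse
            direction = -direction
        coverage = new_coverage
    return False
```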
Finally, the first input of the operation object can be received and responded under the condition that the finger information meeting the preset condition is collected, and the target space operation plane where the current finger position is located is locked.
Therefore, by the implementation mode, the user does not need to move and adjust the position of the finger in the process of locking the space operation plane, and the convenience of the user operation is improved.
Further, the photoelectric module may be an under-screen movable camera, that is, a movable camera arranged below the screen of the electronic equipment; its working principle is that, when needed, the camera can take the place of part of the upper screen area by means of a slide rail under the screen. By applying the scheme of this embodiment to electronic equipment configured with an under-screen movable camera, a wider and more flexible position adjustment range of the camera can be guaranteed, and since the under-screen movable camera can be adjusted to a position better aligned with the user's operating finger, convenient air operation by the user is better supported.
An embodiment that helps to collect finger information meeting a preset condition by controlling a moving track of the mobile camera is described below with reference to fig. 2:
as shown in fig. 2, the mobile camera 21 of the electronic device 20 is located at an initial position a, and when the complete finger information is not collected at the position, the mobile camera 21 may be controlled to move in a corresponding direction based on the position deviation information of the mobile camera 21 relative to the finger, so that the mobile camera 21 reaches an adjusted position B, and at the position B, the mobile camera 21 may collect the complete finger information.
It should be noted that, in the moving process of the mobile camera, on the premise that the position of the finger is kept unchanged, the distance between the operation space plane determined by the position of the finger of the user and the screen of the electronic device should be kept unchanged, but the effective operation area on the operation space plane can be moved horizontally correspondingly along with the moving direction of the mobile camera.
Step 103, responding to the first input, and determining that a space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to a plane where the screen of the electronic device is located, or an included angle between the target space operation plane and the plane where the screen of the electronic device is located is smaller than a preset angle.
In this step, the first input may be responded, the spatial plane where the operation object is located is locked, that is, the spatial plane where the operation object is located is determined to be the target spatial operation plane, so that a subsequent user may perform an air-separating operation in the target spatial operation plane, and the electronic device may also receive and respond to the air-separating operation of the user in the target spatial operation plane.
The target space operation plane is a space plane parallel to the plane of the electronic device screen, or a plane whose included angle with the plane of the screen is smaller than a preset angle; that is, the target space operation plane may be substantially parallel to the screen plane, so that a certain range of operation position error can be tolerated during air operation. The preset angle may be set based on the required operation precision and is generally not set too large, for example about 5 degrees.
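For reference, the included-angle condition can be checked with a standard dot-product computation, as in the sketch below; 5 degrees is used only as the example preset angle mentioned above, and the function and parameter names are assumptions.

```python
import math

def planes_nearly_parallel(screen_normal, plane_normal, max_angle_deg=5.0):
    """Return True when the candidate space plane is parallel to the screen plane,
    or tilted from it by less than the preset angle. Normals are 3-vectors; their
    orientation sign does not matter."""
    dot = sum(a * b for a, b in zip(screen_normal, plane_normal))
    norm = (math.sqrt(sum(a * a for a in screen_normal))
            * math.sqrt(sum(b * b for b in plane_normal)))
    if norm == 0.0:
        return False
    angle = math.degrees(math.acos(min(1.0, abs(dot) / norm)))
    return angle < max_angle_deg

# Example: a plane tilted by roughly 1.7 degrees relative to the screen is accepted.
assert planes_nearly_parallel((0.0, 0.0, 1.0), (0.03, 0.0, 1.0))
```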
Optionally, a first interface is displayed on the screen of the electronic device, the first interface includes a first control, and the operation object includes a target finger;
before the step 103, the method further includes:
acquiring a first position of the target finger, wherein the target finger is a finger establishing a mapping relation with the first control;
the receiving of the first input of the operation object comprises:
receiving an input that the target finger stays at the first position for a first length of time;
the determining that the space plane where the operation object is located is a target space operation plane includes:
under the condition that the first time length exceeds a preset time length, determining that a space plane where the target finger is located is a target space operation plane, and locking the position of a first operation area in the target space operation plane, wherein a mapping area of the first interface in the target space operation plane is the first operation area;
after the step 103, the method further comprises:
and releasing the mapping relation between the target finger and the first control.
In an optional implementation manner, a first interface is displayed on the screen of the electronic device and includes a first control, and the user performs air operation using a target finger that has a mapping relationship with the first control. Before the target space operation plane is locked, the target finger is always mapped to the first control, that is, whatever position the target finger occupies in space, it correspondingly points at the first control.
The first interface may be any interface, may also be a preset interface supporting an idle operation, and may also be an operation area interface in the whole interface displayed on the screen of the electronic device, such as an interface where a keyboard is located, where the first control may be a specific control on the first interface, for example, may be a determination control for triggering locking of a spatial operation plane where the current finger position is located. In this embodiment, the mapping relationship between the target finger and the first control may be established in advance, that is, the mapping relationship between the target finger and the first control may be established by setting the target finger in advance and designating the first control in advance.
For example, in the case that the target finger is a right index finger, and the first control is a character key Y in a keyboard displayed on the screen of the electronic device, as the position of the right index finger moves, the position of the entire spatial keyboard also moves correspondingly, but the character key Y is always associated with the position of the right index finger.
In this embodiment, before the target space operation plane is locked, position information of the target finger may be obtained. For example, when the target finger is located at a first position and the user considers the space plane of that position suitable as the air operation plane, the target finger may stay at the first position for a first time period that exceeds a preset time period; the preset time period may be set as required, for example to 3 seconds. The electronic device may then receive the input that the target finger stays at the first position for the first time period, respond to it, and lock the space plane where the target finger is located, that is, determine that this space plane is the target space operation plane.
In this embodiment, an effective operation area may be further determined on the target space operation plane, where the effective operation area may be an operation area capable of receiving and responding, and specifically, the position and size of the effective operation area may be determined according to the acquisition range of the electronic device, may be determined based on the size of the effective operation area set by default and the position of the effective operation area relative to the electronic device, or may be determined based on the size of the operation area on the screen of the electronic device, such as the size of a keyboard, on the target space operation plane, where the effective operation area is an equal proportion or a preset proportion.
That is to say, when it is detected that the target finger stays at the first position for more than the preset time period, the space plane where the target finger is located may be determined to be the target space operation plane, and the position of a first operation area in that plane may be locked, so that the user can subsequently perform air operation in the first operation area and the electronic device can receive and respond to that air operation. The first operation area is the mapping area of the first interface in the target space operation plane; the first operation area and the first interface are in a mapping relationship with each other.
Further, after the space plane where the operation object is located is locked, that is, after it is determined to be the target space operation plane, the mapping relationship between the target finger and the first control may be released, so that the user can start moving the target finger within the first operation area to perform air operation on the electronic device. That is, after the mapping relationship is released, the moving target finger no longer always points at the first control; instead, the relative movement of the target finger in the first operation area is converted into the corresponding relative movement on the first interface, and the air operation position of the target finger in the first operation area can then be responded to. In the example above, after the mapping relationship is released, the right index finger no longer always points at the control Y.
Further, a second control is further included in the first interface;
after the step 103, the method further comprises:
receiving a second input of the target finger at a second position, wherein the second position is a position mapped to the second control on the target space operation plane;
in response to the second input, canceling the determination of the target space operation plane and restoring the mapping relation; the second control is used for unlocking the target space operation plane.
That is, in this embodiment, a second control for unlocking the spatial operation plane may also be displayed in the first interface, for example, the first control is a determination key for locking the spatial plane where the target finger is located, and the second control is a cancel key for unlocking the currently determined target spatial operation plane.
After the space plane where the target finger is located has been locked, if the user wants to re-select the space operation plane, the target finger may be moved to a second position pointing at the second control, such as the cancel key, and an air touch operation performed on it, that is, a second input executed by the target finger at the second position; the target finger may also be required to stay at the second position for a certain time period to avoid misoperation. This triggers the electronic device to unlock the current space operation plane, that is, to cancel the determination of the target space operation plane. Further, when the determination of the target space operation plane is cancelled, the mapping relationship between the target finger and the first control may be restored, so that the target finger points at the first control wherever it is moved.
It should be noted that, after the target spatial operation plane is unlocked, the user may select a suitable operation position again, and lock the spatial plane where the selected operation position is located again, and the locking manner is similar to the foregoing manner, and is not described here again.
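The locking and unlocking flow around the first control (confirm key) and the second control (cancel key) can be summarised as a small state holder, as sketched below; the class and field names are illustrative assumptions, and a "dwell" event stands for the input in which the target finger stays at a position for more than the preset time period.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class AirPlaneController:
    """Illustrative lock/unlock state for the target space operation plane."""
    confirm_mapped: bool = True            # mapping between target finger and confirm control
    locked_plane_point: Optional[Vec3] = None  # a point on the locked plane, None while unlocked

    def on_dwell(self, finger_pos: Vec3, over_cancel: bool) -> None:
        if self.locked_plane_point is None:
            # Unlocked: a long enough dwell locks the plane at the finger's current position.
            self.locked_plane_point = finger_pos
            self.confirm_mapped = False    # release the finger-to-confirm-control mapping
        elif over_cancel:
            # Locked: a dwell mapped onto the cancel control unlocks the plane.
            self.locked_plane_point = None
            self.confirm_mapped = True     # restore the mapping
```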
The following will exemplify embodiments of the aforementioned locking target space operation plane and unlocking target space operation plane with reference to fig. 3a and 3 b:
as shown in fig. 3a, the electronic device 30 displays a confirm key 31 and a cancel key 32 on the screen, wherein C, D, E fingers are located at valid positions, that is, both are located within the collection range of the optoelectronic module of the electronic device 30, a mapping relationship is established between the right index finger of the user and the confirm key 31, the user moves the right index finger from the valid position C to the valid position D and then from the valid position D to the final locking position E, wherein the cursor on the screen of the electronic device 30 can be always in a focusing state and always points at the confirm key 31 until the locking condition is reached, and when the locking condition is reached, if the user stays the right index finger at the final locking position E for more than 3 seconds, the corresponding target space operation plane 32 can be locked according to the final locking position E as shown in fig. 3b, and the mapping relationship between the right index finger and the confirm key 31 is released, the user can then operate on the target space operation plane 32 to achieve the same touch operation effect as on the screen of the electronic device 30.
After the corresponding target space operation plane 33 is locked, if the user needs to unlock the target space operation plane 33, the index finger of the right hand may be moved to another spatial position pointing to the cancel key 32, and stay at the position for a certain period of time, the electronic device 30 responds to the operation at the position, cancels the locking of the target space operation plane 33, and restores the mapping relationship between the right finger and the determination key 31.
Optionally, after the step 103, the method further includes:
acquiring moving position information of the operation object on the target space operation plane, and converting the moving position information into input position information corresponding to the screen of the electronic equipment;
and operating the electronic equipment based on the input position information.
In the embodiment of the application, after the target space operation plane is locked, the user can perform air operation on the electronic equipment on the target space operation plane. Specifically, the user may move the finger on the target space operation plane according to the actual operation requirement; the electronic equipment collects the moving position information of the finger in real time and converts it into the corresponding input position information on the screen. More specifically, each relative movement distance can be converted into the corresponding relative movement distance on the screen, and combined with the preset reference position or with the screen input position corresponding to the previous movement, the specific input position on the screen corresponding to the spatial operation position of the user's finger can be determined.
After the spatial operation position of the user is converted into input position information on a screen corresponding to the electronic device, the electronic device may be operated based on the input position information, and specifically, a plurality of different scenarios of the blank operation may be included, for example, specific information of the blank input may be determined based on the input position information, such as characters on a keyboard of an input method, or specific objects of the blank operation may be determined based on the input position information, such as specific keys, controls, and the like, or specific operation gestures or operation trajectories may be determined based on the input position information, and the electronic device may be controlled to respond to the operations.
Thus, with this embodiment, the user's air operation on the target space operation plane can be received and responded to.
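A minimal sketch of the position conversion described above is given below; the relative movement on the locked plane is added to the previous screen position and clamped to the screen bounds, with the screen size and scale factor being assumed example values.

```python
from typing import Tuple

Point = Tuple[float, float]

def to_screen_position(prev_screen: Point, prev_plane: Point, curr_plane: Point,
                       scale: float = 1.0,
                       screen_size: Point = (1080.0, 2340.0)) -> Point:
    """Map a finger movement on the locked space operation plane to a screen position.

    Each relative move on the plane is converted into a relative move on the screen
    (optionally scaled), added to the last known screen position, and clamped to the
    screen bounds.
    """
    dx = (curr_plane[0] - prev_plane[0]) * scale
    dy = (curr_plane[1] - prev_plane[1]) * scale
    x = min(max(prev_screen[0] + dx, 0.0), screen_size[0])
    y = min(max(prev_screen[1] + dy, 0.0), screen_size[1])
    return (x, y)

# Example: a 20 mm rightward move on the plane advances the screen cursor by 20 units.
print(to_screen_position((100.0, 500.0), (0.0, 0.0), (20.0, 0.0)))
```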
Optionally, before determining that the spatial plane where the operation object is located is the target spatial operation plane, the method further includes:
and in the process of acquiring the information of the operation object, aligning the position of the operation object with the preset reference position in real time.
In other words, in the above embodiment, during the collection of information of the operation object, for example finger information, the collected finger position may further be aligned in real time with a preset reference position on the screen of the electronic device, so that the user's finger position always points at the preset reference position. Here the preset reference position may be the position of a confirm key used for locking the operation space plane, which is equivalent to binding the finger position currently used by the user to the confirm key; the whole effective operation space plane can then move along with the movement of the user's operating finger.
The preset reference position may be a preset specific reference position, for example the position of a specific virtual key displayed on the screen (such as a confirm key), or a fixed point on the screen (such as the screen centre). In the embodiment of the application, the user's finger position may be aligned with the preset reference position in real time before the target space operation plane is locked, so that the finger position is aligned with the preset reference position at the moment the target space operation plane is locked; alternatively, the finger position may be aligned with the preset reference position while the target space operation plane remains locked.
Therefore, the position of the operation space plane can be determined in real time, and the user can conveniently adjust the proper finger position with reference so as to find the space plane suitable for operation.
Optionally, before the step 101, the method further includes:
and under the condition that the electronic equipment enters a preset page, starting a photoelectric module of the electronic equipment to acquire information of an operation object.
The preset page may be a preset page in which air operation can be enabled, for example a page requiring the input method keyboard to be started, such as an information editing page or an instant messaging page, or another page for which the user has enabled air operation. The preset page may be set automatically by the system or customised by the user.
In this embodiment, when the electronic device enters a preset page, the photoelectric module may be triggered to start, and the photoelectric module starts to collect finger information, that is, enters an idle operation mode.
In this way, the air operation mode can be opened automatically for the user on certain specific pages, without requiring manual opening by the user.
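For illustration, the page-triggered start of the photoelectric module might look like the following sketch; the page identifiers and the camera interface are assumptions.

```python
AIR_OPERATION_PAGES = {"message_editor", "instant_messaging"}  # example preset page identifiers

def on_page_entered(page_id: str, camera) -> bool:
    """Start the photoelectric module (entering the air operation mode) when the newly
    entered page is one of the preset pages; `camera.start()` is an assumed interface."""
    if page_id in AIR_OPERATION_PAGES:
        camera.start()
        return True
    return False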
Optionally, before receiving the first input of the operation object, the method further includes:
and outputting a prompt message for prompting a user to lock the space operation plane under the condition that the acquired information of the operation object meets a preset condition.
In an optional implementation manner, in order to make the interaction more friendly, when the acquired information of the operation object satisfies the preset condition, it may be determined that the condition for locking the space operation plane is currently met. In this case, a prompt message for prompting the user to lock the space operation plane may be output, for example corresponding prompt information displayed on the screen or a corresponding voice prompt played. In this way, upon receiving the prompt message, the user can perform the corresponding operation as soon as possible and determine the finger position, for example keeping the finger at a certain position for 3 seconds to trigger the electronic device to lock the space plane where the current finger position is located.
According to the operation method of the electronic equipment in the embodiment of the application, under the condition that the electronic equipment is in the air-separating operation mode, the information of an operation object is obtained; receiving a first input of the operation object under the condition that the acquired information of the operation object meets a preset condition; and responding to the first input, and determining that a space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to a plane where the screen of the electronic equipment is located, or an included angle between the target space operation plane and the plane where the screen of the electronic equipment is located is smaller than a preset angle. Therefore, the user can perform non-contact operation on the electronic equipment in the space, so that the risk of disease infection caused by contact can be reduced, the target space operation plane can be locked for the user based on the information of the operation object acquired in the space operation mode and the user input, and the space operation experience of the user is improved.
In the operating method of the electronic device provided in the embodiment of the present application, the executing body may be an operating device of the electronic device, or a control module in the operating device of the electronic device, for executing the operating method of the electronic device. In the embodiment of the present application, a method for executing an operation of an electronic device by using an operation device of the electronic device is taken as an example, and the operation device of the electronic device provided in the embodiment of the present application is described.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an operating device of an electronic apparatus according to an embodiment of the present disclosure, and as shown in fig. 4, the operating device 400 of the electronic apparatus includes:
a first obtaining module 401, configured to obtain information of an operation object when the electronic device is in an idle operation mode;
a first receiving module 402, configured to receive a first input of the operation object when the acquired information of the operation object meets a preset condition;
an executing module 403, configured to determine, in response to the first input, that a spatial plane where the operation object is located is a target spatial operation plane, where the target spatial operation plane is parallel to a plane where the electronic device screen is located, or an included angle between the target spatial operation plane and the plane where the electronic device screen is located is smaller than a preset angle.
Optionally, the operating device 400 of the electronic device further includes:
the control module is used for controlling the photoelectric module of the electronic equipment to move to a target position under the condition that the acquired information of the operation object does not meet the preset condition, wherein the information of the operation object acquired at the target position meets the preset condition.
Optionally, the condition that the acquired information of the operation object meets the preset condition includes any one of:
the acquired information of the operation object is a complete operation object image;
the acquired information of the operation object is complete biological characteristics;
the acquired information of the operation object is a biological characteristic which is complete and is matched with a preset biological characteristic.
Optionally, a first interface is displayed on the screen of the electronic device, the first interface includes a first control, and the operation object includes a target finger;
the operating device 400 of the electronic apparatus further includes:
the second acquisition module is used for acquiring a first position of the target finger, wherein the target finger is a finger establishing a mapping relation with the first control;
the first receiving module 402 is configured to receive an input that the target finger stays at the first position for a first duration;
the execution module 403 is configured to, when the first duration exceeds a preset duration, determine that a spatial plane where the target finger is located is a target spatial operation plane, and lock a position of a first operation area in the target spatial operation plane, where a mapping area of the first interface in the target spatial operation plane is the first operation area;
the operating device 400 of the electronic apparatus further includes:
and the first processing module is used for releasing the mapping relation between the target finger and the first control.
Optionally, the first interface further includes a second control;
the operating device 400 of the electronic apparatus further includes:
a second receiving module, configured to receive a second input of the target finger at a second position, where the second position is a position on the target space operation plane that is mapped to the second control;
the second processing module is used for responding to the second input, canceling the determination of the target space operation plane and recovering the mapping relation;
the second control is used for unlocking the target space operation plane.
In the operating device of the electronic device in the embodiment of the application, under the condition that information of an operating object is acquired through a photoelectric module of the electronic device, a spatial plane where the operating object is located is locked as a target spatial operating plane, wherein the spatial plane is a plane parallel to a screen of the electronic device, and the location of the operating object is aligned to a preset reference position on the screen of the electronic device; acquiring moving position information of the operation object on the target space operation plane through the photoelectric module, and converting the moving position information into input position information corresponding to the screen of the electronic equipment; and operating the electronic equipment based on the input position information. Therefore, the user can perform contactless operation on the electronic equipment in the space, and the disease infection risk problem caused by contact can be reduced.
The operation device of the electronic device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The operating device of the electronic device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The operating device of the electronic device provided in the embodiment of the present application can implement each process implemented by the method embodiments in fig. 1 to fig. 3, and is not described here again to avoid repetition.
Optionally, as shown in fig. 5, an electronic device 500 is further provided in this embodiment of the present application, and includes a processor 501, a memory 502, and a program or an instruction stored in the memory 502 and executable on the processor 501, where the program or the instruction is executed by the processor 501 to implement each process of the operation method embodiment of the electronic device, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and the like.
Those skilled in the art will appreciate that the electronic device 600 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The processor 610 is configured to obtain information of an operation object when the electronic device 600 is in the air-separating operation mode;
receiving a first input of the operation object under the condition that the acquired information of the operation object meets a preset condition;
and responding to the first input, and determining that the space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to the plane where the screen of the electronic device 600 is located, or an included angle between the target space operation plane and the plane where the screen of the electronic device 600 is located is smaller than a preset angle.
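Whether the plane where the operation object lies qualifies as the target space operation plane reduces to an angle test between two planes. The following sketch assumes unit-length plane normals estimated elsewhere (for example, from several tracked finger points) and an illustrative preset angle; none of these values come from this application.

import math

def plane_angle_deg(normal_a, normal_b):
    """Angle in degrees between two planes, given unit-length normals."""
    dot = sum(a * b for a, b in zip(normal_a, normal_b))
    dot = min(max(abs(dot), 0.0), 1.0)   # abs(): plane orientation has no sign
    return math.degrees(math.acos(dot))

def is_target_operation_plane(finger_normal, screen_normal, preset_angle_deg=10.0):
    """True when the finger plane is parallel to the screen plane or the
    included angle is smaller than the preset angle."""
    return plane_angle_deg(finger_normal, screen_normal) < preset_angle_deg

# Example: screen normal along +z, finger plane tilted by about 5 degrees.
screen_normal = (0.0, 0.0, 1.0)
finger_normal = (0.0, math.sin(math.radians(5)), math.cos(math.radians(5)))
print(is_target_operation_plane(finger_normal, screen_normal))   # True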
Optionally, the processor 610 is further configured to control the photoelectric module of the electronic device 600 to move to a target position when the acquired information of the operation object does not satisfy the preset condition, where the information of the operation object acquired at the target position satisfies the preset condition.
Optionally, the condition that the acquired information of the operation object meets the preset condition includes any one of:
the acquired information of the operation object is a complete operation object image;
the acquired information of the operation object is complete biological characteristics;
the acquired information of the operation object is a biological characteristic which is complete and is matched with a preset biological characteristic.
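These three alternatives can be checked with a single predicate, and the optional step of moving the photoelectric module when the check fails fits naturally around it. The sketch below is a plain Python illustration; the attribute names on the capture object and the module methods are hypothetical placeholders rather than an API defined by this application.

from enum import Enum, auto

class ConditionMode(Enum):
    COMPLETE_IMAGE = auto()       # the captured operation object image is complete
    COMPLETE_BIOMETRIC = auto()   # a complete biological characteristic is captured
    MATCHED_BIOMETRIC = auto()    # complete and matching a preset biological characteristic

def meets_preset_condition(capture, mode, preset_biometric=None):
    """Return True when the acquired information satisfies the chosen condition."""
    if mode is ConditionMode.COMPLETE_IMAGE:
        return capture.image_complete
    if mode is ConditionMode.COMPLETE_BIOMETRIC:
        return capture.biometric_complete
    if mode is ConditionMode.MATCHED_BIOMETRIC:
        # A simple equality check stands in for a real biometric matcher here.
        return capture.biometric_complete and capture.biometric == preset_biometric
    return False

def acquire_with_adjustment(module, mode, preset_biometric=None, max_moves=3):
    """If a capture fails the condition, move the photoelectric module toward
    a target position and recapture, up to a small number of attempts."""
    capture = module.capture()
    for _ in range(max_moves):
        if meets_preset_condition(capture, mode, preset_biometric):
            break
        module.move_toward_target()
        capture = module.capture()
    # The caller re-checks the returned capture before accepting the first input.
    return capture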
Optionally, a first interface is displayed on the screen of the electronic device 600, the first interface includes a first control, and the operation object includes a target finger;
the processor 610 is further configured to obtain a first position of the target finger, where the target finger is a finger that establishes a mapping relationship with the first control;
receiving an input that the target finger stays at the first position for a first length of time;
under the condition that the first time length exceeds a preset time length, determining that a space plane where the target finger is located is a target space operation plane, and locking the position of a first operation area in the target space operation plane, wherein a mapping area of the first interface in the target space operation plane is the first operation area;
and releasing the mapping relation between the target finger and the first control.
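Plane locking driven by a dwell at the first position can be expressed as a small state machine: while the mapping between the target finger and the first control exists, a dwell longer than the preset duration locks the plane and the first operation area and then releases the mapping. The class below is a minimal sketch with illustrative names and an assumed dwell threshold, not an implementation prescribed by this application.

PRESET_DWELL_SECONDS = 1.5   # illustrative threshold, not taken from this application

class PlaneLocker:
    def __init__(self):
        self.locked_plane = None
        self.operation_area = None
        self.finger_to_control = {}   # target finger id -> first control id

    def bind(self, finger_id, control_id):
        """Establish the mapping relation between a target finger and the first control."""
        self.finger_to_control[finger_id] = control_id

    def on_dwell(self, finger_id, finger_plane, dwell_seconds, interface_area):
        """Lock the plane once the target finger has stayed at the first
        position longer than the preset duration, then release the mapping."""
        if finger_id not in self.finger_to_control:
            return False
        if dwell_seconds <= PRESET_DWELL_SECONDS:
            return False
        self.locked_plane = finger_plane
        # The first operation area is the mapping of the first interface onto
        # the locked plane; here the interface rectangle is reused directly.
        self.operation_area = interface_area
        del self.finger_to_control[finger_id]   # release the mapping relation
        return True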
Optionally, the first interface further includes a second control;
The processor 610 is further configured to receive a second input of the target finger at a second position, where the second position is a position on the target space operation plane mapped to the second control;
and responding to the second input, canceling the determination of the target space operation plane, and restoring the mapping relation.
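The unlocking path mirrors the locking path: a second input at the position mapped to the second control cancels the plane determination and restores the mapping. The function below continues the PlaneLocker sketch above and works on any object exposing the same three attributes; the hit test and names remain illustrative assumptions.

def cancel_plane_lock(state, finger_id, position, second_control_area, first_control_id):
    """If the second input lands inside the region of the target space operation
    plane mapped to the second control, drop the locked plane and restore the
    finger-to-first-control mapping."""
    if state.locked_plane is None:
        return False
    x, y = position
    # Rectangle hit test against the area mapped to the second control.
    inside = (second_control_area.x <= x <= second_control_area.x + second_control_area.width
              and second_control_area.y <= y <= second_control_area.y + second_control_area.height)
    if not inside:
        return False
    state.locked_plane = None
    state.operation_area = None
    state.finger_to_control[finger_id] = first_control_id
    return True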
The electronic device in the embodiment of the application acquires the information of an operation object when in the air-separating operation mode; receives a first input of the operation object when the acquired information of the operation object meets a preset condition; and, in response to the first input, determines that the space plane where the operation object is located is a target space operation plane, where the target space operation plane is parallel to the plane where the screen of the electronic device is located, or an included angle between the target space operation plane and the plane where the screen of the electronic device is located is smaller than a preset angle. In this way, the user can operate the electronic device in space without contact, which reduces the risk of disease infection caused by contact; moreover, the target space operation plane can be locked for the user based on the information of the operation object acquired in the air-separating operation mode and on the user input, which improves the user's contactless operation experience.
It is to be understood that, in the embodiment of the present application, the input unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042, and the graphics processing unit 6041 processes image data of a still picture or a video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. The other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 609 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 610 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 610.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the operation method embodiment of the electronic device, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the operation method embodiment of the electronic device, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip, a chip system, or a system-on-a-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved; for example, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes several instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A method of operation of an electronic device, comprising:
under the condition that the electronic equipment is in an air-separating operation mode, acquiring information of an operation object;
receiving a first input of the operation object under the condition that the acquired information of the operation object meets a preset condition;
and responding to the first input, and determining that a space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to a plane where the screen of the electronic equipment is located, or an included angle between the target space operation plane and the plane where the screen of the electronic equipment is located is smaller than a preset angle.
2. The method according to claim 1, wherein after the obtaining of the information of the operation object, before the receiving of the first input of the operation object when the obtained information of the operation object satisfies a preset condition, the method further comprises:
and under the condition that the acquired information of the operation object does not meet the preset condition, controlling the photoelectric module of the electronic equipment to move to a target position, wherein the acquired information of the operation object at the target position meets the preset condition.
3. The method according to claim 1 or 2, wherein the acquired information of the operation object meeting a preset condition includes any one of:
the acquired information of the operation object is a complete operation object image;
the acquired information of the operation object is complete biological characteristics;
the acquired information of the operation object is a biological characteristic which is complete and is matched with a preset biological characteristic.
4. The method according to claim 1 or 2, wherein a first interface is displayed on the screen of the electronic device, the first interface comprises a first control, and the operation object comprises a target finger;
before the determining, in response to the first input, that the spatial plane in which the operation object is located is the target spatial operation plane, the method further includes:
acquiring a first position of the target finger, wherein the target finger is a finger establishing a mapping relation with the first control;
the receiving of the first input of the operation object comprises:
receiving an input that the target finger stays at the first position for a first length of time;
the determining that the space plane where the operation object is located is a target space operation plane includes:
under the condition that the first time length exceeds a preset time length, determining that a space plane where the target finger is located is a target space operation plane, and locking the position of a first operation area in the target space operation plane, wherein a mapping area of the first interface in the target space operation plane is the first operation area;
after the spatial plane where the operation object is located is determined to be the target spatial operation plane, the method further includes:
and releasing the mapping relation between the target finger and the first control.
5. The method according to claim 4, wherein the first interface further comprises a second control;
after the spatial plane where the operation object is located is determined to be the target spatial operation plane, the method further includes:
receiving a second input of the target finger at a second position, wherein the second position is a position mapped to the second control on the target space operation plane;
in response to the second input, canceling the determination of the target space operation plane and restoring the mapping relation;
the second control is used for unlocking the target space operation plane.
6. An operating device of an electronic apparatus, comprising:
the first acquisition module is used for acquiring the information of an operation object under the condition that the electronic equipment is in an air-separating operation mode;
the first receiving module is used for receiving first input of the operation object under the condition that the acquired information of the operation object meets preset conditions;
and the execution module is used for responding to the first input and determining that the space plane where the operation object is located is a target space operation plane, wherein the target space operation plane is parallel to the plane where the screen of the electronic equipment is located, or an included angle between the target space operation plane and the plane where the screen of the electronic equipment is located is smaller than a preset angle.
7. The operating device of an electronic apparatus according to claim 6, characterized in that the operating device of an electronic apparatus further comprises:
the control module is used for controlling the photoelectric module of the electronic equipment to move to a target position under the condition that the acquired information of the operation object does not meet the preset condition, wherein the information of the operation object acquired at the target position meets the preset condition.
8. The operating device of the electronic device according to claim 6 or 7, wherein the acquired information on the operation object that satisfies a preset condition includes any one of:
the acquired information of the operation object is a complete operation object image;
the acquired information of the operation object is complete biological characteristics;
the acquired information of the operation object is a biological characteristic which is complete and is matched with a preset biological characteristic.
9. The operating device of the electronic equipment according to claim 6 or 7, wherein a first interface is displayed on the screen of the electronic equipment, the first interface includes a first control, and the operating object includes a target finger;
the operation device of the electronic apparatus further includes:
the second acquisition module is used for acquiring a first position of the target finger, wherein the target finger is a finger establishing a mapping relation with the first control;
the first processing module is used for removing the mapping relation between the target finger and the first control under the condition that the space plane where the operation object is located is determined to be a target space operation plane;
the first receiving module is used for receiving the input of the target finger staying at the first position for a first time length;
the execution module is configured to determine that a spatial plane where the target finger is located is a target spatial operation plane and lock a position of a first operation area in the target spatial operation plane when the first duration exceeds a preset duration, where a mapping area of the first interface in the target spatial operation plane is the first operation area.
10. The operating device according to claim 9, wherein the first interface further includes a second control;
the operation device of the electronic apparatus further includes:
a second receiving module, configured to receive a second input of the target finger at a second position, where the second position is a position on the target space operation plane that is mapped to the second control;
the second processing module is used for responding to the second input, canceling the determination of the target space operation plane and recovering the mapping relation;
the second control is used for unlocking the target space operation plane.
11. An electronic device comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the method of operation of the electronic device according to any one of claims 1 to 5.
12. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the method of operation of an electronic device according to any one of claims 1 to 5.
CN202011613325.9A 2020-12-30 2020-12-30 Electronic equipment operation method and device and electronic equipment Pending CN112732078A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011613325.9A CN112732078A (en) 2020-12-30 2020-12-30 Electronic equipment operation method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011613325.9A CN112732078A (en) 2020-12-30 2020-12-30 Electronic equipment operation method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN112732078A (en) 2021-04-30

Family

ID=75611815

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011613325.9A Pending CN112732078A (en) 2020-12-30 2020-12-30 Electronic equipment operation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112732078A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113347526A (en) * 2021-07-08 2021-09-03 歌尔科技有限公司 Sound effect adjusting method and device of earphone and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006070441A1 (en) * 2004-12-27 2006-07-06 Celltec Project Management Co., Ltd. Operator supporting system
WO2013077359A1 (en) * 2011-11-25 2013-05-30 シャープ株式会社 Electronic device, method of operating electronic device, and program
CN109240571A (en) * 2018-07-11 2019-01-18 维沃移动通信有限公司 A kind of control device, terminal and control method
KR20190097706A (en) * 2018-02-13 2019-08-21 위드로봇 주식회사 Apparatus and method for contactless fingerprint recognition
CN110448899A (en) * 2018-11-13 2019-11-15 网易(杭州)网络有限公司 The control method and device of virtual role, electronic equipment in game
KR20200039983A (en) * 2018-10-08 2020-04-17 주식회사 토비스 Space touch detecting device and display device having the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination