CN113986084A - Operating method and device - Google Patents

Operating method and device

Info

Publication number
CN113986084A
CN113986084A (application number CN202111279985.2A)
Authority
CN
China
Prior art keywords
application program
input
determining
displaying
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111279985.2A
Other languages
Chinese (zh)
Inventor
段绪芳
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202111279985.2A
Publication of CN113986084A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application discloses an operation method and device, belonging to the technical field of electronic devices. The method includes the following steps: receiving a first input from a user on an image while the image is displayed; and executing a target operation according to the object corresponding to the first input, where the target operation is associated with characteristic information of the object.

Description

Operating method and device
Technical Field
The application belongs to the technical field of electronic equipment, and particularly relates to an operation method and device.
Background
When a user is using one application program and wants to use a function of another application program, the user must first exit the current application program, find the target application program, open it, navigate to the corresponding service interface, and then use the function. The prior-art procedure for starting an application program is thus cumbersome, and the user experience is poor.
Disclosure of Invention
The embodiments of the present application aim to provide an operation method that can simplify the user's operation process and improve the user experience.
In a first aspect, an embodiment of the present application provides an operation method, where the method includes:
receiving a first input from a user on an image while the image is displayed; and
executing a target operation according to the object corresponding to the first input;
wherein the target operation is associated with characteristic information of the object.
In a second aspect, an embodiment of the present application provides an operating device, including:
a receiving module, configured to receive a first input from a user on an image while the image is displayed; and
an execution module, configured to execute a target operation according to the object corresponding to the first input;
wherein the target operation is associated with characteristic information of the object.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, a first input from a user on an image is received while the image is displayed, and a target operation associated with the characteristic information of the object corresponding to that input is executed. Thus, while browsing an image, the user can trigger the target operation for an object simply by performing a first input on it, and different functions or applications can be started quickly depending on which object receives the input. This simplifies the user's operation flow and improves the user experience.
Drawings
FIG. 1 is a schematic flow chart diagram of a method of operation of an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an operating device according to an embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly below with reference to the accompanying drawings. Evidently, the described embodiments are some, rather than all, of the embodiments of the present application. All other embodiments that a person of ordinary skill in the art can derive from the embodiments given herein fall within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements, and do not necessarily describe a particular sequence or chronological order. It should be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, "first", "second", and the like do not limit the number of objects; for example, a first object may be one object or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The following describes in detail an operation method provided by the embodiments of the present application with reference to the accompanying drawings and application scenarios thereof.
FIG. 1 is a schematic flowchart of an operation method according to an embodiment of the present application. The operation method may be executed by an operating device, which may be disposed in an electronic device such as a smartphone, a tablet computer, or a smartwatch.
As shown in fig. 1, the operation method of the present embodiment may include the following steps 1100 to 1200:
Step 1100: while an image is displayed, receive a first input from the user on the image.
The first input may be a click input by the user on an object in the image, a voice instruction input by the user, or a specific gesture input by the user; it may be determined according to actual usage requirements, and this embodiment does not limit it.
The specific gesture in this embodiment may be any one of a single-click gesture, a slide gesture, a drag gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture, and a double-click gesture. The click input may be a single-click input, a double-click input, or a click input of any number of clicks, and may also be a long-press or short-press input.
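As a non-authoritative illustration (the patent gives no code), the input categories above could be routed with a small classifier before object matching; the `FirstInput` type and the category strings are invented for this sketch:

```python
# Hypothetical sketch: classify the "first input" into the click / voice /
# gesture categories described in the text. All names are illustrative.
from dataclasses import dataclass

@dataclass
class FirstInput:
    kind: str    # "click", "voice", or "gesture"
    detail: str  # e.g. "single", "double", "long-press", "slide"

GESTURES = {"single-click", "slide", "drag", "pressure", "long-press",
            "area-change", "double-press", "double-click"}

def classify(inp: FirstInput) -> str:
    """Map a raw first input to a handling category string."""
    if inp.kind == "voice":
        return "voice-instruction"
    if inp.kind == "gesture" and inp.detail in GESTURES:
        return f"gesture:{inp.detail}"
    if inp.kind == "click":
        return f"click:{inp.detail}"
    raise ValueError(f"unsupported input: {inp}")
```

For example, `classify(FirstInput("gesture", "long-press"))` yields `"gesture:long-press"`, which a dispatcher could then pair with the touched object.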
Step 1200: execute a target operation according to the object corresponding to the first input, where the target operation is associated with characteristic information of the object.
The characteristic information is information related to a feature of the object. The object may be, for example, a person, an animal, or a static object. When the object is a person, the person's characteristics may include five-sense-organ characteristics, limb characteristics, and attachment characteristics. The five-sense-organ characteristics may include the eyes, nose, ears, mouth, eyebrows, beard, and the like; the limb characteristics may include the arms, fingers, elbows, hair, shoulders, belly, waist, legs, knees, feet, toes, and the like; and the attachment characteristics may include features of clothing, trousers, shoes, socks, hats, and the like. When the object is an animal, its characteristic information may include features of the body, feet, and so on. A static object is an inanimate object such as a scenic spot, a two-dimensional code, a card, or a table.
In this embodiment, the electronic device needs to establish in advance an association between the characteristic information of objects in images and target operations, so that when a first input on an object in an image is received, the target operation associated with that object's characteristic information can be executed.
Specifically, when establishing this association, the electronic device analyzes the objects in an information stream and their characteristic information upon acquiring the stream, determines corresponding associated data for each piece of characteristic information, and determines the corresponding target operation from the associated data.
The information stream is, for example, a file, image, or video. Illustratively, the object may be a person, an animal, an attachment, or a static object in the information stream. Attachments include objects worn by people or animals, such as clothing, trousers, shoes, socks, and hats; static objects are inanimate objects such as scenic spots, two-dimensional codes, cards, and tables.
As described above, the characteristic information of the person may include, for example, characteristics of eyes, nose, ears, mouth, tongue, arms, fingers, elbows, hair, eyebrows, beard, shoulders, belly, waist, legs, knees, feet, toes, and the like. The characteristic information of the animal may include, for example, characteristics of the body, feet, and the like.
After analyzing the information stream to obtain the objects and their characteristic information, the electronic device may generate associated data by acquiring data related to the characteristic information from application programs, and associate that data with an operation in a specific application program. The application programs include, but are not limited to: calendar applications, memos, communication applications, social networking platforms, mailboxes, and notes.
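The pre-established association described above can be sketched as a simple registry mapping characteristic information to a target operation (an application plus an action); the application and action names below are illustrative only, not from any real API:

```python
# Hypothetical sketch of the association registry: characteristic information
# of an object -> (application, action). Names are invented for illustration.
registry: dict[str, tuple[str, str]] = {}

def associate(feature: str, app: str, action: str) -> None:
    """Record that `feature` triggers `action` in `app`."""
    registry[feature] = (app, action)

def target_operation(feature: str):
    """Look up the pre-established target operation, or None if absent."""
    return registry.get(feature)

# Associations in the spirit of the examples in the text:
associate("eyes", "social-platform", "video-call")
associate("mouth", "social-platform", "voice-call")
associate("feet", "navigation-app", "navigate-home")
```

At input time, step 1200 then reduces to one lookup per recognized object, with a miss meaning no shortcut is defined for that feature.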
For example, when the characteristic information of the object is a person's eyes, the target operation associated with the eyes may be a video call on a social networking platform; when it is the person's nose, the target operation may be playing a dining video recorded by that person in a video playing application; when it is the person's ears, the target operation may be playing the most recent voice chat content in a communication application; when it is the person's mouth, the target operation may be a voice call on a social networking platform; when it is the person's tongue, the target operation may be opening a food purchase interface of a shopping application; when it is the person's feet, the target operation may be opening a navigation application and navigating to the person's home address; and when it is the person's hand, the target operation may be opening the memo record page about the most recent contact with that person.
For example, when the characteristic information of the object is a person's swimsuit, the target operation associated with the swimsuit may be playing a swimming video recorded by that person, along with the swimming location, in a video playing application; when the characteristic information corresponds to multiple people, the target operation may be opening a multi-person communication page on a social networking platform; and when the characteristic information of the object is an animal, the target operation may be opening a monitoring application to play a real-time monitoring video of the animal.
Illustratively, when the characteristic information of the object is a bank card, the associated target operation is opening a bank client's transfer page or generating a repayment reminder; when it is a scene such as a swimming pool or a museum, the target operation may be opening a web page with a detailed introduction of the scene; when it is a two-dimensional code, the target operation may be opening a function page such as payment, transfer, or friend-adding in the corresponding application program; when it is a contract, the target operation may be opening a progress-reminder page of a schedule application; when it is a ball, the target operation may be initiating a ball-playing activity through a communication application; and when it is a physical entity, the target operation may be opening tracing information for that entity, such as the record information of the application program where the entity appears (for a photo, where it is saved), and displaying a hover button that can open the chat window containing the picture, view its context, and so on.
Once the association between the characteristic information of objects in images and target operations has been established, the user may perform a first input on an object in an image while browsing it.
When the electronic device receives the first input for the object in the image, it can determine the target operation to execute according to the pre-established association between the object and the target operation.
In one embodiment, considering that the object in an image may be a person or an animal, and that people and animals can share similar characteristic information (both have five sense organs and four limbs, for example), corresponding operation sets may be defined for different types of objects in order to distinguish their target operations. That is, a human target operation set is defined for human characteristic information, and an animal target operation set for animal characteristic information.
Specifically, before step 1200, the electronic device may further obtain the type of the object, determine the corresponding operation set according to that type, and determine the target operation from the operation set.
For example, when a person image is displayed and a first input from the user on the person's foot is received, the electronic device determines from the foot that the object's type is a person, and selects the operation set corresponding to people; in that set, the target operation associated with a person's foot may be opening a navigation application.
Similarly, when an animal image is displayed and a first input on the animal's foot is received, the electronic device determines from the foot that the object's type is an animal, and selects the operation set corresponding to animals; in that set, the operation associated with an animal's foot may be opening a monitoring application to play a video of the animal's activity track.
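A minimal sketch of this type-then-set lookup, with hypothetical operation sets matching the person/animal foot examples above:

```python
# Hypothetical sketch: resolve the object's type first, pick the operation
# set for that type, then look the touched part up in it. All operation
# strings are illustrative.
PERSON_OPS = {"feet": "open navigation app", "eyes": "start video call"}
ANIMAL_OPS = {"feet": "play activity-track video in monitoring app"}

OP_SETS = {"person": PERSON_OPS, "animal": ANIMAL_OPS}

def resolve(object_type: str, part: str) -> str:
    """Return the target operation for `part` given the object's type."""
    ops = OP_SETS.get(object_type)
    if ops is None or part not in ops:
        raise KeyError(f"no operation for {object_type}/{part}")
    return ops[part]
```

The same part name ("feet") thus maps to different operations depending on the resolved type, which is exactly why the per-type sets are needed.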
In an embodiment, when the image includes a person and the object corresponding to the first input is a target part of that person, step 1200 may specifically include: determining an application program associated with the target part, and displaying a first interface of that application program; where the target part includes at least one of the person's five sense organs, limbs, and attachments.
It will be appreciated that the object corresponding to the first input is associated with the application program that is subsequently operated or launched, and with the corresponding first interface. That is, the user can trigger different shortcut operations by operating on different positions or objects in the same photo.
Optionally, the target part is a five-sense-organ part, and the electronic device's determining an application program associated with the target part and displaying its first interface may include: when the part is the eyes, determining an instant messaging application associated with the eyes and displaying its video call interface; and when the part is the mouth, determining a communication application associated with the mouth and displaying its voice call interface.
It can be appreciated that the eyes are associated with applications involving seeing, and the mouth with applications involving speaking to one another; by mapping each five-sense-organ part to a quick-start operation in its associated application, the corresponding applications and functions can be operated quickly.
Illustratively, when the part is the nose, a video playing application associated with the nose is determined, and a playing interface for an eating video is displayed; when the part is the ears, a communication application associated with the ears is determined, and a voice playing interface for the most recent voice chat content is displayed.
Optionally, the target part is a limb part, and the electronic device's determining an application program associated with the target part and displaying its first interface may include: when the limb part is a foot, determining the navigation application corresponding to the foot and displaying a navigation interface for navigating to the person's home address; and when the limb part is a hand, determining the memo application corresponding to the hand and displaying a record interface of the most recent contact with the person.
It should be understood that in the corresponding operation examples above, the association between a specific target part and a target operation may be set by the user or determined automatically from big data. For example, clicking a person's eyes may automatically open a photographing application, and clicking a person's ears may automatically play music. This is not specifically limited here.
Further, after receiving the first input, if the object corresponding to the first input corresponds to multiple operations, a selection list of those operations may be displayed; the user selects one of them, and the selected operation is then executed automatically.
Illustratively, after a first input on a person's eyes is received, the displayed list may include: "video-call user A", "open camera", and "play movie B". On receiving the user's selection of the "video-call user A" option, a video call request is sent to user A and a video call interface with user A is displayed.
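The selection-list flow above can be sketched as follows; `choose` is a hypothetical stand-in for the user's selection input on the list, and the option strings follow the eye example:

```python
# Hypothetical sketch of the disambiguation step: when the first-input object
# maps to several operations, show a selection list and execute only the one
# the user picks; a single candidate is executed directly.
def handle_first_input(ops, choose, executed):
    """`choose(ops)` stands in for the user's selection input."""
    if len(ops) == 1:
        executed.append(ops[0])       # unambiguous: run directly
    else:
        executed.append(choose(ops))  # ambiguous: display list, ask the user

executed = []
eye_ops = ["video-call user A", "open camera", "play movie B"]
# The user's selection picks the first option, as in the example above.
handle_first_input(eye_ops, lambda options: options[0], executed)
```

Only the chosen operation runs; the other candidates in the list are discarded.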
In an embodiment, when the image includes an animal and the object corresponding to the first input is a target part of the animal, step 1200 may specifically include: determining an application program associated with the target part, and displaying a first interface of that application program; where the target part includes at least one of a body part and a limb part.
Optionally, when the target part is a body part, the electronic device's determining an application associated with the target part and displaying its first interface may include: determining the video playing application corresponding to the body part, and displaying a playing interface for the animal's activity video.
Optionally, when the target part is a limb part, the electronic device's determining an application associated with the target part and displaying its first interface may include: determining the positioning application corresponding to the limb part, and displaying an interface showing the animal's current location information.
In an embodiment, when the image includes a static object and the object corresponding to the first input is that static object, step 1200 may specifically include: determining an application program associated with the static object, and displaying a first interface of that application program; where the static object includes at least one of a scenic spot, a two-dimensional code, a card, and a table.
Optionally, when the static object is a bank card, the electronic device determines the bank client application corresponding to the bank card and displays a bank card transfer interface.
Optionally, when the static object is a scenic spot, the electronic device determines the browser application corresponding to the scenic spot and displays a detailed introduction interface for the scenic spot.
Optionally, when the static object is a two-dimensional code, the electronic device determines the payment application corresponding to the two-dimensional code and displays a payment interface.
In an embodiment, when displaying the first interface of the application program, the electronic device may display it in the form of a floating control.
For example, the electronic device may display a video call interface, a voice call interface, a video playing interface, a voice playing interface, a navigation interface, a recording interface, a positioning information display interface, a transfer interface, a detailed introduction interface, a payment interface, and the like in the form of a floating control.
The floating control may be displayed at any position on the electronic device's screen: for example, a preset position, or a predetermined position to which the user drags it. This embodiment does not specifically limit this.
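A minimal sketch of the floating-control placement just described, under assumed screen and control dimensions (all numbers hypothetical): the control starts at a preset position and, when dragged, is clamped so it stays fully on screen:

```python
# Hypothetical sketch: place the floating control at a preset position, or at
# a dragged-to position clamped to the screen bounds. Dimensions are invented.
def clamp(value: int, lo: int, hi: int) -> int:
    return max(lo, min(value, hi))

def place_floating_control(drag_to=None, screen=(1080, 2340),
                           size=(300, 200), preset=(100, 100)):
    """Return the control's top-left corner after an optional drag."""
    x, y = drag_to if drag_to is not None else preset
    return (clamp(x, 0, screen[0] - size[0]),
            clamp(y, 0, screen[1] - size[1]))
```

With no drag the preset position is used; a drag far off screen lands the control at the nearest edge instead of losing it.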
The operation method of this embodiment receives a first input from the user on an image while the image is displayed, and determines the target operation according to the object corresponding to the first input, where the target operation is associated with the object's characteristic information. Thus, while browsing an image, the user can trigger the operation for an object through a first input on it, which simplifies the user's operation process and improves the user experience.
In the operation method provided in the embodiments of the present application, the execution subject may be an operating device, or a control module in the operating device for executing the operation method. Below, an operating device executing the operation method is taken as the example to describe the operating device provided by the embodiments of the present application.
Fig. 2 is a schematic structural diagram of an operating device according to an embodiment of the present application. As shown in fig. 2, the operation device 200 of the embodiment of the present application may include: a receiving module 210 and an executing module 220.
The receiving module 210 is configured to receive a first input of a user on an image in a case that the image is displayed.
The executing module 220 is configured to determine a target operation according to an object corresponding to the first input. Wherein the target operation is associated with characteristic information of the object.
In this embodiment of the application, while the user is browsing an image, a first input on the image can trigger the operation for the corresponding object, which simplifies the user's operation process and improves the user experience.
In one embodiment, the operating device further includes: an acquisition module, configured to acquire the type of the object; and a determining module, configured to determine a corresponding operation set according to the type of the object and to determine the target operation from that operation set.
In this embodiment of the application, the corresponding operation set is determined from the object's type, and the target operation is then determined from that set, so the operation corresponding to the object is more accurate, the user's operation flow is simplified, and the user experience is improved.
In one embodiment, the image includes a person, and the object corresponding to the first input is a target part of the person; the execution module 220 is specifically configured to: determining an application program associated with the target part, and displaying a first interface of the application program; wherein the target site comprises at least one of: five sense organs, limbs, attachments of the human.
In the embodiment of the application, the first interface of the application program associated with the target part can be determined and displayed through the first input of the image under the condition that the user browses the image, so that the use of corresponding functions is quickly realized, the operation process of the user is simplified, and the user experience is improved.
In one embodiment, the target site is a five-sense organ site, and the execution module 220 is specifically configured to: under the condition that the five sense organ parts are eyes, determining an instant messaging application program related to the eyes, and displaying a video call interface of the instant messaging application program; and under the condition that the five sense organ parts are mouths, determining a communication application program associated with the mouths, and displaying a voice communication interface of the communication application program.
In one embodiment, the execution module 220 is specifically configured to: display the first interface of the application program in the form of a floating control.
In the embodiment of the application, when the user is browsing an image, a first input on the image can cause the first interface of the application program associated with the object to be displayed in the form of a floating control, so that the corresponding function is quickly used; this simplifies the user's operation flow and improves the user experience.
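The end-to-end flow of the embodiments so far (first input → detected object → application interface, optionally shown as a floating control) can be sketched as one dispatch function. All names are illustrative assumptions, not the patent's API.

```python
# End-to-end sketch: the detected object of the first input selects an
# action, and the first interface may be displayed as a floating control
# over the image, or full screen.
def handle_first_input(detected_object, floating=True):
    """Return an action descriptor for the detected object, or None
    if no action is associated with it."""
    actions = {
        "eye": {"app": "instant_messaging", "interface": "video_call"},
        "mouth": {"app": "communication", "interface": "voice_call"},
    }
    action = actions.get(detected_object)
    if action is None:
        return None
    # The display mode controls whether the first interface appears as
    # a floating control above the image or replaces the image view.
    return dict(action,
                display_mode="floating_control" if floating else "full_screen")
```

With the default `floating=True`, a tap on an eye would open the video call interface as a floating control over the image being browsed.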
The operation device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device. The mobile electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the embodiments of the present application are not specifically limited.
The operating device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited.
The operation device provided in the embodiment of the present application can implement each process of the method embodiment of fig. 1; to avoid repetition, details are not repeated here.
Optionally, as shown in fig. 3, an embodiment of the present application further provides an electronic device 300, including a processor 310, a memory 320, and a program or instruction stored in the memory 320 and executable on the processor 310. When executed by the processor 310, the program or instruction implements the processes of the operation method embodiment shown in fig. 1 and can achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device described above.
Fig. 4 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, and a processor 410.
Those skilled in the art will appreciate that the electronic device 400 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 410 through a power management system, so as to implement charging, discharging, and power-consumption management functions through the power management system. The electronic device structure shown in fig. 4 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or use a different arrangement of components, which is not described in detail here.
The user input unit 407 is configured to receive a first input of a user on an image in a case where the image is displayed.
The processor 410 is configured to determine a target operation according to an object corresponding to the first input; wherein the target operation is associated with characteristic information of the object.
In the embodiment of the application, when the user is browsing an image, a first input on the image can trigger execution of the operation corresponding to the object of the first input; this simplifies the user's operation flow and improves the user experience.
In one embodiment, the processor 410 is specifically configured to: determine the type of the object according to the object corresponding to the first input; determine a corresponding operation set according to the type of the object; and determine the target operation from the operation set.
In the embodiment of the application, a first input on the image triggers the operation corresponding to the object of the first input; the corresponding operation set is determined according to the type of the object, and the target operation in that set is then executed; this simplifies the user's operation flow and improves the user experience.
In one embodiment, the image includes a person, and the object corresponding to the first input is a target part of the person; the processor 410 is specifically configured to: determine an application program associated with the target part, and display a first interface of the application program; wherein the target part comprises at least one of: a facial feature, a limb, or an accessory of the person.
In the embodiment of the application, when the user is browsing an image, a first input on the image can determine and display the first interface of the application program associated with the target part, so that the corresponding function is quickly used; this simplifies the user's operation flow and improves the user experience.
In one embodiment, the target part is a facial feature, and the processor 410 is specifically configured to: in a case where the facial feature is an eye, determine an instant messaging application program associated with the eye, and display a video call interface of the instant messaging application program; and in a case where the facial feature is a mouth, determine a communication application program associated with the mouth, and display a voice call interface of the communication application program.
In one embodiment, the processor 410 is specifically configured to: display the first interface of the application program in the form of a floating control.
In the embodiment of the application, when the user is browsing an image, a first input on the image can cause the first interface of the application program associated with the object to be displayed in the form of a floating control, so that the corresponding function is quickly used; this simplifies the user's operation flow and improves the user experience.
It should be understood that, in the embodiment of the present application, the input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042; the graphics processor 4041 processes image data of a still picture or of video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 406 may include a display panel 4061, and the display panel 4061 may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The user input unit 407 includes a touch panel 4071, also referred to as a touch screen, and other input devices 4072. The touch panel 4071 may include two parts: a touch detection device and a touch controller. Other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 409 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 410 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may alternatively not be integrated into the processor 410.
An embodiment of the present application further provides a readable storage medium on which a program or instruction is stored. When executed by a processor, the program or instruction implements each process of the operation method embodiment shown in fig. 1 and can achieve the same technical effects; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface being coupled to the processor; the processor is configured to execute a program or instruction to implement each process of the operation method embodiment shown in fig. 1 and can achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; they may also be performed in a substantially simultaneous manner or in a reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A method of operation, the method comprising:
receiving a first input of a user on an image in a case where the image is displayed;
executing a target operation according to the object corresponding to the first input;
wherein the target operation is associated with characteristic information of the object.
2. The method of claim 1, wherein before performing the target operation according to the object corresponding to the first input, the method further comprises:
acquiring the type of the object;
determining a corresponding operation set according to the type of the object;
determining the target operation from the set of operations.
3. The method of claim 1, wherein the image includes a person, and the object corresponding to the first input is a target part of the person;
the executing of the target operation according to the object corresponding to the first input comprises:
determining an application program associated with the target part, and displaying a first interface corresponding to the application program;
wherein the target part comprises at least one of: a facial feature, a limb, or an accessory of the person.
4. The method of claim 3, wherein the target part is a facial feature, and wherein determining the application program associated with the target part and displaying the first interface corresponding to the application program comprises:
in a case where the facial feature is an eye, determining an instant messaging application program associated with the eye, and displaying a video call interface of the instant messaging application program; and
in a case where the facial feature is a mouth, determining a communication application program associated with the mouth, and displaying a voice call interface of the communication application program.
5. The method of claim 3, wherein displaying the first interface of the application comprises:
and displaying a first interface of the application program in the form of a floating control.
6. An operating device, characterized in that the device comprises:
the receiving module is used for receiving a first input of a user on an image in a case where the image is displayed;
the execution module is used for executing a target operation according to the object corresponding to the first input;
wherein the target operation is associated with characteristic information of the object.
7. The apparatus of claim 6, further comprising:
the acquisition module is used for acquiring the type of the object;
the determining module is used for determining a corresponding operation set according to the type of the object; determining the target operation from the set of operations.
8. The apparatus of claim 6, wherein the image comprises a person, and the object corresponding to the first input is a target part of the person;
the execution module is specifically configured to: determine an application program associated with the target part, and display a first interface of the application program; wherein the target part comprises at least one of: a facial feature, a limb, or an accessory of the person.
9. The apparatus of claim 8, wherein the target part is a facial feature, and wherein the execution module is specifically configured to:
in a case where the facial feature is an eye, determine an instant messaging application program associated with the eye, and display a video call interface of the instant messaging application program; and
in a case where the facial feature is a mouth, determine a communication application program associated with the mouth, and display a voice call interface of the communication application program.
10. The apparatus of claim 8, wherein the execution module is specifically configured to: display the first interface of the application program in the form of a floating control.
CN202111279985.2A 2021-10-29 2021-10-29 Operating method and device Pending CN113986084A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111279985.2A CN113986084A (en) 2021-10-29 2021-10-29 Operating method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111279985.2A CN113986084A (en) 2021-10-29 2021-10-29 Operating method and device

Publications (1)

Publication Number Publication Date
CN113986084A true CN113986084A (en) 2022-01-28

Family

ID=79745082

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111279985.2A Pending CN113986084A (en) 2021-10-29 2021-10-29 Operating method and device

Country Status (1)

Country Link
CN (1) CN113986084A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080096042A (en) * 2007-04-26 2008-10-30 엘지전자 주식회사 Mobile communication device and control method thereof
US20100245241A1 (en) * 2009-03-30 2010-09-30 Samsung Electronics Co., Ltd. Apparatus and method for controlling functions of mobile terminal
CN102065166A (en) * 2010-12-30 2011-05-18 东莞宇龙通信科技有限公司 Mobile terminal and method and system for carrying out rapid and convenient operation on contacts
CN103369108A (en) * 2012-03-29 2013-10-23 宇龙计算机通信科技(深圳)有限公司 Mobile terminal operating method and mobile terminal
CN110286755A (en) * 2019-06-12 2019-09-27 Oppo广东移动通信有限公司 Terminal control method, device, electronic equipment and computer-readable storage medium

Similar Documents

Publication Publication Date Title
US11868939B2 (en) Fitness challenge e-awards
US11331007B2 (en) Workout monitor interface
KR102311622B1 (en) Display of a scrollable list of affordances associated with physical activities
DK179992B1 (en) Displaying user interfaces associated with physical activities
US10552004B2 (en) Method for providing application, and electronic device therefor
CN109463001A (en) Activity and body-building update
CN109219796A (en) Digital touch on real-time video
KR20170140085A (en) Data driven natural language event detection and classification
CN109691074A (en) The image data of user's interaction for enhancing
KR20160010608A (en) Attributing user action based on biometric identity
CN109189986B (en) Information recommendation method and device, electronic equipment and readable storage medium
CN110460799A (en) Intention camera
CN111698564B (en) Information recommendation method, device, equipment and storage medium
KR20230058718A (en) Coexistence shared augmented reality without a shared backend
JP2023524119A (en) Facial image generation method, device, electronic device and readable storage medium
US11714536B2 (en) Avatar sticker editor user interfaces
CN111803960B (en) Method and device for starting preset flow
CN107292221B (en) Track processing method and device and track processing device
CN113849065A (en) Method and device for triggering client operation instruction by using body-building action
US20230305688A1 (en) Avatar sticker editor user interfaces
KR102516079B1 (en) Rehabilitation recovery exercise recommendation system and method based on living environment sensing using digital image recoginition
US20230389806A1 (en) User interfaces related to physiological measurements
CN113986084A (en) Operating method and device
CN113542257A (en) Video processing method, video processing apparatus, electronic device, and storage medium
CN113470614A (en) Voice generation method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination