WO2021121093A1 - Image control method, electronic device and storage medium - Google Patents

Image control method, electronic device and storage medium

Info

Publication number
WO2021121093A1
WO2021121093A1 (PCT/CN2020/134815)
Authority
WO
WIPO (PCT)
Prior art keywords
image
electronic device
input
information
control
Prior art date
Application number
PCT/CN2020/134815
Other languages
English (en)
Chinese (zh)
Inventor
王程刚
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2021121093A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10Multimedia information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/18Commands or executable codes

Definitions

  • the embodiments of the present invention relate to the field of communication technology, and in particular, to an image control method, electronic equipment, and storage medium.
  • user B directly sends a screenshot of the group member list interface to user A.
  • After user A receives the picture, he needs to open the application, enter the member names from the picture one by one, and then complete the operation of adding friends. Similarly, if user A needs to configure a certain function and asks user B for help, user B takes a screenshot of the setting interface and sends it to user A; user A then needs to open the settings and select the corresponding options one by one according to the content of the image to complete the configuration. The operation process is cumbersome.
  • the present invention is implemented as follows:
  • an embodiment of the present invention provides an electronic device.
  • the electronic device includes: a receiving module, configured to receive a first input from a user; a display module, configured to display N controls on a first image in response to the first input; and a sending module, configured to send the first image to the first electronic device; where N is a positive integer.
  • an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored in the memory and running on the processor.
  • when the computer program is executed by the processor, the steps of the image control method in the first aspect are implemented.
  • an embodiment of the present invention provides a computer-readable storage medium storing a computer program on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the image control method in the first aspect are implemented.
  • Fig. 4 is the second schematic diagram of displaying the first image provided by the embodiment of the present invention.
  • Fig. 5 is the fourth schematic diagram of displaying the first image provided by the embodiment of the present invention.
  • Fig. 6 is the fifth schematic diagram of displaying the first image provided by the embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a first electronic device according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a second electronic device according to an embodiment of the present invention.
  • words such as “exemplary” or “for example” are used to present examples, instances, or illustrations. Any embodiment or design solution described as “exemplary” or “for example” in the embodiments of the present invention should not be construed as more preferable or advantageous than other embodiments or design solutions. Rather, the use of words such as “exemplary” or “for example” is intended to present related concepts in a concrete manner.
  • an embodiment of the present invention provides an image control method applied to a first electronic device.
  • the method may include the following steps 111 to 114.
  • Step 111 Receive a first image sent by a second electronic device, where the first image includes N controls;
  • the first image may be a screenshot or any other image sent by the second electronic device.
  • displaying the first image can display N controls at the same time.
  • the first image is a screenshot of the address book list sent by the second electronic device and includes N controls; the identifier 402 indicates one of the N controls. Displaying the first image may mean displaying only the content of the first image.
  • N controls are displayed. Exemplarily, as shown in FIG. 3, after the user clicks the "Get Picture Information" control, N controls can be displayed, and the result shown in FIG. 4 can be obtained.
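The structure described above — a first image carrying N controls, each of which can respond to the user's first input — can be sketched as a simple data model. This is an illustrative sketch, not code from the patent; the names `AnnotatedImage`, `Control`, and `hit_test` are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Control:
    """A control embedded in the first image, defined by its bounding box."""
    label: str
    rect: Tuple[int, int, int, int]   # (x, y, width, height) in image pixels
    operation: str                    # operation identifier set by the sender

@dataclass
class AnnotatedImage:
    """A first image carrying N controls, as sent by the second device."""
    pixels: bytes
    controls: List[Control] = field(default_factory=list)

    def hit_test(self, x: int, y: int) -> Optional[Control]:
        """Return the control whose bounding box contains the tap point."""
        for c in self.controls:
            cx, cy, w, h = c.rect
            if cx <= x < cx + w and cy <= y < cy + h:
                return c
        return None

# Mirrors the figure: control 402 ("Add to Contacts") at an assumed position.
img = AnnotatedImage(pixels=b"", controls=[
    Control(label="Add to Contacts", rect=(300, 120, 80, 30),
            operation="add_contact"),
])
```

A tap at (310, 125) would resolve to the "Add to Contacts" control; a tap outside every bounding box resolves to no control.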
  • the first object may include, but is not limited to, pictures, text, numbers, controls, and so on.
  • the first object 401 is a contact person and a communication number.
  • the first object is an object associated with the first control, and the association relationship between the first object and the first control can be set by the second electronic device.
  • the first operation associated with the first control is set by the second electronic device.
  • the first image is an image sent by the second electronic device
  • the first image is a screenshot of the address book
  • the first image area of the first image may be the first column in the address book list
  • the first control can be the control "Add to Contacts" indicated by 402
  • the first object can be the object associated with this control, such as Tony 13812345678, and the first operation associated with the control can be to add Tony 13812345678 to the address book.
  • the first operation is determined based on the object type of the first object. In this way, the first operation can be determined according to the type of the first object.
  • the object type of the first object may be acquired first; the first operation is determined based on the object type of the first object.
  • the object types can be pictures, applications, numbers, etc.
  • other object types are also included in the protection scope of the present invention.
  • if the object type of the first object is a number, the first operation may be to add the number to the chat application, for example to the address book; if the object type of the first object is a picture, the first operation may be determined to be extracting the image; if the first object is an application program icon, the first operation may be determined to be downloading the application program.
  • the first operation is not limited to the several cases listed above, and other cases are also included in the protection scope of the present invention.
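The type-to-operation rule above can be sketched as a small lookup table. Only the three example pairings (number → add to contacts, picture → extract, app icon → download) come from the text; the key strings, operation names, and fallback value are illustrative assumptions.

```python
# Hypothetical mapping from the object type of the first object to the
# first operation, following the examples in the text.
DEFAULT_OPERATIONS = {
    "number": "add_to_contacts",   # e.g. add the number to the address book
    "picture": "extract_image",    # e.g. cut the picture out of the first image
    "app_icon": "download_app",    # e.g. download the application program
}

def determine_first_operation(object_type: str) -> str:
    """Determine the first operation based on the object type of the first object."""
    return DEFAULT_OPERATIONS.get(object_type, "unsupported")
```

Other object types would extend the table in the same way, consistent with the text's note that the listed cases are not exhaustive.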
  • the target area of the target interface of the target application may be a full-screen area of the target interface of the target application, or may be a partial area of the target interface of the target application.
  • a partial area of the target interface of the target application may be any column in the contact list, or may be the area where the input box for editing a contact in the contact-adding interface is located.
  • adding the first object to the target area of the target interface of the target application, or updating the interface content of the target interface of the target application may be determined by the second electronic device.
  • the first object 401 is a name and a phone number, such as Tony 13812345678, and the first operation associated with the first control 402 may be to add the phone number to the address book list.
  • the user can click the first control 402 to add Tony 13812345678 to the address book list.
  • a first image sent by a second electronic device is received; the first image includes N controls and is a screenshot of the second electronic device's setting interface, specifically a screenshot of the display and brightness setting interface.
  • the screenshot shows setting options such as "Automatically adjust screen brightness", "Global eye protection", "Dark mode", etc.
  • the user can click the "Add to my settings" control indicated by 502 to set the "Automatically adjust screen brightness" option of the first electronic device to the on state, or click the "Add all to my settings" control indicated by 503 to make the display and brightness settings of the first electronic device consistent with the image, that is, to update the content of the display and brightness setting interface of the first electronic device. In this way, the user does not need to open the display and brightness setting interface and configure the options one by one according to the content of the first image, which simplifies the user's operation.
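The two controls in this example — applying one option versus applying all options from the screenshot — can be sketched as a settings merge. The dictionary representation, function name, and setting keys are assumptions for illustration.

```python
def apply_settings_from_image(local_settings, image_settings, keys=None):
    """Update local settings to match the settings carried by the first image.

    keys=None mirrors the "Add all to my settings" control (503);
    a one-element list mirrors the per-option "Add to my settings" control (502).
    """
    updated = dict(local_settings)          # do not mutate the caller's dict
    for key in (image_settings if keys is None else keys):
        if key in image_settings:
            updated[key] = image_settings[key]
    return updated

remote = {"auto_brightness": True, "dark_mode": True}    # parsed from the image
local = {"auto_brightness": False, "dark_mode": False}

one = apply_settings_from_image(local, remote, keys=["auto_brightness"])
everything = apply_settings_from_image(local, remote)
```

The per-option path changes only the clicked setting; the "add all" path brings every setting in line with the image.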
  • the second input may be a second operation, which may include, but is not limited to, the user's click touch input, double-tap touch input, sliding touch input, long-press touch input, and the like, on the second image area in the first image.
  • the second image area in the first image may be any area in the first image, which is not limited in the embodiment of the present invention.
  • Step b In response to the second input, sending first information to a second electronic device, where the first information includes first position information of the second object in the second image area in the first image, or the image content of the second image area;
  • the second information is information associated with the second object, and may be obtained by the second electronic device based on the first information.
  • the second information associated with the second object may be obtained by the second electronic device based on the first information and obtained by querying the association relationship table stored in the server.
  • a first image sent by a second electronic device is received.
  • the first image is a screenshot of the second electronic device’s desktop.
  • the user can click on the second image area of the first image, for example the area where app4 is located, and the first information can be sent to the second electronic device.
  • the first information can be the first position information of the second object in the first image, such as the position information of app4 in the first image, that is, the coordinates of app4 in the first image's coordinate system. It should be noted that the coordinate system may be established with any point in the first image as the origin.
  • the first information may also be the image content of the second image area, such as the name of app4.
  • the second information sent by the second electronic device may be received.
  • the second information is information associated with the second object, which may be a download link of app4 or an installation package of app4.
  • a first image sent by a second electronic device is received.
  • the first image is a screenshot of the album interface of the second electronic device.
  • the user can click on the second image area of the first image, for example click picture 1, to send the first information to the second electronic device.
  • the first information may be the first position information of the second object in the first image, such as the position information of picture 1 in the first image. This can be the coordinate position information of picture 1 in the first image, or the information that picture 1 is located in the second row and first column of the first image.
  • the first information may also be the image content of the second image area, such as the image content of picture 1.
  • the second information sent by the second electronic device is received, and the second information may be the original image of picture 1.
  • the second information may be displayed. If the second information is the original image of the picture 1, the original image of the picture 1 may be displayed. Corresponding operations can also be performed based on the second information. For example, if the second information is a download link of app4, app4 can be downloaded based on the download link. Of course, it is not limited to the several situations listed above, and the specifics can be determined according to the actual situation.
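The exchange in these two examples — first information carrying either a tap position or the image content of the second image area, answered by second information — can be sketched as a tiny message format. The JSON encoding and field names are assumptions; the patent does not specify any wire format.

```python
import json

def build_first_information(position=None, image_content=None) -> str:
    """Encode the first information sent to the second electronic device:
    either the position of the second object in the first image, or the
    image content of the second image area (exactly one of the two)."""
    if (position is None) == (image_content is None):
        raise ValueError("provide exactly one of position or image_content")
    if position is not None:
        x, y = position
        return json.dumps({"kind": "position", "x": x, "y": y})
    return json.dumps({"kind": "content", "content": image_content})

msg = build_first_information(position=(120, 260))   # e.g. a tap on app4
```

The second electronic device would decode this message, locate the second object, and reply with the associated second information (a download link, an installation package, or an original picture).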
  • a first image sent by a second electronic device is received, where the first image includes N controls; the first image is displayed; a user's first input to a first control among the N controls is received; and in response to the first input, a first operation associated with the first control is performed on a first object, the first object being an object in a first image area of the first image and an object associated with the first control; where N is a positive integer.
  • the associated operation can be performed according to the user's input to the control in the image, which simplifies the operation process.
  • the first operation can be determined according to the type of the first object.
  • Step 201 Receive the user's first input
  • the first input may be the first operation, which may include, but is not limited to, the user's click touch input, double-tap touch input, sliding touch input, long-press touch input, and the like.
  • Step 202 In response to the first input, display N controls on the first image; where N is a positive integer.
  • N controls are displayed on the first image, where the identifier 402 is used to indicate one control.
  • the N controls may be displayed in the first image and sent to the first electronic device, or the N controls may be hidden and then sent to the first electronic device.
  • the electronic device can display N controls by acquiring image information. Exemplarily, as shown in FIG. 3, after the user clicks the "Get Picture Information" option, N controls can be displayed, and the result shown in FIG. 4 can be obtained.
  • the first input includes an input used to trigger the addition of a control
  • obtaining the object type of the i-th object in the i-th image area of the first image may be done by applying image recognition technology to the i-th object in the i-th image area of the first image to obtain its object type.
  • other acquisition methods are also included in the protection scope of the present invention.
  • the electronic device can establish and store an association relationship between controls and operations.
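The association relationship between controls and operations mentioned above can be sketched as a small table mapping a control identifier to a callable operation. All names here are illustrative assumptions; control id "402" echoes the "Add to Contacts" control from the figures.

```python
class ControlOperationTable:
    """Stores the association relationship between controls and operations."""

    def __init__(self):
        self._operations = {}

    def associate(self, control_id, operation):
        """Record that triggering control_id performs the given operation."""
        self._operations[control_id] = operation

    def execute(self, control_id, first_object):
        """Perform the operation associated with control_id on the first object."""
        if control_id not in self._operations:
            raise KeyError(f"no operation associated with control {control_id}")
        return self._operations[control_id](first_object)

address_book = []
table = ControlOperationTable()
table.associate("402", address_book.append)  # control 402: add to contacts
table.execute("402", "Tony 13812345678")
```

When the user's first input lands on control 402, executing its associated operation adds the first object (the contact entry) to the address book.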
  • control 402 may be displayed; as shown in FIG. 5, the control 502 may be displayed; as shown in FIG.
  • the receiving the first input of the user includes:
  • receiving the first sub-input of the user on the m-th image area in the first image may include, but is not limited to, the user's click touch input, double-tap touch input, sliding touch input, long-press touch input, etc., on the m-th image area in the first image.
  • the second sub-input may include, but is not limited to, a click touch input, a double-tap touch input, a sliding touch input, a long-press touch input, and the like.
  • the m-th control is displayed in the m-th image area. In this way, controls can be added to the first image and related operations can be associated according to the user's input.
  • the method further includes:
  • Extracting the first position information in the first information or the image content of the second image area, where the first position information is the position information, in the first image, of the second object in the second image area;
  • Step e Acquire second information associated with the first information based on the first location information or the image content of the second image area;
  • the first information sent by the first electronic device is received, and the first information may be the first position information of the object in the first image, such as the position information of app4 in the first image.
  • the first information may also be the image content of the second image area, such as the name of app4.
  • the second electronic device extracts the first location information in the first information or the image content of the second image area, locates the second object in the first image, and obtains the second information associated with the second object.
  • the second information can be the download link of app4 or the installation package of app4, which can be determined according to the actual situation.
  • the first information sent by the first electronic device may be the first position information of the object 703 in the first image, such as the position information of the picture 1 in the first image.
  • the first information may also be the image content of the second image area, such as the image content of picture 1.
  • the second electronic device extracts the first location information in the first information or the image content of the second image area, and obtains the second information associated with the first information.
  • the second information may be the original image of the picture 1.
  • the information facilitates the interactive operation between the first electronic device and the second electronic device.
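On the second electronic device, resolving the first information into the second information can be sketched as a two-step lookup: locate the second object (by position or by content), then consult an association table. The layout table, association table, and all names are illustrative assumptions.

```python
def resolve_second_information(first_info, layout, associations):
    """Locate the second object named by the first information, then return
    the second information associated with it (e.g. a download link or an
    original picture), or None if nothing matches."""
    if first_info.get("kind") == "position":
        x, y = first_info["x"], first_info["y"]
        for name, (left, top, width, height) in layout.items():
            if left <= x < left + width and top <= y < top + height:
                return associations.get(name)
        return None
    return associations.get(first_info.get("content"))

layout = {"app4": (100, 240, 60, 60)}             # bounding boxes in the screenshot
associations = {"app4": "download-link-for-app4"}  # placeholder second information

info = resolve_second_information({"kind": "position", "x": 120, "y": 260},
                                  layout, associations)
```

Both forms of first information resolve to the same second information; a position outside every bounding box resolves to nothing.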
  • the receiving module 121 is configured to receive a first image sent by a second electronic device, where the first image includes N controls; the display module 122 is configured to display the first image; the receiving module 121 is also used to receive a user's first input to the first control among the N controls; the execution module 123 is used to perform, in response to the first input, the first operation associated with the first control on the first object.
  • the first object is an object in a first image area of the first image, and the first object is an object associated with the first control; where N is a positive integer.
  • the first operation is determined based on the object type of the first object. In this way, the first operation can be determined according to the type of the first object.
  • the receiving module 121 is further configured to receive a user's second input to the second image area in the first image; the electronic device further includes: a sending module, configured to respond to the second Input, send first information to a second electronic device, the first information includes the first position information of the second object in the second image area in the first image or the image content of the second image area
  • the receiving module 121 is also used to receive second information sent by the second electronic device, where the second information is information associated with the second object. In this way, according to the input of the first electronic device at any position in the first image, the first information can be automatically triggered to be sent to the second electronic device, and the second information sent by the second electronic device based on the first information can be received.
  • the first object can be added to the target area of the target interface of the target application, or the interface content of the target interface of the target application can be updated, so the user does not need to open the target interface of the target application to perform operations, which simplifies the user's operation. Sending the first information to the second electronic device can be triggered automatically according to the input of the first electronic device at any position in the first image, and the second information needed by the user, sent by the second electronic device based on the first information, can be received.
  • an embodiment of the present invention provides an electronic device 130, and the electronic device 130 includes:
  • the first input includes an input for triggering adding a control
  • the electronic device further includes: an acquiring module configured to acquire the object type of the i-th object in the i-th image area of the first image;
  • the determining module is used to determine the i-th operation associated with the i-th control based on the object type;
  • the display module 132 is also used to display the i-th control in the i-th image area; where i is a positive integer, and i ≤ N.
  • the i-th operation associated with the i-th control can be determined according to the object type of the i-th object in the i-th image area in the first image, that is, the control can be automatically added based on the object type, and the control-associated operation can be added.
  • the receiving the user's first input includes: receiving the user's first sub-input on the m-th image area in the first image, which is used to create the m-th control; the display module 132 is further used for displaying the m-th control in the m-th image area in response to the first sub-input and the second sub-input.
  • controls can be added to the first image and related operations can be associated according to the user's input.
  • the second electronic device can send the second information required by the user based on the first information, which facilitates interaction between the first electronic device and the second electronic device.
  • the electronic device can receive a user's first input; in response to the first input, display N controls on a first image; and send the first image to the first electronic device;
  • N is a positive integer, that is, by receiving the user's first input, N controls are displayed on the first image, and then the first image is sent to the first electronic device.
  • the i-th operation associated with the i-th control can be determined according to the object type of the i-th object in the i-th image area in the first image, that is, the control can be automatically added based on the object type, and the control-associated operation can be added.
  • the processor 110 is configured to receive a first image sent by a second electronic device, where the first image includes N controls; the display unit 106 is configured to display the first image; the user input unit 107 is also configured to receive a user's first input to a first control of the N controls; the processor 110 is configured to perform a first operation associated with the first control on a first object in response to the first input, the first object being an object in the first image area of the first image and an object associated with the first control; where N is a positive integer.
  • the electronic device can receive a first image sent by a second electronic device, where the first image includes N controls; display the first image; receive the user's first input to the first control; and, in response to the first input, perform a first operation associated with the first control on a first object, the first object being an object in the first image area of the first image
  • the first object is an object associated with the first control; where N is a positive integer. Therefore, through this solution, the associated operation can be performed according to the user's input to the control in the image, which simplifies the operation process.
  • the electronic device can receive a user's first input; in response to the first input, display N controls on a first image; and send the first image to the first electronic device;
  • N is a positive integer, that is, by receiving the user's first input, N controls are displayed on the first image, and then the first image is sent to the first electronic device.
  • the radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and then processed by the processor 110; in addition, uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 is configured to process the image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output.
  • the electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, magnetometer attitude calibration) and for vibration-recognition related functions (such as pedometer, tapping), etc. The sensor 105 can also include a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, infrared sensor, etc., which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 110; it also receives and executes commands sent by the processor 110.
  • the touch panel 1071 can be realized by various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, it transmits it to the processor 110 to determine the type of the touch event, and then the processor 110 provides corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although the touch panel 1071 and the display panel 1061 are used as two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the electronic device, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the electronic device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (Input/Output, I/O) port, video I/O port, headphone port, etc.
  • the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 100, or can be used to transfer data between the electronic device 100 and the external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.), and the like.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 110 is the control center of the electronic device; it uses various interfaces and lines to connect the various parts of the entire electronic device, runs or executes the software programs and/or modules stored in the memory 109, and invokes the data stored in the memory 109, thereby performing the various functions of the electronic device and processing data, so as to monitor the electronic device as a whole.
  • the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
  • the electronic device 100 may also include a power source 111 (such as a battery) for supplying power to various components.
  • the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the electronic device 100 may further include some functional modules that are not shown, which will not be repeated here.
  • an embodiment of the present invention also provides an electronic device, which may include the processor 110 shown in FIG. 10, the memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110. When the computer program is executed by the processor 110, each process of the image control method shown in any one of FIG. 2 to FIG. 7 in the foregoing method embodiments is realized, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the technical solution of the present invention essentially, or the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to make an electronic device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) execute the method described in each embodiment of the present invention.
  • the program can be stored in a computer-readable storage medium; when the program is executed, it may include the procedures of the foregoing method embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.
  • the modules, units, and sub-units can be implemented in one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), general-purpose processors, controllers, microcontrollers, microprocessors, other electronic units for executing the functions described in the present disclosure, or a combination thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is an image control method. The method comprises: receiving a first image sent by a second electronic device, the first image comprising N controls; displaying the first image; receiving a first input from a user on a first control among the N controls; and, in response to the first input, performing a first operation associated with the first control on a first object, wherein the first object is an object in a first image region of the first image, the first object is an object associated with the first control, and N is a positive integer.
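The flow summarized in the abstract (a first image carrying N controls, a first input on one control, and the associated operation applied to the associated object) can be sketched as follows. All names (`FirstImage`, `Control`, `on_first_input`) and the example "save" operation are illustrative assumptions; the disclosure does not specify an implementation.

```python
# Hypothetical sketch of the abstract's method: receive an image carrying N
# controls, then run the operation bound to the control the user selects
# on the object associated with that control.

from dataclasses import dataclass, field
from typing import Callable, Dict, Tuple

@dataclass
class Control:
    # First image region the control is bound to (x, y, width, height).
    region: Tuple[int, int, int, int]
    # First operation associated with the control.
    operation: Callable[[str], str]

@dataclass
class FirstImage:
    pixels: str
    # The N controls carried by the first image, keyed by an assumed id.
    controls: Dict[str, Control] = field(default_factory=dict)

def on_first_input(image: FirstImage, control_id: str, first_object: str) -> str:
    """In response to the first input, perform the first operation
    associated with the selected control on the first object."""
    control = image.controls[control_id]
    return control.operation(first_object)

# Usage: an image with one control whose operation "saves" the object.
img = FirstImage("...", {"save": Control((0, 0, 64, 64), lambda obj: f"saved {obj}")})
print(on_first_input(img, "save", "dog"))  # saved dog
```

Here N is simply `len(img.controls)`; the sketch keeps the association between control, image region, and object explicit, which is the core of the claimed method.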
PCT/CN2020/134815 2019-12-16 2020-12-09 Image control method, electronic device and storage medium WO2021121093A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911294231.7 2019-12-16
CN201911294231.7A CN111130995B (zh) 2019-12-16 2019-12-16 Image control method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
WO2021121093A1 true WO2021121093A1 (fr) 2021-06-24

Family

ID=70499245

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/134815 WO2021121093A1 (fr) 2019-12-16 2020-12-09 Image control method, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN111130995B (fr)
WO (1) WO2021121093A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111130995B (zh) * 2019-12-16 2021-08-10 维沃移动通信有限公司 Image control method, electronic device and storage medium
CN112843692B (zh) * 2020-12-31 2023-04-18 上海米哈游天命科技有限公司 Method and apparatus for capturing an image, electronic device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104636029A (zh) * 2014-12-31 2015-05-20 魅族科技(中国)有限公司 Display control method and system for controls
US20170371844A1 (en) * 2015-01-15 2017-12-28 Zte Corporation Method, device and terminal for implementing regional screen capture
WO2019104478A1 (fr) * 2017-11-28 2019-06-06 华为技术有限公司 Screenshot text recognition method and terminal
CN110456956A (zh) * 2019-08-05 2019-11-15 腾讯科技(深圳)有限公司 Screenshot method and apparatus, computer device and storage medium
CN111130995A (zh) * 2019-12-16 2020-05-08 维沃移动通信有限公司 Image control method, electronic device and storage medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0166046B1 (fr) * 1984-06-25 1988-08-24 International Business Machines Corporation Graphics display apparatus with pipelined processors
US8468465B2 (en) * 2010-08-09 2013-06-18 Apple Inc. Two-dimensional slider control
CN103197855A (zh) * 2013-04-19 2013-07-10 中国建设银行股份有限公司 Image file input method and apparatus
CN105549888B (zh) * 2015-12-15 2019-06-28 芜湖美智空调设备有限公司 Combined control generation method and apparatus
CN105700789B (zh) * 2016-01-11 2019-05-10 Oppo广东移动通信有限公司 Picture sending method and terminal device
CN108563473A (zh) * 2018-04-04 2018-09-21 歌尔股份有限公司 Scanning module configuration method and apparatus
CN108549568B (zh) * 2018-04-18 2020-01-31 Oppo广东移动通信有限公司 Application entry processing method and apparatus, storage medium, and electronic device
CN108769374B (zh) * 2018-04-25 2020-10-02 维沃移动通信有限公司 Image management method and mobile terminal
CN109829070B (zh) * 2019-01-29 2021-01-08 维沃移动通信有限公司 Image searching method and terminal device
CN109840129A (zh) * 2019-01-30 2019-06-04 维沃移动通信有限公司 Display control method and electronic device
CN110339568B (zh) * 2019-08-19 2024-06-21 网易(杭州)网络有限公司 Virtual control display method and apparatus, storage medium, and electronic apparatus

Also Published As

Publication number Publication date
CN111130995B (zh) 2021-08-10
CN111130995A (zh) 2020-05-08

Similar Documents

Publication Publication Date Title
WO2021098678A1 Screen recording control method and electronic device
WO2019141174A1 Mobile terminal and unread message processing method
WO2020258929A1 Folder interface switching method and terminal device
EP3786771B1 Message management method and terminal
WO2021109907A1 Application sharing method, first electronic device, and computer-readable storage medium
WO2021012931A1 Icon management method and terminal
WO2020151519A1 Information input method, terminal device, and computer-readable storage medium
WO2020238463A1 Message processing method and terminal
WO2021083087A1 Screenshot method and terminal device
WO2020238449A1 Notification message processing method and terminal
WO2021082716A1 Information processing method and electronic device
WO2020151525A1 Message sending method and terminal device
WO2020238497A1 Icon moving method and terminal device
WO2019196691A1 Keyboard interface display method and mobile terminal
WO2019196864A1 Virtual button control method and mobile terminal
WO2021083112A1 Information sharing method and electronic device
WO2021036553A1 Icon display method and electronic device
WO2020156123A1 Information processing method and terminal device
WO2020192299A1 Information display method and terminal device
WO2021036603A1 Application program control method and terminal
WO2020238445A1 Screen recording method and terminal
WO2020173405A1 Content sharing method and mobile terminal
WO2021109959A1 Application sharing method and electronic device
WO2020211612A1 Information display method and terminal device
WO2021238719A1 Information display method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20902588

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20902588

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16/02/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20902588

Country of ref document: EP

Kind code of ref document: A1