CN113873081B - Method and device for sending associated image and electronic equipment - Google Patents

Method and device for sending associated image and electronic equipment

Info

Publication number
CN113873081B
CN113873081B (application CN202111154586.3A)
Authority
CN
China
Prior art keywords
image
images
target
input
region
Prior art date
Legal status
Active
Application number
CN202111154586.3A
Other languages
Chinese (zh)
Other versions
CN113873081A
Inventor
袁文龙
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202111154586.3A
Publication of CN113873081A
Application granted
Publication of CN113873081B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a method and an apparatus for sending associated images, and an electronic device. The method for sending associated images includes: in response to a first input, associating N images, N being an integer greater than or equal to 2; receiving a second input on a target image among the N images; and in response to the second input, determining an image to be sent according to the target image and at least one image associated with the target image, and sending the image to be sent to a receiver.

Description

Method and device for sending associated image and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to a method and an apparatus for sending associated images, and an electronic device.
Background
At present, capturing images with an electronic device is increasingly common, and a user encounters many scenarios in which images need to be sent while using the electronic device. For example, when a user photographs another person with the electronic device, the user needs to pick out the well-taken pictures after shooting and send them to a receiver; or, when the user wants to share a commodity with others, the user may photograph the commodity and its price and send the pictures to the receiver.
When a user has shot a plurality of images and needs to send them to a receiver, the images have to be selected one by one and then sent, which is time-consuming and cumbersome; moreover, when a plurality of images are transmitted in succession, the association between the images may become unclear, so that the receiver cannot determine, from the received images alone, how the images relate to one another.
Disclosure of Invention
Embodiments of the present application provide a method, an apparatus, and an electronic device for sending associated images, so as to solve the problems in the prior art that sending multiple images is cumbersome and time-consuming and that the receiving party cannot determine the association between the images.
In a first aspect, an embodiment of the present application provides a method for sending an associated image, including:
in response to a first input, associating N images, N being an integer greater than or equal to 2;
receiving a second input on a target image among the N images;
and in response to the second input, determining an image to be sent according to the target image and at least one image associated with the target image, and sending the image to be sent to a receiver.
In a second aspect, an embodiment of the present application provides an apparatus for sending an associated image, including:
an association module, configured to associate N images in response to a first input, N being an integer greater than or equal to 2;
a receiving module, configured to receive a second input on a target image among the N images;
and a processing module, configured to, in response to the second input, determine an image to be sent according to the target image and at least one image associated with the target image, and send the image to be sent to a receiver.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the application, the N images are associated, and when a second input on a target image among the N images is received, the image to be sent is determined from the target image and at least one image associated with the target image. Multiple images can therefore be selected quickly and simply to determine the image to be sent, based on a single input on the target image, and the receiver can view the images conveniently and understand them quickly based on the association between them.
Drawings
FIG. 1 is a schematic diagram of a method for sending an associated image according to an embodiment of the present application;
FIG. 2 is a schematic diagram of displaying image tags according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating an example of a method for performing image association based on shooting time according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an image preview interface displaying a start control according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an image preview interface displaying an end control according to an embodiment of the present application;
FIG. 6 is a schematic diagram of region marking of an image according to an embodiment of the present application;
FIG. 7 is a schematic diagram of performing region image stitching according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a device for sending associated images according to an embodiment of the present application;
FIG. 9 is a first schematic block diagram of an electronic device according to an embodiment of the present application;
FIG. 10 is a second schematic block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that can be derived by a person of ordinary skill in the art from the embodiments given herein fall within the scope of protection of the present application.
The terms "first", "second", and the like in the specification and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular sequence or chronological order. It should be understood that the data so used may be interchanged where appropriate, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. The terms "first", "second", and the like are generally used in a generic sense and do not limit the number of objects; for example, a first object may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The following describes in detail a method for sending associated images according to an embodiment of the present application with reference to the accompanying drawings.
Referring to fig. 1, an embodiment of the present application provides a method for sending a related image, including the following steps:
and 101, responding to a first input, and associating N images, wherein N is an integer greater than or equal to 2.
The electronic device captures or stores a plurality of images, and associates N images in response to a first input performed by a user when the first input is received, where N is an integer greater than or equal to 2. That is, the association between the at least two images is established based on the first input of the user. The first input may be an input performed by a user on the image preview interface, or may be an input performed by the user on the image. The same image tags can be set for the N images forming the association relationship, so that the association relationship among the N images can be displayed based on the image tags, and a user can know the association among the images conveniently.
Step 102, receiving a second input on a target image among the N images.
After the N images are associated, a second input performed by the user on a target image may be received. The target image is one of the N images, for example the first image, the last image, or an image at a specific position among the N images; it may also be an image containing specific content. The second input performed by the user may be a selection input on the target image, or an input performed by the user on a partial region of the target image.
Step 103, in response to the second input, determining an image to be sent according to the target image and at least one image associated with the target image, and sending the image to be sent to a receiver.
When the second input performed by the user on the target image is received, the image to be sent may be determined, in response to the second input, according to the target image and the at least one image associated with the target image. That is, the image to be sent may be determined from the target image and at least one of the remaining (N-1) images. For example, the image to be sent may be determined from the target image together with all (N-1) images, or from the target image together with only one of the (N-1) images; other combinations are also possible.
When determining the image to be sent, at least part of the image content of the target image may serve as a basis for the determination, and for each image of the at least one image associated with the target image, at least part of its image content may likewise serve as a basis for the determination.
After the image to be sent is determined, it may be sent to a receiver (the receiver here refers to a receiving device), for example via a social application or via e-mail.
In the implementation of the application, the N images are associated, and when a second input on a target image among the N images is received, the image to be sent is determined from the target image and at least one image associated with the target image. Multiple images can therefore be selected quickly and simply to determine the image to be sent, based on a single input on the target image, and the receiver can view the images conveniently and understand them quickly based on the association between them.
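To make the flow of steps 101 to 103 concrete, the following Kotlin sketch models it in a simplified, hypothetical form; the class and function names are illustrative assumptions and not part of the patent.

```kotlin
// Minimal sketch of steps 101-103; all names are illustrative, not from the patent.
data class Image(val id: Long, val uri: String)

class AssociationStore {
    // Maps an image id to the ids of the images it is associated with.
    private val groups = mutableMapOf<Long, MutableSet<Long>>()

    // Step 101: associate N images (N >= 2) in response to a first input.
    fun associate(images: List<Image>) {
        require(images.size >= 2) { "N must be >= 2" }
        val ids = images.map { it.id }.toSet()
        for (img in images) {
            groups.getOrPut(img.id) { mutableSetOf() }.addAll(ids - img.id)
        }
    }

    // Steps 102-103: given the target image chosen by the second input, return
    // the target plus its associated images as the set to send.
    fun imagesToSend(target: Image, allImages: Map<Long, Image>): List<Image> {
        val associated = (groups[target.id] ?: emptySet()).mapNotNull { allImages[it] }
        return listOf(target) + associated
    }
}
```

Sending itself (for example through a social application or an e-mail share) would then operate on the returned list.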
Optionally, step 101 of associating the N images in response to the first input includes:
in response to the first input, image correlation is performed on the N images based on at least one of a photographing time and photographing contents.
In associating the N images in response to a first input performed by the user, the N images may be associated based on at least one parameter of the photographing time and the photographing contents.
When image association is performed based on shooting time, N images captured in the same shooting period may be associated; alternatively, N images whose shooting times are related may be associated, for example N images shot at the same time of day (for example, 8 a.m.) on different days. When image association is performed based on shooting content, N images whose shooting content is related may be associated (the shooting content may be at least partially identical, or the shooting content may have a binding relationship). When image association is performed based on both shooting time and shooting content, N images that are captured in the same shooting period and whose content is related may be associated; alternatively, N images whose content is related and whose shooting times are related may be associated.
In the implementation process, the image association of the N images is carried out based on at least one of the shooting time and the shooting content, so that the image association based on at least one parameter can be realized, and the image association mode is enriched.
The following describes a scheme for image association based on shooting time. Associating the N images in response to the first input includes:
responding to first input respectively executed on an image preview interface at a first moment and a second moment, and associating N images in a plurality of images collected in a target time period; the starting time of the target time interval is a first time, and the ending time of the target time interval is a second time.
When image association is performed based on shooting time, the shooting start time and the shooting end time may be determined in response to the first input of the user. Specifically: a first input performed by the user on the image preview interface at a first time is received, and the first time is determined as the shooting start time; a first input performed by the user on the image preview interface at a second time is received, and the second time is determined as the shooting end time. After the shooting start time and the shooting end time are determined, a target time period is determined from them, and N images among the multiple images captured in the target time period are associated.
The multiple images captured in the target time period may form an image set, and the N images are at least some of the images in the set. In the case where the N images are only some of the images in the set, the N images may be selected from the set according to a preset selection rule and then associated, for example by selecting N consecutive images, or by selecting N images at intervals. In the case where the N images are all of the images in the set, all images in the set may be associated.
The first input performed by the user on the image preview interface at the first time may be a first input on a start control, and correspondingly, the first input performed on the image preview interface at the second time may be a first input on an end control. The start control and the end control may be the same control or different controls; when they are the same control, the control may be switched from the start control to the end control after the first input is performed on it. The first input performed at the first time and at the second time may also be a specific gesture performed on the image preview interface, such as drawing a specific shape, letter, or number.
In this implementation, first inputs performed by the user on the image preview interface at the first time and at the second time are received, the target time period is determined in response to these inputs, and N images among the multiple images captured in the target time period are associated. The shooting period can thus be determined from the shooting-time parameter, and at least some of the images shot in that period can be associated.
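As a rough illustration of the time-window selection described above, the following sketch filters captured images by timestamp and shows two possible selection rules; the types and rules are assumptions used only for illustration.

```kotlin
import java.time.Instant

// Illustrative sketch: pick the images captured between the first-time and
// second-time inputs and feed them to the association step (names are assumptions).
data class CapturedImage(val id: Long, val capturedAt: Instant)

fun imagesInTargetPeriod(
    all: List<CapturedImage>,
    startTime: Instant,   // first time: the first input on the start control
    endTime: Instant      // second time: the first input on the end control
): List<CapturedImage> =
    all.filter { it.capturedAt >= startTime && it.capturedAt <= endTime }

// Example selection rules for picking N images from the resulting set.
fun selectConsecutive(set: List<CapturedImage>, n: Int): List<CapturedImage> = set.take(n)

fun selectAtIntervals(set: List<CapturedImage>, n: Int): List<CapturedImage> =
    if (set.isEmpty() || n <= 0) emptyList()
    else set.filterIndexed { i, _ -> i % maxOf(1, set.size / n) == 0 }.take(n)
```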
Optionally, in the case where a shooting period is determined from the shooting-time parameter and N images are selected from the multiple images corresponding to that period, determining, in response to the second input, the image to be sent according to the target image and at least one image associated with the target image includes: determining the N images as the images to be sent in response to the second input on the target image.
After N images are determined among the multiple images corresponding to the target time period and are associated, when a second input on a target image among the N images is received, the N images may be directly determined as the images to be sent in response to that input. In other words, by performing a second input on the target image (which may be any one of the N associated images), all N associated images are determined as the images to be sent. The images to be sent are thus determined quickly through a simple user operation, which improves the user experience.
After the image association is performed on the N images, the method further includes:
displaying the N images with the image tags added; or
displaying the N images in a multi-level stacked form, and controlling the N images to be displayed tiled in response to an expansion input.
After the N images are associated, image tags may be set for them. The image tags may be set by the user, or the electronic device may automatically recognize information of a certain image (for example, the first image), such as a person, a scene, or location information, and set the image tags according to the recognized information. With the image tags set, the N images can be displayed with their tags added, that is, each of the N images is displayed after its corresponding tag is added, so that the N images are displayed in the album differently from the other images. As shown in fig. 2, images 1 to 6 with the image tag (person A) added are displayed in the album list, and images 1 to 6 are a series of associated images. Images 1 to 6 may all be selected according to a selection input on any one of them.
By setting image tags for the determined N images and displaying the tagged images, the N images can be distinguished from the other images in the album list, and displaying the tags prevents the user from being confused when the images are selected automatically.
After the N images are associated, they may be displayed in the album list in a multi-level stacked form, so that the associated series of images is displayed as a stack, and the set image tag may be displayed (for example, the corresponding tag may be displayed on the first image of the N images, which represents the series). When an expansion input from the user is received, for example a tap on the first image of the stacked N images, the N images are controlled to be displayed tiled, so that the user can browse the N images and, if needed, filter among them.
In this implementation, setting image tags and displaying the N tagged images distinguishes the N images from the other images in the album list, and displaying the tags prevents the user from being confused by automatic image selection. Displaying the associated series of images as a stack allows more images to be shown; when the series needs to be browsed or selected, a specific input can be performed to tile the series for display, which enriches the display forms of the series.
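A minimal sketch of the tagged, stackable series display might look as follows; the class and field names are assumptions, not from the patent.

```kotlin
// Illustrative model of the tagged, stackable series display in the album list.
data class AlbumImage(val id: Long, val tag: String?)

data class SeriesEntry(
    val tag: String,
    val images: List<AlbumImage>,
    var expanded: Boolean = false   // false = multi-level stack, true = tiled
)

// Group tagged images into series entries; untagged images stay standalone.
fun buildAlbumEntries(images: List<AlbumImage>): Pair<List<SeriesEntry>, List<AlbumImage>> {
    val (tagged, untagged) = images.partition { it.tag != null }
    val series = tagged.groupBy { it.tag!! }.map { (tag, imgs) -> SeriesEntry(tag, imgs) }
    return series to untagged
}

// Expansion input (e.g. a tap on the stacked series): switch to tiled display.
fun onExpandInput(entry: SeriesEntry) {
    entry.expanded = true
}
```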
A scheme for determining a shooting period, associating N images based on a plurality of images in the shooting period, and determining an image to be transmitted based on the associated N images is described below with reference to fig. 3, which includes the following steps:
step 301, receiving a first input executed by a user on a start control in an image preview interface at a first time, and determining the first time as a shooting start time.
For example, referring to fig. 4, the image preview interface displays a start control, and by receiving a first input (e.g., a click input) to the start control from a user, a first time when the first input is received is determined as the shooting start time. The first specific action performed by the user on the image preview interface, such as drawing the letter S, may also be received, and the first time when the first specific action is received is determined as the shooting start time.
Step 302, collecting images according to the continuous shooting action of the user, and determining the second moment as the shooting termination moment when receiving the first input executed by the user on the termination control in the image preview interface at the second moment.
For example, referring to fig. 5, the image preview interface displays an end control, and a second time when the first input is received is determined as the shooting termination time by receiving a first input (e.g., a click input) to the end control from a user. A second specific action performed by the user on the image preview interface, such as drawing the letter E, may also be received, and a second time when the second specific action is received is determined as the shooting termination time.
Step 303, associating the plurality of images collected in the target time period from the shooting start time to the shooting end time, and setting the same image tags for the plurality of images.
And step 304, displaying a plurality of images which are acquired in the target time period and added with the image labels.
By setting the image tags and displaying the plurality of images added with the image tags, relevance display of the plurality of related images through the image tags can be realized.
Step 305, under the condition that one image among the plurality of images is selected, automatically selecting the other images among the plurality of images, and determining the selected plurality of images as the images to be sent.
For example, the plurality of images is tiled in an image list, and when the first of them is selected, all of them are selected, so that the entire shooting series (the images captured in the target time period) is selected. This saves the user from ticking the images one by one and improves the selection experience, and the image tag prevents the user from being confused by the automatic ticking of images.
Through the above process, the target time period is determined by identifying the shooting start time and the shooting end time, and the images shot in the target time period are associated, so that when the user selects images to send to the receiver, they can be selected quickly through a simple operation, which improves the user experience.
In the above implementation of image association based on shooting time, the target time period is determined through the first input of the user, and N images among the multiple images captured in the target time period are associated, so that image association based on the shooting-time parameter is achieved. Determining the N associated images as the images to be sent allows the images to be sent to be determined quickly through a simple user operation, improving the user experience. Setting the image tag prevents the user from being confused by automatic image selection. Displaying the associated series of images as a stack, and tiling the series in response to a specific user input, enriches the display forms of the series.
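The select-one-select-all behavior of step 305 could be sketched as follows, assuming an illustrative album item type (names are not from the patent).

```kotlin
// Illustrative sketch of step 305: selecting any image of a tagged series
// selects the whole series.
data class AlbumItem(val id: Long, val seriesTag: String?)

fun selectSeries(selected: AlbumItem, album: List<AlbumItem>): List<AlbumItem> =
    if (selected.seriesTag == null) listOf(selected)
    else album.filter { it.seriesTag == selected.seriesTag }
```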
The following describes a scheme for performing image association based on shooting content. In this case, associating the N images in response to the first input includes the following steps:
setting M mark regions for a first image in response to a first input on the first image;
for each mark region, when a third input on the mark region is received and K second images are obtained by shooting the target object corresponding to the mark region, associating at least some of the K second images with the corresponding mark region;
where M and K are integers greater than or equal to 1, each mark region corresponds to K second images, the N images include the first image and at least some of the K second images corresponding to each mark region, the value of K may be the same or different for different mark regions, and the first image is the first of the N images.
When associating the N images based on shooting content, after a first input on the first image is received, M mark regions may be set for the first image in response to that input, where M is an integer greater than or equal to 1; that is, at least one mark region can be set for the first image. After the mark regions are set, for each of them, when a third input on the current mark region is received and K (K being an integer greater than or equal to 1) second images are obtained by shooting the target object corresponding to that region, at least some of the K second images are associated with the current mark region in response to the third input, establishing the association between the mark region and the second images. Associating only some of the K second images with the current mark region allows suitable images to be chosen from the K second images, while associating all of the K second images with the current mark region associates every captured image with it.
The first image corresponds to at least one mark region, and each mark region corresponds to K second images. When the first image corresponds to two or more mark regions, each of them may correspond to K second images, and the value of K may be the same or different for different mark regions. The value of K may be preset by the electronic device, so that the user shoots the preset number of images, or the user may shoot as many second images as needed. When the values of K are the same, different mark regions correspond to the same number of second images.
When image association is performed for each mark region, the current mark region is associated with at least some of its K corresponding second images, so that the associated N images include the first image and at least some of the K second images corresponding to each mark region.
The first image is the first of the N images, and the association of the series of images is achieved by associating the images shot before and after it. Because the mark region of the first image is associated with the second images, when the first image is displayed and an input satisfying a preset characteristic is received on a mark region, the display jumps to the corresponding second image, so that the associated images can be viewed continuously.
In the above implementation, associating each mark region of the first image with at least some of its K corresponding second images establishes the association between a region of the first image and the second images. Associating a mark region with only some of its K second images allows a suitable second image to be chosen as the associated image, and associating it with all K second images makes every captured image an associated image.
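A minimal data-structure sketch of this region-based association is given below; all names are assumptions used for illustration.

```kotlin
// Illustrative sketch: a first image with M mark regions, each mapped to the
// K second images shot for it.
data class Photo(val id: Long, val uri: String)
data class Bounds(val left: Int, val top: Int, val right: Int, val bottom: Int)
data class MarkRegion(val bounds: Bounds, val secondImages: MutableList<Photo> = mutableListOf())

class FirstImageAssociation(val firstImage: Photo) {
    val markRegions = mutableListOf<MarkRegion>()

    // First input: set a mark region on the first image.
    fun addMarkRegion(bounds: Bounds): MarkRegion =
        MarkRegion(bounds).also { markRegions.add(it) }

    // Third input followed by shooting: associate the shot second images
    // (or a chosen subset of them) with the tapped mark region.
    fun associate(region: MarkRegion, shot: List<Photo>) {
        region.secondImages.addAll(shot)
    }

    // Viewing: an input on a mark region jumps to its associated second images.
    fun imagesFor(region: MarkRegion): List<Photo> = region.secondImages
}
```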
For the case where images shot before and after are associated based on shooting content, when the target image is the first image, determining, in response to the second input, the image to be sent according to the target image and at least one image associated with the target image includes one of the following schemes:
in response to the second input, determining the first image and at least some of the K second images corresponding to each of at least one mark region as the images to be sent;
in response to the second input, for each of the at least one mark region, stitching the mark region with at least some of its corresponding K second images to determine the image to be sent;
in response to the second input, for each of the at least one mark region, generating a first marker image from the mark region, and determining the images to be sent from the at least one first marker image and at least some of the K second images corresponding to each first marker image.
When determining an image to be transmitted according to a target image and at least one image associated with the target image, one of the following three schemes may be included for a case where the target image is a first image.
When determining the image to be sent, in response to the second input on the target image (the first image), at least some of the K second images corresponding to each of at least one of the M mark regions (namely the second images associated with the corresponding mark region of the first image) may be determined, together with the first image, as the images to be sent. Specifically: for each of the M mark regions, the second images associated with the current mark region among its K corresponding second images are obtained, and the obtained second images and the first image are determined as the images to be sent; or, for only some of the M mark regions, the associated second images of each such region are obtained, and the obtained second images and the first image are determined as the images to be sent. In this case the image to be sent is determined directly from the first image and the associated second images.
Alternatively, in response to the second input on the target image (the first image), for each of at least one of the M mark regions, the mark region may be stitched with at least some of its K corresponding second images (the second images associated with that mark region) to determine the image to be sent. Specifically: for each of the M mark regions, the current mark region is stitched with its associated second images, and the resulting stitched images are determined as the images to be sent, for example obtaining M stitched images and determining the M stitched images as the images to be sent; or, stitching is performed only for some of the M mark regions, and the resulting stitched images are determined as the images to be sent. In this case the image to be sent is determined based on image stitching.
Alternatively, in response to the second input on the target image (the first image), for each of at least one of the M mark regions, a first marker image may be generated from the mark region, yielding at least one first marker image, i.e. a corresponding first marker image for at least one of the M mark regions. The images to be sent are then determined from the at least one first marker image and, for each first marker image, the associated second images among its K corresponding second images. In this case a marker image is generated from the mark region, and the image to be sent is determined from the marker image and the associated second images.
The following example illustrates determining the image to be sent based on image stitching, and based on a marker image and the associated second image. Suppose the first image has mark region 1 and mark region 2, mark region 1 corresponds to second image 1, and mark region 2 corresponds to second image 2. To determine the image to be sent based on stitching, mark region 1 may be stitched with second image 1 and mark region 2 with second image 2, and the stitched results determined as the images to be sent. To determine the image to be sent based on the marker image and the associated second image, first marker image 1 may be generated from mark region 1 and first marker image 2 from mark region 2, and first marker image 1, first marker image 2, second image 1, and second image 2 determined as the images to be sent.
In this implementation, when the target image is the first image, the associated images may be determined directly as the images to be sent from the first image and the second images, or the image to be sent may be determined based on image stitching, or based on the marker images and the second images.
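The three schemes for the first-image case can be sketched as follows; the types and the crop/stitch helpers are hypothetical placeholders, not a real image-processing API.

```kotlin
// Illustrative, self-contained sketch of the three schemes when the target
// image is the first image.
data class Img(val id: Long)
data class Region(val bounds: List<Int>, val secondImages: List<Img>)

fun cropRegion(first: Img, r: Region): Img = first        // placeholder: crop the marked region
fun stitch(parts: List<Img>): Img = parts.first()          // placeholder: stitch images together

// Scheme 1: the first image plus every associated second image.
fun schemeAssociated(first: Img, regions: List<Region>): List<Img> =
    listOf(first) + regions.flatMap { it.secondImages }

// Scheme 2: one stitched image per mark region (region crop + its second images).
fun schemeStitched(first: Img, regions: List<Region>): List<Img> =
    regions.map { stitch(listOf(cropRegion(first, it)) + it.secondImages) }

// Scheme 3: a marker image generated from each region, plus its second images.
fun schemeMarkerImages(first: Img, regions: List<Region>): List<Img> =
    regions.flatMap { listOf(cropRegion(first, it)) + it.secondImages }
```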
Optionally, in the case where the target image is a second image, determining, in response to the second input, the image to be sent according to the target image and at least one image associated with the target image includes one of the following schemes:
in response to the second input, determining the target image and the first image as the images to be sent;
in response to the second input, stitching the target image with the target mark region of the first image to determine the image to be sent, the target mark region being associated with the target image;
in response to the second input, determining the target image and a second marker image as the images to be sent, the second marker image being generated from the target mark region of the first image, the target mark region being associated with the target image.
When determining an image to be transmitted according to a target image and at least one image associated with the target image, one of the following three schemes may be included for a case where the target image is a second image.
When determining the image to be transmitted, the target image (second image) and the first image may be determined as the image to be transmitted in response to a second input to the target image (second image), and at this time, the image to be transmitted may be determined according to the second image and the first image, so as to directly determine the image to be transmitted based on the photographed associated image.
When determining the image to be sent, in response to a second input to the target image (a second image), a target mark region associated with the target image may be determined in the first image, the target image and the target mark region in the first image are subjected to image splicing, and the spliced image is determined as the image to be sent. At this time, the image to be transmitted may be determined based on image stitching.
When determining an image to be transmitted, a target marker region associated with a target image (second image) may be determined in a first image in response to a second input to the target image, a second marker image may be generated based on the target marker region of the first image, and the target image and the generated second marker image may be determined as the image to be transmitted. At this time, a marker image may be generated based on the target marker region, and an image to be transmitted may be determined based on the marker image and the target image.
In the implementation process, when the target image is the second image, the associated image can be directly determined as the image to be transmitted according to the first image and the second image, the image to be transmitted is determined based on image stitching, or the image to be transmitted is determined based on the marker image and the second image.
The following specific example describes image association based on shooting content. After the first image is captured and displayed, the display stays on the page showing the first image, a first input from the user on the first image is received, and in response to that input M mark regions are set for the first image, for example by marking one or more areas (a mark may be made by circling an area). When a third input on a mark region is received and the target object corresponding to that region is photographed to obtain a second image, the captured second image may be associated with the corresponding mark region, thereby associating the second image with the first image. As shown in fig. 6, the two persons in the first image (person A and person B) are each marked, setting two mark regions; after a third input on the mark region corresponding to person A is received and person A is photographed, the resulting second image is associated with the first image, and after a third input on the mark region corresponding to person B is received and person B is photographed, that second image is likewise associated with the first image. With this association in place, when the user views the associated images, operating on the first image displays the corresponding associated image, making it convenient to view the associated images. For example, if a user wants to photograph a commodity and its price for someone else, after shooting one image (a commodity image that includes the price), the user may circle the price on the image and then tap the circled position; after the price is photographed, the two images are automatically associated.
That is, when the user wants to associate a captured first image with other images, the user only needs to mark at least one region on the first image after capturing it. For each marked region, after a third input on that region is received and the target object corresponding to the region is photographed, the captured image is automatically associated with the corresponding mark region of the first image, thereby associating the images.
When the user wants to send the captured images to a receiver, selecting one image in the associated series (for example, the first image) automatically selects the whole series for sending, and after the images are sent, the receiver can also touch a mark region of the first image to view the associated images.
In the above implementation of image association based on shooting content, M mark regions are set for the first image, and when a third input on a mark region is received and a second image is obtained by shooting the target object corresponding to that region, the first image and the second image are associated. Image association can thus be based on image content, so that while viewing the first image the user can jump to another associated image through an input on the current image, avoiding the need to repeatedly switch back and forth to recall how the images relate. Determining the first image and the second images as the images to be sent, determining the image to be sent based on image stitching, or determining it based on a marker image and the second images enriches the ways in which the image to be sent can be determined.
Alternatively, the following scheme may also be used for image association based on shooting content. Associating the N images in response to the first input includes:
in response to a first input on a third image, segmenting the third image and determining P first regions;
for each first region, in the case that a fourth input is received on the first region and on a second region of at least one fourth image, associating the first region with the corresponding second region;
where there are T fourth images, each fourth image includes R second regions, P, T, and R are integers greater than or equal to 1, the value of R may be the same or different for different fourth images, and the N images include the third image and the fourth images.
When associating the N images based on shooting content, after a first input on the third image is received, the third image may be segmented in response to that input to determine P first regions, where P is an integer greater than or equal to 1; that is, segmenting the third image yields at least one first region. After the P first regions are determined, for each first region, when a fourth input performed on the current first region and on a second region of at least one fourth image is received, the current first region is associated with the corresponding second region (the second region receiving the fourth input), establishing an association between regions of different images.
There are T fourth images, T being an integer greater than or equal to 1. Each first region may be associated with a second region of at least one of the T fourth images, each fourth image may include R second regions, R being an integer greater than or equal to 1, and the value of R may be the same or different for different fourth images. When the values of R are the same, the T fourth images have the same number of second regions. For each first region, the association with the fourth images may be made with at least one second region of each of at least one fourth image. The associated N images include the third image and at least some of the T fourth images.
This process associates different regions of images through their content, which makes it convenient for the user to crop the associated regions of different images while the image association is preserved. For example, a first input (such as a long press or a double tap) on the third image in the image list automatically segments the third image into at least one first region; dragging a first region onto a second region of a fourth image then associates the current first region of the third image with the current second region of the fourth image, achieving both region association and image association.
In this implementation, the third image is segmented to determine at least one first region, and each first region is associated with a second region of a fourth image, so that the third image and the fourth image are associated through the association between image regions.
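The drag-to-associate behavior could be sketched with an illustrative association table; all names below are assumptions, not from the patent.

```kotlin
// Illustrative sketch of associating a first region of the third image with a
// second region of a fourth image.
data class Area(val imageId: Long, val left: Int, val top: Int, val right: Int, val bottom: Int)

class RegionAssociations {
    // Each entry links a first region of the third image to a second region of a fourth image.
    private val links = mutableListOf<Pair<Area, Area>>()

    // Fourth input, e.g. a drag of the first region onto the second region.
    fun associate(firstRegion: Area, secondRegion: Area) {
        links.add(firstRegion to secondRegion)
    }

    // The second regions associated with a given first region.
    fun associatedWith(firstRegion: Area): List<Area> =
        links.filter { it.first == firstRegion }.map { it.second }
}
```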
For the case of image association based on shooting content, when the target image is the third image, determining, in response to the second input, the image to be sent according to the target image and at least one image associated with the target image includes one of the following schemes:
in response to the second input, determining the third image and at least some of the at least one fourth image corresponding to each of at least one first region as the images to be sent;
in response to the second input, for each of the at least one first region, stitching the first region with at least some of its associated second regions to determine the image to be sent;
in response to the second input, for each of the at least one first region, generating a first region image from the first region, and determining the images to be sent from the at least one first region image and at least some of the at least one fourth image corresponding to each first region image.
When determining the image to be transmitted according to the target image and the at least one image associated with the target image, one of the following three schemes may be included for the case where the target image is the third image.
When determining the image to be sent, in response to the second input on the target image (the third image), for each of at least one of the P first regions corresponding to the third image, at least some of the at least one fourth image corresponding to the current first region may be obtained, and the images to be sent are then determined from the third image and the obtained fourth images for each first region. In this case the image to be sent is determined directly from the associated images.
Alternatively, in response to the second input on the target image (the third image), for each of at least one of the P first regions, the current first region may be stitched with at least some of its associated second regions (which may belong to at least one fourth image), and the image to be sent is determined by performing this stitching for each of the at least one first region. In this case the image to be sent is determined based on region association and region stitching.
Alternatively, in response to the second input on the target image (the third image), for each of at least one of the P first regions, a first region image may be generated from the first region, yielding at least one first region image, and the images to be sent are then determined from the at least one first region image and, for each first region image, at least some of its corresponding fourth images. In this case the image to be sent is determined based on the region images and the fourth images.
For example, when the user does not want to send the entire associated images to the receiver, the user can crop a first region of the third image; the associated region of the associated image is then cropped automatically, and the crops are stitched to generate the image to be sent. As shown in fig. 7, image A (upper left) and image B (upper right) are associated images in which the girl and the boy are associated regions; when the girl is cropped from image A, the boy is automatically cropped from the associated image B, and the crops are stitched to generate image C (the image below images A and B).
In this implementation, when the target image is the third image, the associated images may be determined directly as the images to be sent from the third image and the fourth images, or the image to be sent may be determined based on image stitching, or based on the region images and the fourth images. Multiple schemes for determining the image to be sent are thus provided, enriching the ways in which it can be determined and allowing the user to choose a suitable one.
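The fig. 7 crop-and-stitch step could be implemented roughly as follows on an Android device; the Bitmap, Canvas, and Rect calls are standard Android SDK APIs, while the overall function and its side-by-side layout are an illustrative assumption.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Rect

// Crop the first region from the third image and the associated second region
// from the fourth image, then stitch the two crops side by side.
fun cropAndStitch(third: Bitmap, firstRegion: Rect, fourth: Bitmap, secondRegion: Rect): Bitmap {
    val cropA = Bitmap.createBitmap(third, firstRegion.left, firstRegion.top,
        firstRegion.width(), firstRegion.height())
    val cropB = Bitmap.createBitmap(fourth, secondRegion.left, secondRegion.top,
        secondRegion.width(), secondRegion.height())
    val out = Bitmap.createBitmap(cropA.width + cropB.width,
        maxOf(cropA.height, cropB.height), Bitmap.Config.ARGB_8888)
    val canvas = Canvas(out)
    canvas.drawBitmap(cropA, 0f, 0f, null)
    canvas.drawBitmap(cropB, cropA.width.toFloat(), 0f, null)
    return out
}
```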
For the case of image association based on shooting content, when the target image is a fourth image, determining, in response to the second input, the image to be sent according to the target image and at least one image associated with the target image includes one of the following schemes:
in response to the second input, determining the target image and the third image as the images to be sent;
in response to the second input, stitching a second target region of the target image with a first target region of the third image to determine the image to be sent, the second target region being associated with the first target region;
in response to the second input, determining the target image and a second region image as the images to be sent, the second region image being generated from a first target region of the third image, the first target region being associated with the target image.
When determining the image to be transmitted according to the target image and the at least one image associated with the target image, one of the following three schemes may be included for the case where the target image is the fourth image.
When determining the image to be transmitted, the target image and the third image may be determined as the image to be transmitted in response to the second input to the target image (fourth image), and at this time, it may be implemented to directly determine the image to be transmitted based on the associated image.
When determining the image to be sent, a second target region may be determined in the target image in response to the second input on the target image (the fourth image), the second target region being a region of the target image associated with a first target region in the third image (there may be at least one first target region). After the second target region is determined, it is stitched with the first target region of the third image to determine the image to be sent. In this case the image to be sent is determined based on region association and region stitching.
Alternatively, in response to the second input on the target image (the fourth image), at least one first target region associated with the target image may be determined in the third image, at least one second region image is generated from the first target region(s), and the second region image(s) and the target image are determined as the images to be sent. In this case the image to be sent is determined based on the region images and the fourth image.
In this implementation, when the target image is the fourth image, the associated images may be determined directly as the images to be sent from the third image and the fourth image, or the image to be sent may be determined based on region association and region stitching, or based on the region images and the fourth image. Multiple schemes for determining the image to be sent are thus provided, enriching the ways in which it can be determined and allowing the user to choose a suitable one.
In the implementation process of performing image association based on the shot content, the P first areas are obtained by dividing the third image, and for each first area, the first area is associated with the second area of the fourth image, so that the area association and the image association can be realized based on the image content, and a user can conveniently and quickly capture the associated area image. The third image and the fourth image are determined as the image to be sent, the image to be sent is determined based on the region association and the region splicing, or the image to be sent is determined based on the region image and the fourth image, so that the determination mode of the image to be sent is enriched.
In the overall implementation of the above method for sending an associated image, the N images are associated, and when a second input to a target image among the N images is received, the image to be sent is determined according to the target image and at least one image associated with it. Multiple images can therefore be selected quickly and simply, based on the input to the target image, to determine the image to be sent, and the receiver can conveniently view the images and quickly understand them based on the association between them.
Furthermore, by associating the N images based on at least one of shooting time and shooting content, image association based on at least one parameter is realized and the association modes are enriched; determining the N associated images as the image to be sent lets the user determine the image to be sent with a simple operation, improving the user experience; setting an image label avoids the confusion that automatic image selection might otherwise cause the user; and by displaying the associated series images in a superimposed manner and controlling the tiled display of the series images based on a specific input performed by the user, the display forms of the series images are enriched.
Image association based on shooting time lets the user view the associated images according to time information, and image association based on shooting content makes it convenient to view the detailed content of the associated images, avoiding the drawback of repeatedly switching back and forth to compare associated images when viewing them. Region-to-image and region-to-region association realized from the shooting content in turn realizes image-to-image association, so that different forms of association based on image content are possible and the association forms are enriched. Determining the associated image as the image to be sent, determining the image to be sent based on stitching, or determining it based on the region image and the fourth image (or the marker image and the second image) enriches the ways in which the image to be sent can be determined.
The method for sending an associated image provided in the embodiments of the present application may be executed by a device for sending an associated image, or by a control module, in the device for sending an associated image, for executing the method. In the embodiments of the present application, the device for sending an associated image executing the method is taken as an example for description.
Fig. 8 is a schematic block diagram of an apparatus for sending an associated image according to an embodiment of the present application. Referring to Fig. 8, the apparatus comprises:
an association module 801, configured to, in response to a first input, associate N images, where N is an integer greater than or equal to 2;
a receiving module 802, configured to receive a second input of a target image in the N images;
a processing module 803, configured to, in response to the second input, determine an image to be sent according to the target image and at least one image associated with the target image, and send the image to be sent to a receiver.
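A minimal sketch of how the three modules could cooperate is given below. The class and method names and the use of string identifiers for images are illustrative assumptions, not taken from this application, and the send callback stands in for the transmission to the receiver.

```python
from typing import Callable, List


class AssociationModule:
    """Corresponds to association module 801: groups N (>= 2) images on a first input."""

    def __init__(self) -> None:
        self.groups: List[List[str]] = []   # each group holds the identifiers of associated images

    def on_first_input(self, images: List[str]) -> None:
        if len(images) >= 2:
            self.groups.append(list(images))


class ProcessingModule:
    """Corresponds to processing module 803: builds the image set to send and sends it."""

    def __init__(self, association: AssociationModule, send: Callable[[List[str]], None]) -> None:
        self.association = association
        self.send = send

    def handle_second_input(self, target_image: str) -> None:
        for group in self.association.groups:
            if target_image in group:
                to_send = [target_image] + [img for img in group if img != target_image]
                self.send(to_send)
                return


class ReceivingModule:
    """Corresponds to receiving module 802: forwards the second input on a target image."""

    def __init__(self, processing: ProcessingModule) -> None:
        self.processing = processing

    def on_second_input(self, target_image: str) -> None:
        self.processing.handle_second_input(target_image)


# Usage sketch: associate two images, then send them on a second input to one of them.
association = AssociationModule()
association.on_first_input(["IMG_0001.jpg", "IMG_0002.jpg"])
receiving = ReceivingModule(ProcessingModule(association, send=print))
receiving.on_second_input("IMG_0001.jpg")   # prints ['IMG_0001.jpg', 'IMG_0002.jpg']
```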
Optionally, the association module is further configured to:
associate, in response to the first input performed on an image preview interface at a first time and a second time respectively, the N images among the multiple images captured within a target time period;
wherein the start time of the target time period is the first time, and the end time of the target time period is the second time.
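For the time-based association, selecting the N images amounts to filtering a capture log by the target time period. The sketch below is a hypothetical illustration; the log structure, the file names, and the closed-interval comparison are assumptions.

```python
from datetime import datetime
from typing import Dict, List


def associate_by_time(capture_log: Dict[str, datetime],
                      first_time: datetime, second_time: datetime) -> List[str]:
    """Return, in shooting order, the images captured within [first_time, second_time]."""
    return [name
            for name, taken_at in sorted(capture_log.items(), key=lambda item: item[1])
            if first_time <= taken_at <= second_time]


log = {
    "IMG_0101.jpg": datetime(2021, 9, 29, 10, 0, 5),
    "IMG_0102.jpg": datetime(2021, 9, 29, 10, 0, 40),
    "IMG_0103.jpg": datetime(2021, 9, 29, 10, 3, 12),
}
# First input at 10:00:00 and again at 10:01:00 delimit the target time period.
n_images = associate_by_time(log, datetime(2021, 9, 29, 10, 0, 0), datetime(2021, 9, 29, 10, 1, 0))
# n_images == ["IMG_0101.jpg", "IMG_0102.jpg"]
```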
Optionally, the association module includes:
a setting sub-module, configured to set M mark regions for a first image in response to a first input to the first image;
a first association sub-module, configured to associate, for each mark region, at least partial images of K second images with the corresponding mark region in a case that a third input to the mark region is received and the K second images are obtained by shooting a target object corresponding to the mark region;
M, K are integers greater than or equal to 1, each mark region corresponds to K second images, the N images include the first image and at least partial images of the K second images corresponding to each mark region, the values of K are the same or different for different mark regions, and the first image is the first of the N images.
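One way to record the mark-region association is sketched below; the data-class fields, the region format, and the file names are assumptions made for illustration, not the implementation of this application.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # (left, top, right, bottom); assumed mark-region format


@dataclass
class MarkRegion:
    region: Box                                              # one of the M mark regions
    second_images: List[str] = field(default_factory=list)   # the K associated second images


@dataclass
class FirstImageRecord:
    first_image: str
    mark_regions: List[MarkRegion] = field(default_factory=list)

    def on_third_input(self, region_index: int, shot_images: List[str]) -> None:
        """Associate the images shot for a mark region's target object with that region."""
        self.mark_regions[region_index].second_images.extend(shot_images)


record = FirstImageRecord("overview.jpg",
                          [MarkRegion((10, 10, 120, 90)), MarkRegion((200, 40, 320, 150))])
record.on_third_input(0, ["detail_a_1.jpg", "detail_a_2.jpg"])   # K = 2 for the first mark region
```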
Optionally, in a case that the target image is the first image, the processing module includes one of the following sub-modules:
a first determining sub-module, configured to determine, in response to the second input, the first image and at least partial images of the K second images respectively corresponding to at least one mark region as the image to be sent;
a first processing sub-module, configured to, in response to the second input, perform, for each of at least one mark region, image stitching on the mark region and at least partial images of the corresponding K second images to determine the image to be sent;
and a second processing sub-module, configured to, in response to the second input, generate, for each of at least one mark region, a first marker image according to the mark region, and determine the image to be sent according to the at least one first marker image and at least partial images of the K second images corresponding to each first marker image.
Optionally, in a case that the target image is the second image, the processing module includes one of the following sub-modules:
a second determining submodule, configured to determine, in response to the second input, the target image and the first image as the image to be transmitted;
a third processing sub-module, configured to perform image stitching on the target image and a target mark region of the first image in response to the second input to determine the image to be sent, where the target mark region is associated with the target image;
a third determining sub-module, configured to determine, in response to the second input, the target image and a second marker image as the image to be sent, where the second marker image is generated based on a target marker region of the first image, and the target marker region is associated with the target image.
Optionally, the association module includes:
a segmentation determination sub-module for segmenting the third image in response to a first input to the third image, determining P first regions;
a second association submodule, configured to associate, for each first region, the first region with a corresponding second region of at least one fourth image if a fourth input for the first region and the second region is received;
the number of the fourth images is T, each fourth image includes R second regions, P, T, R are integers greater than or equal to 1, the values of R are the same or different for different fourth images, and the N images include the third image and the fourth images.
Optionally, in a case that the target image is the third image, the processing module includes one of the following sub-modules:
a fourth determining sub-module, configured to determine, in response to the second input, the third image and at least partial images of the at least one fourth image respectively corresponding to at least one first region as the image to be sent;
a fourth processing sub-module, configured to, in response to the second input, perform, for each of at least one first region, image stitching on the first region and at least part of the associated second regions to determine the image to be sent;
and a fifth processing sub-module, configured to, in response to the second input, generate, for each of at least one first region, a first region image according to the first region, and determine the image to be sent according to the at least one first region image and at least partial images of the at least one fourth image corresponding to each first region image.
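The three schemes for the third-image case can likewise be sketched. The fragment below is a hedged illustration that assumes Pillow, a (left, top, right, bottom) region format, and a per-region association map; the names Associations and third_image_schemes are invented and do not come from this application.

```python
from typing import Dict, List, Tuple

from PIL import Image

Box = Tuple[int, int, int, int]
# Each first region of the third image maps to its associated fourth images and second regions.
Associations = Dict[Box, List[Tuple[Image.Image, Box]]]


def third_image_schemes(third_img: Image.Image, assoc: Associations, scheme: str) -> List[Image.Image]:
    if scheme == "associated":      # the third image plus the associated fourth images
        return [third_img] + [fourth for links in assoc.values() for fourth, _ in links]
    if scheme == "stitched":        # stitch each first region with its associated second regions
        out: List[Image.Image] = []
        for first_region, links in assoc.items():
            patch = third_img.crop(first_region)
            for fourth, second_region in links:
                piece = fourth.crop(second_region)
                canvas = Image.new("RGB", (patch.width + piece.width, max(patch.height, piece.height)))
                canvas.paste(patch, (0, 0))
                canvas.paste(piece, (patch.width, 0))
                out.append(canvas)
        return out
    if scheme == "region_image":    # first region images plus the associated fourth images
        return [third_img.crop(first_region) for first_region in assoc] + \
               [fourth for links in assoc.values() for fourth, _ in links]
    raise ValueError(f"unknown scheme: {scheme}")
```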
Optionally, in a case that the target image is the fourth image, the processing module includes one of the following sub-modules:
a fifth determining submodule, configured to determine, in response to the second input, the target image and the third image as the image to be transmitted;
a sixth processing sub-module, configured to perform image stitching on a second target region of the target image and a first target region of the third image in response to the second input to determine the image to be sent, where the second target region is associated with the first target region;
a sixth determining sub-module, configured to determine, in response to the second input, the target image and a second region image as the image to be sent, where the second region image is generated based on a first target region of the third image, and the first target region is associated with the target image.
The device for sending an associated image provided in the embodiments of the present application associates the N images and, when a second input to a target image among the N images is received, determines the image to be sent according to the target image and at least one image associated with it. Multiple images can therefore be selected quickly and simply, based on the input to the target image, to determine the image to be sent, and the receiver can conveniently view the images and quickly understand them based on the association between them.
As with the method embodiment, association based on shooting time lets the user view the associated images according to time information, association based on shooting content makes it convenient to view their detailed content without repeatedly switching between images for comparison, region-to-image and region-to-region association realized from the image content enriches the association forms, and determining the associated image, a stitched image, or the region image and the fourth image (or the marker image and the second image) as the image to be sent enriches the ways in which the image to be sent can be determined.
The device for sending an associated image in the embodiments of the present application may be a standalone device, or a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), an automated teller machine, or a self-service machine, which is not specifically limited in the embodiments of the present application.
The device for sending an associated image in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The device for sending an associated image provided in the embodiments of the present application can implement each process implemented in the embodiment of the method for sending an associated image shown in Fig. 1; to avoid repetition, details are not described here again.
Optionally, as shown in Fig. 9, an embodiment of the present application further provides an electronic device 900, which includes a processor 901, a memory 902, and a program or instructions stored in the memory 902 and executable on the processor 901. When the program or instructions are executed by the processor 901, each process of the foregoing embodiment of the method for sending an associated image is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power supply (e.g., a battery) for supplying power to the various components, and the power supply may be logically connected to the processor 1010 through a power management system, so that charging, discharging, and power-consumption management are implemented through the power management system. The electronic device structure shown in Fig. 10 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine some components, or arrange the components differently, which is not described in detail here.
The processor 1010 is configured to: in response to a first input, associate N images, N being an integer greater than or equal to 2. The user input unit 1007 is configured to: receive a second input of a target image in the N images. The processor 1010 is further configured to: in response to the second input, determine an image to be sent according to the target image and at least one image associated with the target image, and control the radio frequency unit 1001 to send the image to be sent to a receiver.
In this way, the N images are associated, and when a second input to a target image among the N images is received, the image to be sent is determined according to the target image and at least one image associated with it, so that multiple images can be selected quickly and simply, based on the input to the target image, to determine the image to be sent, and the receiver can conveniently view the images and quickly understand them based on the association between them. Of course, the processor 1010 and the other components may also perform the other steps of the method for sending an associated image in the embodiments of the present application, which are not described here again.
It should be understood that, in the embodiments of the present application, the input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042; the graphics processing unit 10041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 1006 may include a display panel 10061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1007 includes a touch panel 10071, also referred to as a touch screen, and other input devices 10072. The touch panel 10071 may include two parts: a touch detection device and a touch controller. The other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1009 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 1010 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may alternatively not be integrated into the processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the foregoing embodiment of a method for sending an associated image, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above-mentioned method for sending an associated image, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (9)

1. A method for sending an associated image, comprising:
in response to a first input, correlating N images, N being an integer greater than or equal to 2;
receiving a second input of a target image in the N images;
in response to the second input, determining an image to be sent according to the target image and at least one image associated with the target image, and sending the image to be sent to a receiver;
wherein said associating N images in response to a first input comprises:
setting M mark regions for a first image in response to a first input to the first image;
for each mark region, in a case that a third input to the mark region is received and K second images are obtained by shooting a target object corresponding to the mark region, associating at least partial images of the K second images with the corresponding mark region;
M, K are integers greater than or equal to 1, each mark region corresponds to K second images, the N images comprise the first image and at least partial images of the K second images corresponding to each mark region, the values of K are the same or different for different mark regions, and the first image is the first of the N images;
or,
the associating N images in response to the first input includes:
in response to a first input to a third image, segmenting the third image, and determining P first regions;
for each of the first regions, in the event that a fourth input is received for the first region and a second region of at least one fourth image, associating the first region with the corresponding second region;
the number of the fourth images is T, each fourth image comprises R second regions, P, T, R are integers greater than or equal to 1, the values of R are the same or different for different fourth images, and the N images comprise the third image and the fourth images.
2. The method for sending the associated image according to claim 1, wherein the associating N images in response to the first input comprises:
in response to the first input performed on an image preview interface at a first time and a second time respectively, associating the N images among the multiple images captured within a target time period;
wherein the start time of the target time period is the first time, and the end time of the target time period is the second time.
3. The method for sending the associated image according to claim 1, wherein in a case that the target image is the first image, determining an image to be sent according to the target image and at least one image associated with the target image in response to the second input comprises one of the following schemes:
determining, in response to the second input, the first image and at least partial images of the K second images respectively corresponding to at least one mark region as the image to be sent;
in response to the second input, for each of at least one mark region, performing image stitching on the mark region and at least partial images of the corresponding K second images to determine the image to be sent;
and in response to the second input, for each of at least one mark region, generating a first marker image according to the mark region, and determining the image to be sent according to the at least one first marker image and at least partial images of the K second images corresponding to each first marker image.
4. The method for sending the associated image according to claim 1, wherein in a case that the target image is the second image, determining an image to be sent according to the target image and at least one image associated with the target image in response to the second input comprises one of the following schemes:
determining the target image and the first image as the image to be transmitted in response to the second input;
in response to the second input, performing image stitching on the target image and a target mark region of the first image to determine the image to be sent, wherein the target mark region is associated with the target image;
in response to the second input, determining the target image and a second marker image as the image to be transmitted, the second marker image being generated based on a target marker region of the first image, the target marker region being associated with the target image.
5. The method for sending the associated image according to claim 1, wherein in a case where the target image is the third image, determining an image to be sent according to the target image and at least one image associated with the target image in response to the second input comprises one of:
determining, in response to the second input, the third image and at least partial images of the at least one fourth image respectively corresponding to at least one first region as the image to be sent;
in response to the second input, for each of at least one first region, performing image stitching on the first region and at least part of the associated second regions to determine the image to be sent;
and in response to the second input, for each of at least one first region, generating a first region image according to the first region, and determining the image to be sent according to the at least one first region image and at least partial images of the at least one fourth image corresponding to each first region image.
6. The method for sending the associated image according to claim 1, wherein in a case that the target image is the fourth image, determining an image to be sent according to the target image and at least one image associated with the target image in response to the second input comprises one of the following schemes:
determining the target image and the third image as the image to be transmitted in response to the second input;
performing image stitching on a second target region of the target image and a first target region of the third image in response to the second input to determine the image to be sent, wherein the second target region is associated with the first target region;
in response to the second input, determining the target image and a second region image as the image to be transmitted, the second region image being generated based on a first target region of the third image, the first target region being associated with the target image.
7. A device for sending an associated image, comprising:
an association module, configured to associate N images in response to a first input, N being an integer greater than or equal to 2;
a receiving module, configured to receive a second input of a target image in the N images;
a processing module, configured to, in response to the second input, determine an image to be sent according to the target image and at least one image associated with the target image, and send the image to be sent to a receiver;
wherein the association module comprises:
a setting sub-module, configured to set M mark regions for a first image in response to a first input to the first image;
a first association sub-module, configured to associate, for each mark region, at least partial images of K second images with the corresponding mark region in a case that a third input to the mark region is received and the K second images are obtained by shooting a target object corresponding to the mark region;
M, K are integers greater than or equal to 1, each mark region corresponds to K second images, the N images comprise the first image and at least partial images of the K second images corresponding to each mark region, the values of K are the same or different for different mark regions, and the first image is the first of the N images;
or,
the association module comprises:
a segmentation determination sub-module for segmenting the third image in response to a first input to the third image, determining P first regions;
a second association submodule, configured to associate, for each first region, the first region with a corresponding second region of at least one fourth image if a fourth input for the first region and the second region is received;
the number of the fourth images is T, each fourth image comprises R second regions, P, T, R are integers greater than or equal to 1, the values of R are the same or different for different fourth images, and the N images comprise the third image and the fourth images.
8. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the method for sending an associated image according to any one of claims 1 to 6.
9. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the method for sending an associated image according to any one of claims 1 to 6.
CN202111154586.3A 2021-09-29 2021-09-29 Method and device for sending associated image and electronic equipment Active CN113873081B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111154586.3A CN113873081B (en) 2021-09-29 2021-09-29 Method and device for sending associated image and electronic equipment

Publications (2)

Publication Number Publication Date
CN113873081A CN113873081A (en) 2021-12-31
CN113873081B true CN113873081B (en) 2023-03-14

Family

ID=79000728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111154586.3A Active CN113873081B (en) 2021-09-29 2021-09-29 Method and device for sending associated image and electronic equipment

Country Status (1)

Country Link
CN (1) CN113873081B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117082228A (en) * 2022-12-30 2023-11-17 惠州Tcl云创科技有限公司 3D image display method, device, medium and equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106503215A (en) * 2016-10-27 2017-03-15 北京小米移动软件有限公司 Process the method and device of picture
CN110147461A (en) * 2019-04-30 2019-08-20 维沃移动通信有限公司 Image display method, device, terminal device and computer readable storage medium
WO2019159333A1 (en) * 2018-02-16 2019-08-22 マクセル株式会社 Mobile information terminal, information presentation system and information presentation method
CN110543579A (en) * 2019-07-26 2019-12-06 华为技术有限公司 Image display method and electronic equipment

Also Published As

Publication number Publication date
CN113873081A (en) 2021-12-31

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant