WO2024103958A1 - Method for processing image materials based on multiple devices and related apparatus - Google Patents

Method for processing image materials based on multiple devices and related apparatus

Info

Publication number
WO2024103958A1
WO2024103958A1 (PCT application PCT/CN2023/119787; CN2023119787W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
materials
image material
target
target image
Prior art date
Application number
PCT/CN2023/119787
Other languages
English (en)
Chinese (zh)
Inventor
何�轩
伍超
刘可立
袁伦喜
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2024103958A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Definitions

  • the present application belongs to the field of communication technology, and specifically relates to a method and related apparatus for processing image materials based on multiple devices.
  • the present application provides a method and related apparatus for processing image materials based on multiple devices, in order to provide a method that can support users to complete viewing and selection of image materials shot from multiple devices and intelligently generate highlight moments in one device, thereby reducing production time and optimizing user experience.
  • an embodiment of the present application provides a method for processing image materials based on multiple devices, which is applied to electronic devices, including:
  • jumping to an image material selection page, the image material selection page including an image material display area and a highlight moment generation control, the image material display area being used to display multiple image materials from multiple image acquisition devices;
  • a highlight moment generation screen is displayed.
  • the electronic device concentrates multiple image materials from multiple image acquisition devices in an image material display area and displays them to the user, so that the user can intuitively see the image materials displayed according to the image acquisition devices on the electronic device, and view and select image materials across terminals.
  • the user only needs to click on the highlight moment generation control, and the electronic device can intelligently generate a highlight moment clip corresponding to the target image material for the user, which greatly improves processing efficiency, reduces production time, and optimizes user experience.
  • the embodiment of the present application provides another method for processing image materials based on multiple devices, which is applied to electronic devices, including:
  • jumping to an image material selection page, the image material selection page including an image material display area and a highlight moment generation control, the image material display area being used to display a plurality of image materials;
  • a highlight moment generation screen is displayed.
  • the electronic device displays multiple image materials to the user in an image material display area, so that the user can view and select image materials across terminals.
  • the image acquisition device corresponding to the image material will be displayed, and the user only needs to click on the highlight moment generation control, and the electronic device can intelligently generate a highlight moment clip corresponding to the target image material for the user, which greatly improves processing efficiency, reduces production time, and optimizes user experience.
  • the embodiment of the present application provides another method for processing image materials based on multiple devices, which is applied to electronic devices, including:
  • the multiple image materials include at least two target image materials, and the target image materials are image materials selected by a user for generating highlight moment clips;
  • the following operations are performed for the at least two target image material groups: determining a target image acquisition device corresponding to the target image material group currently being processed; determining a real storage path of each target image material in the target image material group currently being processed in the target image acquisition device; sending the real storage path to the target image acquisition device so that the target image acquisition device accesses the real storage path to obtain the image materials in the target image material group, generating the highlight moment sub-segment corresponding to the target image material group, and sending the highlight moment sub-segment to the electronic device; continuing to process the next target image material group until the at least two target image material groups are all processed, to obtain at least two highlight moment sub-segments;
  • an embodiment of the present application provides an electronic device, comprising a processor, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps in the first aspect, the second aspect, or the third aspect of the embodiment of the present application.
  • an embodiment of the present application provides a computer-readable storage medium having a computer program/instruction stored thereon, which, when executed by a processor, implements the steps in the first aspect, the second aspect, or the third aspect of the embodiment of the present application.
  • an embodiment of the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to enable a computer to execute part or all of the steps described in the first aspect, the second aspect, or the third aspect of the embodiment of the present application.
  • FIG. 1a is a schematic diagram of a system architecture provided by an embodiment of the present application.
  • FIG. 1b is a schematic diagram of another system architecture provided in an embodiment of the present application.
  • FIG. 1c is a schematic diagram of another system architecture provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a flow chart of a method for processing image materials based on multiple devices provided in an embodiment of the present application.
  • FIG. 3 is a schematic diagram showing an example of jumping to an image material selection page provided in an embodiment of the present application.
  • FIG. 4a is an example diagram of an image material selection page provided in an embodiment of the present application.
  • FIG. 4b is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 4b(1) is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 4b(2) is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 4c is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a flow chart of another method for processing image materials based on multiple devices provided in an embodiment of the present application.
  • FIG. 6a is an example diagram of another image selection page provided by an embodiment of the present application.
  • FIG. 6b is an example diagram of another image selection page provided by an embodiment of the present application.
  • FIG. 7a is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 7b is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 7c is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 7d is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 7e is an example diagram of another image material selection page provided in an embodiment of the present application.
  • FIG. 8a is an example diagram of a highlight moment generation screen provided by an embodiment of the present application.
  • FIG. 8b is an example diagram of another highlight moment generation screen provided by an embodiment of the present application.
  • FIG. 8c is an example diagram of another highlight moment generation screen provided by an embodiment of the present application.
  • FIG. 8d is an example diagram of another highlight moment generation screen provided by an embodiment of the present application.
  • FIG. 9a is an example diagram of a highlight moment display page provided in an embodiment of the present application.
  • FIG. 9b is an example diagram of another highlight moment display page provided in an embodiment of the present application.
  • FIG. 9c is an example diagram of a historical highlight moment display page provided in an embodiment of the present application.
  • FIG. 9d is an example diagram of a video template display page provided in an embodiment of the present application.
  • FIG. 10 is a flow chart of another method for processing image materials based on multiple devices provided in an embodiment of the present application.
  • FIG. 11a is a block diagram of functional units of an image material processing apparatus based on multiple devices provided in an embodiment of the present application.
  • FIG. 11b is a block diagram of functional units of another image material processing apparatus based on multiple devices provided by an embodiment of the present application.
  • FIG. 12a is a block diagram of functional units of another apparatus for processing image materials based on multiple devices provided in an embodiment of the present application.
  • FIG. 12b is a block diagram of functional units of another image material processing apparatus based on multiple devices provided in an embodiment of the present application.
  • FIG. 13a is a block diagram of functional units of another apparatus for processing image materials based on multiple devices provided in an embodiment of the present application.
  • FIG. 13b is a block diagram of functional units of another image material processing apparatus based on multiple devices provided by an embodiment of the present application.
  • FIG. 14 is a structural block diagram of an electronic device provided in an embodiment of the present application.
  • SDK: Software Development Kit
  • the embodiments of the present application provide a method for processing image materials based on multiple devices and a related device, which is applied to an electronic device for users to view, select image materials, and synthesize highlight moment clips.
  • the electronic device interacts with data from multiple image acquisition devices to centrally display multiple image materials from multiple image acquisition devices to the user for the user to view, select, and intelligently generate highlight moment clips on the electronic device, thereby eliminating the complex operation of viewing and selecting materials across devices, reducing operation time, and optimizing user experience.
  • the system architecture 10 includes an electronic device 11 and multiple image acquisition devices 12, wherein the electronic device 11 and the multiple image acquisition devices 12 are communicatively connected, the electronic device 11 is a device for users to view, select image materials, and synthesize highlight moment fragments, and specifically can be a mobile terminal, a laptop computer, a vehicle-mounted terminal, etc.; the image acquisition device 12 is a device for users to collect image materials, and specifically can be a mobile terminal, a vehicle-mounted terminal, a drone, smart glasses, etc.
  • the electronic device 11 can also be one of the multiple image acquisition devices 12.
  • Figure 1b is another system architecture schematic diagram provided in an embodiment of the present application.
  • the electronic device is a mobile terminal 101
  • the multiple image acquisition devices are respectively a vehicle-mounted terminal 102 and a drone 103
  • the mobile terminal 101 is communicatively connected to the vehicle-mounted terminal 102 and the drone 103, respectively, so that the image materials collected by the vehicle-mounted terminal 102 and the drone 103 are displayed on the mobile terminal 101 for users to view and select.
  • Figure 1c is another system architecture schematic diagram provided in the embodiment of the present application.
  • the system architecture 10 includes multiple image acquisition devices, such as a mobile terminal 101, a vehicle-mounted terminal 102, a drone 103, and another mobile terminal 104. Any of these devices can communicate with any other device in the communication network, that is, a communication connection can be established between any two devices, so that the user can freely select any device as the electronic device in the embodiment of the present application.
  • the way to establish a communication connection can be to connect by means of near-field communication, or to establish a connection through a router or a star network, which is not specifically limited here.
  • the following describes a method for processing image materials based on multiple devices provided in an embodiment of the present application.
  • FIG. 2 is a flow chart of a method for processing image materials based on multiple devices provided in an embodiment of the present application. The method is applied to the electronic device 11 shown in FIG. 1a. As shown in FIG. 2, the method includes:
  • Step 201: In response to a click operation on a target control in a target application, jump to an image material selection page, wherein the image material selection page includes an image material display area and a highlight moment generation control, and the image material display area is used to display multiple image materials from multiple image acquisition devices.
  • the target application may refer to the "Album" application of the electronic device.
  • the target control may refer to the functional control in the "Album" application for intelligently generating highlight moment clips.
  • Figure 3 is an example diagram of jumping to an image material selection page provided in an embodiment of the present application.
  • Taking the electronic device being a mobile terminal as an example, the user first clicks on the "Album" application 301 in the main interface 30 of the electronic device and then jumps to the main interface 31 of the "Album" application. The user then clicks on a functional control 311 named "Intelligently Generate Highlight Moment Clips" in the main interface 31 of the "Album" application to jump to the image material selection page.
  • the image material display area is used to display multiple groups of image materials, each of the multiple groups of image materials includes at least one image material, wherein the background color of the area between each group of image materials is different, and the background color is used to indicate the image acquisition device.
  • FIG4a is an example diagram of an image material selection page provided in an embodiment of the present application.
  • the image material selection page 40 includes an image material display area 41 and a highlight moment generation control 42 named "Generate Highlight Moments".
  • the image material display area 41 includes multiple groups of image materials, wherein the first group of image materials includes material 1, material 2, material 3 and material 4, and the background color of the area 411 where the first group of image materials is located is a first color, and the first color is used to indicate that the source device of the group of image materials is a mobile terminal; the second group of image materials includes material 5 and material 6, and the background color of the area 412 where the second group of image materials is located is a second color, and the second color is used to indicate that the source device of the group of image materials is a vehicle-mounted terminal; the third group of image materials includes material 7, material 8 and material 9, and the background color of the area 413 where the third group of image materials is located is a third color, and the third color is used to indicate that the source device of the group of image materials is a drone.
  • the image material selection page to which the electronic device jumps in response to the user operation includes an image material display area that displays image materials of different groups in different colors, so that the user can intuitively see the source device of each material, and thus select materials more specifically, further reducing the time spent by the user in selecting materials and improving the user experience.
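  • For illustration only, the grouping and colour-coding described above can be sketched as follows in Kotlin; the class name, field names, and colour values are assumptions and are not taken from the application:

```kotlin
// Hypothetical sketch of grouping image materials by source device
// and assigning each group a distinct background colour.
data class ImageMaterial(val fileName: String, val sourceDevice: String)

fun groupWithColors(materials: List<ImageMaterial>): List<Triple<String, List<ImageMaterial>, String>> {
    val palette = listOf("#FFF3E0", "#E3F2FD", "#E8F5E9")    // first, second, third colour
    return materials.groupBy { it.sourceDevice }              // one group per acquisition device
        .toList()
        .mapIndexed { i, (device, group) ->
            Triple(device, group, palette[i % palette.size])  // colour indicates the source device
        }
}

fun main() {
    val materials = listOf(
        ImageMaterial("material1.jpg", "mobile terminal"),
        ImageMaterial("material5.jpg", "vehicle-mounted terminal"),
        ImageMaterial("material7.mp4", "drone"),
    )
    for ((device, group, color) in groupWithColors(materials)) {
        println("$device -> ${group.map { it.fileName }} background $color")
    }
}
```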
  • the image material display area includes an image acquisition device switching area and an image material display sub-area, the image acquisition device switching area is used to switch the image acquisition device, and the image material display sub-area is used to display multiple image materials stored in the currently selected image acquisition device.
  • the image material selection page 40 includes an image material display area 41 and a highlight moment generation control 42 named "Generate Highlight Moments".
  • the image material display area 41 includes an image acquisition device switching area 43 and an image material display sub-area 44.
  • the image acquisition device switching area 43 is used to switch the above-mentioned mobile terminal, vehicle-mounted terminal, and drone.
  • the image material display sub-area 44 is used to display the image material stored in the image acquisition device selected in the image acquisition device switching area 43.
  • the image acquisition device switching area 43 can be set to switch the image acquisition device by clicking, as shown in Figure 4b(1).
  • the image acquisition device switching area 43 includes a "local" function control, a "vehicle terminal" function control, and a "drone" function control.
  • the above function controls are displayed in a tiled manner and can respond to the user's click operation to achieve the function of switching the image acquisition device.
  • In Figure 4b(1), when the user selects the "vehicle terminal" function control, the "vehicle terminal" function control is highlighted, and the image materials stored in the vehicle terminal, namely material 5 and material 6, are displayed in the image material display sub-area.
  • the image acquisition device switching area 43 can also be set to switch the image acquisition device by scrolling. As shown in Figure 4b(2), the image acquisition device switching area 43 includes a "local" function control, a "vehicle terminal" function control, and a "drone" function control.
  • the above function controls are displayed in a superimposed manner, and the image acquisition device switching area 43 can respond to the user's horizontal sliding operation to achieve the function of switching the image acquisition device.
  • the electronic device currently used by the user to view, select image materials and synthesize highlight clips is also one of the multiple image acquisition devices.
  • When the multiple image acquisition devices include multiple mobile terminals and the user needs to find the image materials stored in the electronic device currently in use, the "local" entry displayed in the image acquisition device switching area, which represents the electronic device currently used by the user, can quickly help the user locate that device, thereby improving the user experience.
  • the image material display area therein displays the image acquisition device switching area and the image material display sub-area, so that the user can manually switch the image acquisition device to view the image material in each image acquisition device, thereby selecting materials more specifically, further reducing the time spent by the user in selecting materials and improving the user experience.
  • Step 202: In response to a selection operation on at least two image materials from the plurality of image materials, mark the selected at least two target image materials in the image material display area.
  • the at least two target image materials do not originate from the same image acquisition device.
  • Taking materials 1, 5, and 7 in the image material display area shown in FIG. 4a as an example, when the user selects materials 1, 5, and 7, they are marked in the image material display area, as shown in FIG. 4c. At this time, the user can also click on a selected image material to cancel its selection as a target image material.
  • Step 203: In response to a click operation on the highlight moment generation control, a highlight moment generation screen is displayed.
  • the electronic device concentrates multiple image materials from multiple image acquisition devices in the image material display area and displays them to the user, so that the user can intuitively see the image materials displayed according to the image acquisition devices on the electronic device, and view and select image materials across terminals.
  • the user only needs to click on the highlight moment generation control, and the electronic device can intelligently generate a highlight moment clip corresponding to the target image material for the user, which greatly improves processing efficiency, reduces production time, and optimizes user experience.
  • FIG. 5 is a flow chart of another method for processing image materials based on multiple devices provided in an embodiment of the present application.
  • the method is applied to the electronic device 11 shown in FIG. 1a.
  • the method includes:
  • Step 501: In response to a click operation on a target control in a target application, jump to an image material selection page, wherein the image material selection page includes an image material display area and a highlight moment generation control, and the image material display area is used to display multiple image materials.
  • the image material selection page 60 includes an image material display area 61 and a highlight moment generation control 62 named "Generate Highlight Moment", and the image material display area 61 displays multiple image materials, such as Material 1, Material 2, Material 3, Material 4, Material 5, Material 6, Material 7 and Material 8.
  • Step 502: In response to a selection operation on at least two of the plurality of image materials, mark the selected at least two target image materials in the image material display area, and display the image acquisition device storing each target image material in a source device display area.
  • the at least two target image materials do not originate from the same image acquisition device.
  • the image material display area 61 marks the materials 1, 2, 5, and 7, and displays the source devices of the materials 1 and 2 as "local", the source device of the material 5 as "vehicle terminal", and the source device of the material 7 as "drone" in the source device display area 62, as shown in FIG. 6b.
  • the user can also click on the selected image material to cancel the selection of the target image material. Accordingly, after canceling the selection of the target image material, the source device display area will also cancel the display of the source device corresponding to the target image material.
  • Step 503: In response to a click operation on the highlight moment generation control, a highlight moment generation screen is displayed.
  • the electronic device concentrates multiple image materials in the image material display area and displays them to the user, so that the user can view and select image materials across terminals.
  • the image acquisition device corresponding to the image material will be displayed, and the user only needs to click on the highlight moment generation control, and the electronic device can intelligently generate a highlight moment clip corresponding to the target image material for the user, which greatly improves processing efficiency, reduces production time, and optimizes user experience.
  • the image material selection page also includes a selection control, and in response to a selection operation on at least two image materials among the multiple image materials, marking the at least two selected target image materials in the image material display area includes: in response to a click operation on the selection control, marking the multiple image materials as selectable materials; in response to a selection operation on at least two image materials among the selectable materials, marking the at least two selected target image materials in the image material display area.
  • Figure 7a is an example diagram of another image material selection page provided in an embodiment of the present application.
  • the image material selection page 70 also includes a selection control 71.
  • In response to a click operation on the selection control 71, the electronic device marks the multiple image materials in the image material display area 72 as selectable materials, allowing the user to select from the selectable materials.
  • In this example, the image material selection page to which the electronic device jumps in response to the user's operation also includes a selection control; after responding to the user's click operation on the selection control, the electronic device can mark the multiple image materials as selectable materials, supporting the user's selection of materials and improving the user's interactive experience.
  • marking the at least two selected target image materials in the image material display area includes: in response to a long press operation on any one of the multiple image materials, marking the image material corresponding to the long press operation as the target image material, and marking the image materials among the multiple image materials except the image material corresponding to the long press operation as selectable materials; in response to a selection operation on at least one image material among the selectable materials, marking the at least two selected target image materials in the image material display area.
  • Figure 7b is an example diagram of another image material selection page provided in an embodiment of the present application.
  • the material 1 corresponding to the long press operation is marked as the selected image material, that is, the target image material, and the other image materials are marked as selectable materials to support the user to continue selecting the selectable materials.
  • When the electronic device responds to the user's long press operation on any image material, it can mark the image material corresponding to the long press operation as the target image material and mark the remaining image materials as selectable materials, supporting the user's selection of materials and improving the user interaction experience.
  • If the at least one image material among the selectable materials is a single image material, the selection operation for it includes: a click operation on the single image material; if the at least one image material among the selectable materials is multiple image materials, then the selection operation includes: multiple click operations on the multiple image materials among the selectable materials; or, a sliding operation on multiple image materials in adjacent positions among the selectable materials.
  • Figure 7c is an example diagram of another image material selection page provided in an embodiment of the present application.
  • If the user's selection operation for at least one image material among the selectable materials is a selection operation for a single image material, such as a selection operation for material 3, then the selection operation can be a click operation on material 3, so that material 3 is marked as the target image material.
  • Figure 7d is an example diagram of another image material selection page provided in an embodiment of the present application.
  • the selection operation can be a multiple-click operation for material 7, material 8, and material 9, and since these three materials are multiple image materials in adjacent positions, the selection operation can also be a horizontal rightward sliding operation with the position of material 7 as the starting position and the position of material 9 as the ending position, or a horizontal leftward sliding operation with the position of material 9 as the starting position and the position of material 7 as the ending position, so that materials 7, material 8, and material 9 are marked as target image materials.
  • the electronic device when the user selects selectable materials, the electronic device supports multiple selection methods to improve the user interaction experience.
  • the image material selection page further includes selection prompt information, and the selection prompt information is used to indicate the total number of the target image materials.
  • Figure 7e is an example diagram of another image material selection page provided in an embodiment of the present application.
  • the image material selection page 70 also includes selection prompt information 701.
  • the target image materials are material 1, material 7, material 8 and material 9, a total of four target image materials. Therefore, the selection prompt information 701 can be displayed as "4 items have been selected" to prompt the user with the number of selected target image materials.
  • the electronic device displays selection prompt information on the image material selection page to remind the user of the number of target image materials that have been selected, thereby improving the user's interactive experience.
  • the highlight moment generation screen includes a progress bar for indicating a generation progress of the highlight moment generation according to the at least two target image materials.
  • Figure 8a is an example diagram of a highlight moment generation screen provided in an embodiment of the present application.
  • the highlight moment generation screen 80 includes a progress bar 81, and the loading progress of the progress bar is used to indicate the progress of highlight moment generation.
  • the highlight moment generation screen displayed by the electronic device includes a progress bar, so that the user can intuitively see the current highlight moment generation progress, thereby improving the user interaction experience.
  • the highlight moment generation screen also includes generation stage prompt information, and the generation stage prompt information is used to indicate the generation stage of highlight moment generation based on the at least two target image materials, and the generation stage includes a highlight algorithm deployment stage, a highlight algorithm execution stage and a highlight moment generation stage.
  • The generation stages are, in sequence, the highlight algorithm deployment stage, the highlight algorithm execution stage, and the highlight moment generation stage; that is, when the progress bar starts loading, the highlight algorithm deployment stage begins, and when the progress bar finishes loading, the highlight moment generation stage is also completed and the highlight moment clip has been generated.
  • Figure 8b is an example diagram of another highlight moment generation screen provided by an embodiment of the present application.
  • the highlight moment generation screen 80 also includes a generation stage prompt information 82.
  • When the generation stage is the highlight algorithm deployment stage, the generation stage prompt information 82 displays: "Deploying the highlight algorithm to the image acquisition device"; when the generation stage is the highlight algorithm execution stage, the generation stage prompt information 82 displays: "Calling multiple devices to execute the highlight algorithm"; when the generation stage is the highlight moment generation stage, the generation stage prompt information 82 displays: "Highlight moment generation", so that the user can intuitively see which specific stage the selected target image materials are in during intelligent generation of the highlight moment.
  • the highlight moment generation screen displayed by the electronic device includes generation stage prompt information, and different generation stage prompt information and different progress bar loading progress are presented in different generation stages, so that the user can intuitively see the current generation stage, improving the user interaction experience.
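  • A minimal sketch of the three generation stages and their prompt texts is given below; the enum, the function, and the percentage thresholds used to map the progress bar to a stage are hypothetical:

```kotlin
// Minimal sketch of the three generation stages and their prompt texts,
// as described above. The enum name and the percentage thresholds are
// assumptions made for illustration only.
enum class GenerationStage(val prompt: String) {
    ALGORITHM_DEPLOYMENT("Deploying the highlight algorithm to the image acquisition device"),
    ALGORITHM_EXECUTION("Calling multiple devices to execute the highlight algorithm"),
    MOMENT_GENERATION("Highlight moment generation")
}

// Maps the progress bar's loading percentage to the prompt that is shown;
// the 30 % / 90 % boundaries are hypothetical.
fun promptFor(progressPercent: Int): String = when {
    progressPercent < 30 -> GenerationStage.ALGORITHM_DEPLOYMENT.prompt
    progressPercent < 90 -> GenerationStage.ALGORITHM_EXECUTION.prompt
    else -> GenerationStage.MOMENT_GENERATION.prompt
}
```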
  • the highlight moment generation screen also includes an icon of a target device for executing the highlight algorithm, the target device being one or more devices among the image acquisition devices corresponding to the target image materials; when the generation stage is the highlight algorithm execution stage and the target device starts to execute the highlight algorithm, the icon of the target device is highlighted until the target device completes the execution.
  • the target device for executing the highlight algorithm refers to an image device with data processing capability in the image acquisition device corresponding to the target image material.
  • For example, if the target image materials come from a mobile terminal, a vehicle terminal, a drone, and smart glasses, and the smart glasses only have image acquisition capability but not data processing capability, then the smart glasses cannot execute the highlight algorithm and thus cannot process the image materials stored therein. Therefore, those stored image materials can only be handed over to other devices with data processing capability for processing. Optionally, they can be handed over to the device with the best data processing capability, to improve processing efficiency.
  • the target devices only include mobile terminals, vehicle-mounted terminals and drones, but do not include smart glasses that do not have data processing capabilities.
  • the highlight moment generation screen 80 also includes the icons of the target devices, namely the icon 801 of the mobile terminal, the icon 802 of the vehicle-mounted terminal, and the icon 803 of the drone; when the generation stage is the highlight algorithm execution stage, the generation stage prompt information 82 displays "Calling multiple devices to execute the highlight algorithm".
  • the three target devices are synchronously called to execute the highlight algorithm, and the target image materials stored in each of them are processed respectively.
  • When a device starts to execute the highlight algorithm, its corresponding icon is highlighted until all the target image materials to be processed on that device have been processed, at which point the highlighted display ends.
  • For example, if the icon 801 of the mobile terminal and the icon 802 of the vehicle-mounted terminal are still in a highlighted state, this indicates that the mobile terminal and the vehicle-mounted terminal are still executing the highlight algorithm;
  • if the icon 803 of the drone is no longer highlighted, this indicates that the drone has completed the execution of the highlight algorithm.
  • the duration of the highlighted display is related to the amount of data that the device itself needs to process, and also to the data processing capability of the device itself.
  • If the target device receives target image materials from an image acquisition device that does not have data processing capability, it also needs to process that group of target image materials, and the amount of data to be processed and the duration of the highlighted display will increase accordingly.
  • the highlight moment generation screen displayed by the electronic device also includes an icon of the target device that can be highlighted, so that the user can intuitively see the device that is executing the highlight algorithm in the current processing stage, thereby improving the user's interactive experience.
  • the highlight moment generation screen further includes a cancel control, and the cancel control is used to instruct to cancel the currently ongoing highlight moment generation process.
  • the highlight moment generation screen 80 also includes a cancel control 83.
  • When the electronic device responds to a user's click operation on the cancel control 83, it closes the highlight moment generation screen 80, cancels the currently ongoing highlight moment generation process, and returns to the image material selection page.
  • the highlight moment generation screen displayed by the electronic device also includes a cancel control, so that the user can freely cancel the highlight moment generation process, thereby improving the freedom of user interaction and optimizing the user experience.
  • the method further includes: when the progress bar finishes loading, playing the generated highlight moment clip.
  • the electronic device can automatically play the generated highlight moment clip, eliminating the need for the user to manually play the highlight moment clip to view the production effect, thereby improving the user experience.
  • playing the generated highlight moment clip includes: when the progress bar finishes loading, jumping to the highlight moment display page, the highlight moment display page at least including the generated highlight moment clip; and playing the generated highlight moment clip on the highlight moment display page.
  • Figure 9a is an example diagram of a highlight moment display page provided in an embodiment of the present application.
  • the highlight moment display page 90 at least includes the generated highlight moment clip 91.
  • the highlight moment clip 91 is automatically played.
  • After the electronic device generates the highlight moment clip, it automatically jumps to the highlight moment display page and automatically plays the generated highlight moment clip, eliminating the need for the user to manually play the highlight moment clip to view the production effect, thereby improving the user experience.
  • the highlight moment display page includes a highlight moment viewing control and a template viewing control, wherein the highlight moment viewing control is used to indicate a collection of highlight moments historically generated by the electronic device, and the template viewing control is used to indicate a template used by the electronic device when generating highlight moments.
  • the highlight moment collection includes one or more highlight moment segments generated historically by the electronic device, and the cover of the one or more highlight moment segments generated historically is marked with the generation date of the highlight moment segment.
  • the highlight moment display page 90 includes a highlight moment viewing control 92 with a control name of "My Moments" and a template viewing control 93 with a control name of "Video Template".
  • When the user clicks the highlight moment viewing control 92, the page jumps to the historical highlight moment display page 921 shown in FIG. 9c. The historical highlight moment display page 921 includes three historical highlight moments, and the cover of each is marked with the generation date of the highlight moment segment; the generation date can be in year-month form, for example, the generation date of highlight moment 1 is January 2022, that of highlight moment 2 is February 2022, and that of highlight moment 3 is March 2022.
  • the user can also long press any highlight moment segment to rename, delete, forward, etc. the highlight moment segment.
  • When the user clicks the template viewing control 93, the page jumps to the video template display page 931 shown in FIG. 9d.
  • the video template display page 931 includes the user's historically used video templates 930, which may include multiple video templates such as video template 1, video template 2, and video template 3, and also includes a search box 932, through which the user can search for more video templates.
  • the user can long press a video template in the video template display page to view the historically generated highlight moment clips corresponding to the video template, or delete the video template.
  • the highlight moment viewing controls and template viewing controls in the highlight moment display page displayed by the electronic device can support users to view historically generated highlight moment clips and historically used video templates, which can, to a certain extent, evoke users' good memories when making highlight moment clips and improve the user experience.
  • the method further includes: when the progress bar finishes loading, sending the generated highlight moment segment to an image playback device that is communicatively connected to the electronic device, so that the image playback device plays the generated highlight moment segment.
  • the image playback device can be a large-screen device, such as a TV. After the image playback device is communicatively connected with the electronic device, the electronic device will encapsulate the generated highlight moment clips and send them to the image playback device, which will then parse and automatically play the highlight moment clips.
  • After the electronic device generates the highlight moment segment, it can also choose to play the highlight moment segment on another image playback device to give the user a better viewing experience.
  • FIG. 10 is a flow chart of another method for processing image materials based on multiple devices provided in an embodiment of the present application.
  • the method is applied to the electronic device 11 shown in FIG. 1a.
  • the method includes:
  • Step 1001: Acquire multiple image materials from multiple image acquisition devices, wherein the multiple image materials include at least two target image materials.
  • the target image material is the image material selected by the user for generating the highlight moment clip.
  • acquiring multiple image materials from multiple image acquisition devices includes: receiving metadata information of each of the multiple image materials sent by the multiple image acquisition devices, the metadata information of each image material including the file name of each image material, the material type of each image material and the image acquisition device corresponding to each image material, the material type including picture material and video material; generating the multiple image materials and multiple local storage paths corresponding to the multiple image materials according to the metadata information, the local storage path refers to the storage path of the image material in the electronic device.
  • the metadata information of material 5 is recorded as follows: the file name is picture5.jpg, the material type is picture material, and the corresponding image acquisition device is the vehicle terminal, that is, the vehicle terminal shoots and stores the material 5, and the real storage path of the material 5 on the vehicle terminal is: /storage/emulated/0/DCIM/picture5.jpg.
  • the user uses the mobile terminal to view and select materials across terminals, and the vehicle terminal sends the metadata information of material 5 to the mobile terminal.
  • the mobile terminal then generates a local storage path for material 5 on the mobile terminal based on the above metadata information, and the local storage path is strongly associated with the real storage path.
  • the local storage path may be: /storage/emulated/0/dfs/mnt/{device-picture}/{device-car}/DCIM/picture5.jpg, wherein {device-picture} represents that the material type of the material is a picture, {device-car} represents that the source device of the material is a vehicle-mounted terminal, picture5.jpg represents the file name of the material, dfs represents a distributed file system (Distributed File System), and mnt represents the mount command in the distributed file system, which mounts the target image material stored on the target image acquisition device into the file system of the electronic device, thereby enabling the electronic device to create the above-mentioned local storage path; in the subsequent execution process, the electronic device triggers the media library to scan and update, thereby enabling the generation of file thumbnails and cross-end file access.
  • The method in the above example can be used to generate the multiple image materials and the multiple local storage paths corresponding to the multiple image materials.
  • the electronic device receives metadata information of multiple image materials and generates multiple image materials and multiple local storage paths corresponding to the multiple image materials on the electronic device according to the metadata information, thereby enabling users to view and select image materials across terminals on the electronic device, reducing user operation time and improving user experience.
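  • The mapping from metadata information to a local storage path can be sketched roughly as follows, mirroring the path layout of the example above; the data class and function names are hypothetical:

```kotlin
// Sketch of deriving the local (distributed-file-system) path from the
// metadata described above. The path layout mirrors the example in the
// text; the data class and function names are hypothetical.
data class MaterialMetadata(
    val fileName: String,     // e.g. "picture5.jpg"
    val materialType: String, // e.g. "picture" or "video"
    val sourceDevice: String  // e.g. "car" for the vehicle-mounted terminal
)

fun localStoragePath(meta: MaterialMetadata): String =
    "/storage/emulated/0/dfs/mnt/{device-${meta.materialType}}/{device-${meta.sourceDevice}}/DCIM/${meta.fileName}"

fun main() {
    val meta = MaterialMetadata("picture5.jpg", "picture", "car")
    // Prints /storage/emulated/0/dfs/mnt/{device-picture}/{device-car}/DCIM/picture5.jpg
    println(localStoragePath(meta))
}
```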
  • Step 1002: Group the at least two target image materials according to the image acquisition devices to obtain at least two target image material groups.
  • the at least two target image materials do not originate from the same image acquisition device, and one target image material group corresponds to one image acquisition device, that is, the target image materials in the target image material group originate from the same image acquisition device.
  • Step 1003: Taking the target image material group as the processing unit, perform the following operations for the at least two target image material groups: determine the target image acquisition device corresponding to the target image material group currently being processed;
  • the real storage path of each target image material in the target image material group currently being processed is determined in the target image acquisition device;
  • the real storage path is sent to the target image acquisition device so that the target image acquisition device accesses the real storage path to obtain the image material in the target image material group, and the highlight moment sub-segment corresponding to the target image material group is generated according to the image material in the target image material group, and the highlight moment sub-segment is sent to the electronic device;
  • the next target image material group is processed continuously until the at least two target image material groups are fully processed to obtain at least two highlight moment sub-segments;
  • While sending the real storage path to the target image acquisition device, the electronic device also sends a series of instructions to the target image acquisition device to deploy the highlight algorithm on it, so that the target image acquisition device executes the highlight algorithm, that is, generates the highlight moment sub-segment corresponding to the target image material group according to the image materials in the target image material group.
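  • A simplified sketch of Steps 1002 and 1003 is shown below: the selected target image materials are grouped by source device, and each group's real storage paths are handed to the corresponding device, which returns a highlight moment sub-segment. The cross-device call is replaced by a placeholder function, and all names are illustrative:

```kotlin
// Simplified, hypothetical sketch of Steps 1002-1003: group the selected
// target materials by source device, hand each group's real storage paths
// to the corresponding device, and collect the returned sub-segments.
// requestSubSegment stands in for the cross-device call in which the
// device runs the highlight algorithm itself.
data class TargetMaterial(val sourceDevice: String, val realPath: String)

fun requestSubSegment(device: String, realPaths: List<String>): String =
    "sub-segment from $device (${realPaths.size} material(s))"   // placeholder result

fun generateSubSegments(targets: List<TargetMaterial>): List<String> =
    targets.groupBy { it.sourceDevice }                  // Step 1002: one group per device
        .map { (device, group) ->                        // Step 1003: process group by group
            requestSubSegment(device, group.map { it.realPath })
        }
```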
  • the highlight algorithm includes: preprocessing the video material and extracting the picture frames, and scoring each frame in an aesthetic dimension, wherein the score of the aesthetic dimension is associated with the clarity and sharpness of the picture frame, and filtering out the picture frames whose aesthetic dimension scores are lower than a preset score; dividing the video material into multiple video clips; scoring each video clip according to the content change degree and content excitement of the picture frames corresponding to each video clip, and obtaining multiple video clip scores corresponding to the multiple video clips; semantically clustering the multiple video clips, and sorting the multiple video clips by content relevance according to the clustering results; selecting the target highlight video clip according to the content relevance clustering sorting results of the video clips and the video clip scores, and obtaining the highlight video sub-segment.
  • a more reasonable highlight clip selection can be provided based on big data and user-personalized training models.
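  • The final clip-selection step of the highlight algorithm could be sketched roughly as follows, assuming the clip scores and semantic cluster labels have already been computed by the earlier stages; the data class and the selection rule are illustrative only:

```kotlin
// Very rough, hypothetical sketch of the clip-selection step described
// above. Clip scores and cluster labels are assumed to have been produced
// already by the (model-based) scoring and semantic clustering stages.
data class Clip(val startMs: Long, val endMs: Long, val score: Double, val cluster: Int)

fun selectHighlightClips(clips: List<Clip>, topN: Int = 3): List<Clip> =
    clips.groupBy { it.cluster }                            // clips grouped by semantic cluster
        .values
        .map { group -> group.maxByOrNull { it.score }!! }  // keep the best-scoring clip per cluster
        .sortedByDescending { it.score }                    // rank remaining clips by score
        .take(topN)                                         // keep the highest-scoring clips
```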
  • determining the actual storage path of each target image material in the target image material group currently being processed in the target image acquisition device includes: determining the actual storage path of each target image material in the target image material group currently being processed in the target image acquisition device based on the local storage path corresponding to each target image material in the target image material group currently being processed and the metadata information of each target image material.
  • the local storage path corresponding to material 5 is: /storage/emulated/0/dfs/mnt/{device-picture}/{device-car}/DCIM/picture5.jpg, and its metadata information includes: the material type is picture material, that is, {device-picture}, and the source device is a vehicle-mounted terminal, that is, {device-car}.
  • the target application such as the "Album" application can call the interface provided by the SDK to convert the local storage path into a real storage path where material 5 is stored on the vehicle-mounted terminal: /storage/emulated/0/DCIM/picture5.jpg, so that the vehicle-mounted terminal performs read and write operations on the local file system, accesses the real storage path to obtain material 5, and obtains other image materials stored in the vehicle-mounted terminal through the same method, and then produces a highlight moment sub-segment corresponding to the image material shot by the vehicle-mounted terminal.
  • When the highlight algorithm is deployed, the image acquisition device needs to find the image materials stored locally. At this time, the electronic device determines the real storage path corresponding to the target image material from the local storage path corresponding to the target image material, so that the image acquisition device can find the target image material and execute the highlight algorithm; parallel processing is used to obtain the highlight moment sub-segments, which greatly improves processing efficiency.
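  • The conversion from a local storage path back to the real storage path, following the example above, can be approximated as below; the described method performs this conversion through an SDK interface, so the string manipulation here is only an illustrative stand-in:

```kotlin
// Hypothetical sketch of converting the local distributed-file-system path
// back into the device's real storage path, following the example above.
// In the described method this conversion is done through an SDK interface;
// here it is approximated by stripping the dfs mount segments.
fun toRealStoragePath(localPath: String): String =
    localPath.replace(Regex("""/dfs/mnt/\{device-[^}]+}/\{device-[^}]+}"""), "")

fun main() {
    val local = "/storage/emulated/0/dfs/mnt/{device-picture}/{device-car}/DCIM/picture5.jpg"
    println(toRealStoragePath(local))  // /storage/emulated/0/DCIM/picture5.jpg
}
```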
  • Step 1004: Generate a highlight moment segment according to the at least two highlight moment sub-segments.
  • the at least two highlight moment sub-segments can be synthesized into the highlight moment segment, and the generated highlight moment segment can be written into the local file system to obtain the storage path corresponding to the highlight moment segment.
  • Step 1005: Play the highlight moment clip.
  • the highlight moment clip can be found and played by accessing the storage path corresponding to the highlight moment clip in the above file system.
  • the highlight video clip can also be sent to another video playback device for playback, such as a large-screen device such as a TV, to further improve the user's viewing experience.
  • the electronic device acquires multiple image materials from multiple image acquisition devices, displays the multiple image materials to the user in a centralized manner, and supports the user to view and select, thereby eliminating the complex operation of the user switching devices back and forth to view and select materials, and reducing the operation time.
  • the image acquisition devices are called to independently process the target image material groups stored in each device to obtain highlight moment sub-segments, which are then synthesized by the electronic device to obtain highlight moment segments, i.e., parallel processing.
  • The method provided by the embodiments of the present application can greatly improve processing efficiency, reduce production time, and improve the user experience.
  • the multi-device image material processing device 110 includes: a first response unit 1101, which is used to jump to an image material selection page in response to a click operation on a target control in a target application, and the image material selection page includes an image material display area and a highlight moment generation control, and the image material display area is used to display multiple image materials from multiple image acquisition devices; a second response unit 1102, which is used to respond to a selection operation on at least two image materials among the multiple image materials, and mark the selected at least two target image materials in the image material display area, and the at least two target image materials do not come from the same image acquisition device; a third response unit 1103, which is used to respond to a click operation on the highlight moment generation control and display a highlight moment generation screen.
  • the image material display area is used to display multiple groups of image materials, each of the multiple groups of image materials includes at least one image material, wherein the background color of the area between each group of image materials is different, and the background color is used to indicate the image acquisition device.
  • the image material display area includes an image acquisition device switching area and an image material display sub-area, the image acquisition device switching area is used to switch the image acquisition device, and the image material display sub-area is used to display multiple image materials stored in the currently selected image acquisition device.
  • the image material selection page also includes a selection control, and in terms of marking the at least two selected target image materials in the image material display area in response to a selection operation on at least two image materials among the multiple image materials, the second response unit 1102 is specifically used to: in response to a click operation on the selection control, mark the multiple image materials as selectable materials; in response to a selection operation on at least two image materials among the selectable materials, mark the at least two selected target image materials in the image material display area.
  • in response to a selection operation on at least two image materials among the multiple image materials, in terms of marking the at least two selected target image materials in the image material display area, the second response unit 1102 is specifically used to: in response to a long press operation on any one of the multiple image materials, mark the image material corresponding to the long press operation as the target image material, and mark the image materials among the multiple image materials other than the image material corresponding to the long press operation as selectable materials; in response to a selection operation on at least one image material among the selectable materials, mark the at least two selected target image materials in the image material display area.
  • if at least one image material among the selectable materials is a single image material, the selection operation for the at least one image material among the selectable materials includes: a click operation on the single image material; if at least one image material among the selectable materials is multiple image materials, the selection operation for the at least one image material among the selectable materials includes: multiple click operations on the multiple image materials among the selectable materials; or, a sliding operation on multiple image materials in adjacent positions among the selectable materials.
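  • The selection interactions described above (long press to enter selection, click to toggle, slide to select adjacent materials) could be modeled roughly as in the following sketch; the class and function names are assumptions made for illustration.

```kotlin
// Illustrative selection-state holder for the interactions described above.
class SelectionState(private val materials: List<String>) {
    val selected = linkedSetOf<String>()
    var selectionMode = false
        private set

    // Long press on any material enters selection mode and selects it.
    fun onLongPress(materialId: String) {
        selectionMode = true
        selected += materialId
    }

    // A click toggles a material once the materials are marked selectable.
    fun onClick(materialId: String) {
        if (!selectionMode) return
        if (!selected.add(materialId)) selected -= materialId
    }

    // A slide over adjacent positions selects every material in the range.
    fun onSlide(fromIndex: Int, toIndex: Int) {
        if (!selectionMode) return
        for (i in minOf(fromIndex, toIndex)..maxOf(fromIndex, toIndex)) {
            selected += materials[i]
        }
    }
}
```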
  • the image material selection page further includes selection prompt information, and the selection prompt information is used to indicate the total number of the target image materials.
  • the highlight moment generation screen includes a progress bar for indicating a generation progress of the highlight moment generation according to the at least two target image materials.
  • the highlight moment generation screen also includes generation stage prompt information, and the generation stage prompt information is used to indicate the generation stage of highlight moment generation based on the at least two target image materials, and the generation stage includes a highlight algorithm deployment stage, a highlight algorithm execution stage and a highlight moment generation stage.
  • the highlight moment generation screen also includes an icon of a target device for executing the highlight algorithm, the target device being one or more of the image acquisition devices corresponding to the target image materials; when the generation stage is the highlight algorithm execution stage and the target device starts to execute the highlight algorithm, the icon of the target device is highlighted until the target device completes execution.
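  • The generation stages and the per-device icon highlighting could be modeled roughly as follows; the enum values, state fields and progress representation are assumptions made for the sketch.

```kotlin
// Illustrative model of the generation stages and per-device highlighting.
enum class GenerationStage { ALGORITHM_DEPLOYMENT, ALGORITHM_EXECUTION, SEGMENT_GENERATION }

data class GenerationUiState(
    val stage: GenerationStage,
    val progressPercent: Int,
    val highlightedDeviceIds: Set<String> // devices currently executing the algorithm
)

// Highlight a device icon when it starts executing the highlight algorithm.
fun onDeviceStartedExecuting(state: GenerationUiState, deviceId: String): GenerationUiState =
    state.copy(
        stage = GenerationStage.ALGORITHM_EXECUTION,
        highlightedDeviceIds = state.highlightedDeviceIds + deviceId
    )

// Remove the highlight when the device completes execution.
fun onDeviceFinishedExecuting(state: GenerationUiState, deviceId: String): GenerationUiState =
    state.copy(highlightedDeviceIds = state.highlightedDeviceIds - deviceId)
```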
  • the highlight moment generation screen further includes a cancel control, and the cancel control is used to instruct to cancel the currently ongoing highlight moment generation process.
  • in response to a click operation on the highlight moment generation control, a highlight moment generation screen is displayed.
  • the multi-device based image material processing apparatus 110 is also used to: when the progress bar is loaded, play the generated highlight moment clip.
  • the multi-device-based image material processing device 110 is specifically used to: when the progress bar is loaded, jump to the highlight moment display page, the highlight moment display page at least includes the generated highlight moment clip; play the generated highlight moment clip on the highlight moment display page.
  • the highlight moment display page includes a highlight moment viewing control and a template viewing control, wherein the highlight moment viewing control is used to indicate a collection of highlight moments historically generated by the electronic device, and the template viewing control is used to indicate a template used by the electronic device when generating highlight moments.
  • the multi-device-based image material processing device 110 is also used to: when the progress bar is loaded, send the generated highlight moment segment to the image playback device that is communicatively connected to the electronic device, so that the image playback device plays the generated highlight moment segment.
  • Figure 11b is a functional unit composition block diagram of another image material processing device based on multiple devices provided in an embodiment of the present application.
  • the image material processing device 111 based on multiple devices includes: a processing module 1112 and a communication module 1111.
  • the processing module 1112 is used to control and manage the actions of the image material processing device based on multiple devices, for example, to execute the steps of the first response unit 1101, the second response unit 1102 and the third response unit 1103, and/or to perform other processes of the technology described herein.
  • the communication module 1111 is used to support the interaction between the image material processing device based on multiple devices and other devices.
  • the image material processing device based on multiple devices may also include a storage module 1113, which is used to store program codes and data of the image material processing device based on multiple devices.
  • the processing module 1112 can be a processor or a controller, for example, a central processing unit (CPU), a general processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component or any combination thereof. It can implement or execute various exemplary logic blocks, modules and circuits described in conjunction with the disclosure of this application.
  • the processor can also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of DSP and microprocessors, and the like.
  • the communication module 1111 can be a transceiver, an RF circuit or a communication interface, and the like.
  • the storage module 1113 can be a memory.
  • the above multi-device based image material processing apparatus 111 can execute the multi-device based image material processing method shown in FIG.
  • the image material processing device 120 based on multiple devices includes: a first response unit 1201, which is used to jump to an image material selection page in response to a click operation on a target control in a target application, and the image material selection page includes an image material display area and a highlight moment generation control, and the image material display area is used to display multiple image materials; a second response unit 1202, which is used to respond to a selection operation on at least two image materials among the multiple image materials, mark the at least two selected target image materials in the image material display area, and display the image acquisition device storing the target image materials in the source device display area, and the at least two target image materials do not come from the same image acquisition device; a third response unit 1203, which is used to respond to a click operation on the highlight moment generation control and display a highlight moment generation screen.
  • the image material selection page also includes a selection control, and in terms of marking the at least two selected target image materials in the image material display area in response to a selection operation on at least two image materials among the multiple image materials, the second response unit 1202 is specifically used to: in response to a click operation on the selection control, mark the multiple image materials as selectable materials; in response to a selection operation on at least two image materials among the selectable materials, mark the at least two selected target image materials in the image material display area.
  • in response to a selection operation on at least two image materials among the multiple image materials, in terms of marking the at least two selected target image materials in the image material display area, the second response unit 1202 is specifically used to: in response to a long press operation on any one of the multiple image materials, mark the image material corresponding to the long press operation as the target image material, and mark the image materials among the multiple image materials other than the image material corresponding to the long press operation as selectable materials; in response to a selection operation on at least one image material among the selectable materials, mark the at least two selected target image materials in the image material display area.
  • if at least one image material among the selectable materials is a single image material, the selection operation for the at least one image material among the selectable materials includes: a click operation on the single image material; if at least one image material among the selectable materials is multiple image materials, the selection operation for the at least one image material among the selectable materials includes: multiple click operations on the multiple image materials among the selectable materials; or, a sliding operation on multiple image materials in adjacent positions among the selectable materials.
  • the image material selection page further includes selection prompt information, and the selection prompt information is used to indicate the total number of the target image materials.
  • the highlight moment generation screen includes a progress bar for indicating a generation progress of the highlight moment generation according to the at least two target image materials.
  • the highlight moment generation screen also includes generation stage prompt information, and the generation stage prompt information is used to indicate the generation stage of highlight moment generation based on the at least two target image materials, and the generation stage includes a highlight algorithm deployment stage, a highlight algorithm execution stage and a highlight moment generation stage.
  • the highlight moment generation screen also includes an icon of a target device for executing the highlight algorithm, the target device being one or more of the image acquisition devices corresponding to the target image materials; when the generation stage is the highlight algorithm execution stage and the target device starts to execute the highlight algorithm, the icon of the target device is highlighted until the target device completes execution.
  • the highlight moment generation screen further includes a cancel control, and the cancel control is used to instruct to cancel the currently ongoing highlight moment generation process.
  • the multi-device-based image material processing apparatus 120 is further used to play the generated highlight moment clip when the progress bar is loaded.
  • the multi-device-based image material processing device 120 is specifically used to: when the progress bar is loaded, jump to the highlight moment display page, the highlight moment display page at least includes the generated highlight moment clip; play the generated highlight moment clip on the highlight moment display page.
  • the highlight moment display page includes a highlight moment viewing control and a template viewing control, wherein the highlight moment viewing control is used to indicate a collection of highlight moments historically generated by the electronic device, and the template viewing control is used to indicate a template used by the electronic device when generating highlight moments.
  • the multi-device-based image material processing device 120 is also used to: when the progress bar is loaded, send the generated highlight moment segment to the image playback device that is communicatively connected to the electronic device, so that the image playback device plays the generated highlight moment segment.
  • Figure 12b is a functional unit composition block diagram of another image material processing device based on multiple devices provided in an embodiment of the present application.
  • the image material processing device based on multiple devices 121 includes: a processing module 1212 and a communication module 1211.
  • the processing module 1212 is used to control and manage the actions of the image material processing device based on multiple devices, for example, to execute the steps of the first response unit 1201, the second response unit 1202 and the third response unit 1203, and/or to execute other processes of the technology described herein.
  • the communication module 1211 is used to support communication between the image material processing device based on multiple devices and other devices.
  • the apparatus for processing image materials based on multiple devices may further include a storage module 1213 , and the storage module 1213 is used to store program codes and data of the apparatus for processing image materials based on multiple devices.
  • the processing module 1212 can be a processor or a controller, for example, a central processing unit (CPU), a general processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component or any combination thereof. It can implement or execute various exemplary logic blocks, modules and circuits described in conjunction with the disclosure of this application.
  • the processor can also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of DSP and microprocessors, and the like.
  • the communication module 1211 can be a transceiver, an RF circuit or a communication interface, and the like.
  • the storage module 1213 can be a memory.
  • the above multi-device based image material processing apparatus 121 can execute the multi-device based image material processing method shown in FIG. 5.
  • Figure 13a is a functional unit composition block diagram of another image material processing device based on multiple devices provided in an embodiment of the present application. The device is applied to the electronic device 11 shown in Figure 1a.
  • the image material processing device 130 based on multiple devices includes: an acquisition unit 1301, used to acquire multiple image materials from multiple image acquisition devices, the multiple image materials including at least two target image materials, the target image materials being the image materials selected by the user for generating highlight moment segments; a grouping unit 1302, used to group the at least two target image materials according to the image acquisition device to obtain at least two target image material groups, the at least two target image materials not coming from the same image acquisition device; an execution unit 1303, used to perform the following operations on the at least two target image material groups: determine the target image acquisition device corresponding to the target image material group currently being processed; determine the real storage path of each target image material in the target image material group currently being processed in the target image acquisition device; send the real storage path to the target image acquisition device, so that the target image acquisition device executes the highlight algorithm according to the real storage path to obtain a highlight moment sub-segment; a generation unit 1304, used to generate a highlight moment segment according to the at least two highlight moment sub-segments; and a playback unit 1305, used to play the highlight moment segment.
  • the acquisition unit 1301 is specifically used to: receive metadata information of each of the multiple image materials sent by the multiple image acquisition devices, the metadata information of each image material including the file name of each image material, the material type of each image material and the image acquisition device corresponding to each image material, the material type including picture material and video material; generate the multiple image materials and multiple local storage paths corresponding to the multiple image materials according to the metadata information, the local storage path refers to the storage path of the image material in the electronic device.
  • the execution unit 1303 is specifically used to: determine the actual storage path of each target image material in the currently processed target image material group in the target image acquisition device according to the local storage path corresponding to each target image material in the currently processed target image material group and the metadata information of each target image material.
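  • The flow through the acquisition unit 1301, the grouping unit 1302 and the path resolution performed by the execution unit 1303 could be sketched as follows. The field names, the /local/materials/ proxy prefix and the DCIM layout are assumptions; only the overall flow follows the description above.

```kotlin
// Illustrative sketch of the acquisition, grouping and path-resolution steps.
data class MaterialMetadata(
    val fileName: String,
    val materialType: String,   // "picture" or "video"
    val deviceId: String        // image acquisition device that stores the material
)

data class Material(
    val metadata: MaterialMetadata,
    val localPath: String       // storage path of the proxy entry on the electronic device
)

// Acquisition unit: build local proxy entries from the received metadata.
fun buildLocalEntries(metadata: List<MaterialMetadata>): List<Material> =
    metadata.map { Material(it, localPath = "/local/materials/${it.fileName}") }

// Grouping unit: group the selected target materials by acquisition device.
fun groupByDevice(targets: List<Material>): Map<String, List<Material>> =
    targets.groupBy { it.metadata.deviceId }

// Execution unit: derive the real storage path on the device from the metadata.
fun toRealPath(material: Material): String =
    "/storage/emulated/0/DCIM/${material.metadata.fileName}" // hypothetical DCIM layout
```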
  • Figure 13b is a functional unit composition block diagram of another image material processing device based on multiple devices provided in an embodiment of the present application.
  • the image material processing device 131 based on multiple devices includes: a processing module 1312 and a communication module 1311.
  • the processing module 1312 is used to control and manage the actions of the image material processing device based on multiple devices, for example, to execute the steps of the acquisition unit 1301, the grouping unit 1302, the execution unit 1303, the generation unit 1304 and the playback unit 1305, and/or to perform other processes of the technology described herein.
  • the communication module 1311 is used to support the interaction between the image material processing device based on multiple devices and other devices.
  • the image material processing device based on multiple devices may also include a storage module 1313, which is used to store program codes and data of the image material processing device based on multiple devices.
  • the processing module 1312 may be a processor or a controller, such as a central processing unit (CPU), a general purpose processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component or any combination thereof. It may implement or execute the various exemplary logic blocks, modules and circuits described in conjunction with the disclosure of this application.
  • the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a DSP and a microprocessor, etc.
  • the communication module 1311 may be a transceiver, an RF circuit or a communication interface, etc.
  • the storage module 1313 may be a memory.
  • the above multi-device based image material processing apparatus 131 can execute the multi-device based image material processing method shown in FIG. 10 .
  • the above embodiments can be implemented in whole or in part by software, hardware, firmware or any combination thereof.
  • the above embodiments can be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions or computer programs. When the computer instructions or computer programs are loaded or executed on a computer, the process or function described in the embodiment of the present application is generated in whole or in part.
  • the computer can be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
  • the computer instructions can be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions can be transmitted from one website site, computer, server or data center to another website site, computer, server or data center by wired or wireless means.
  • the computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center that contains one or more available media.
  • the available medium can be a magnetic medium (e.g., a floppy disk, a hard disk, a tape), an optical medium (e.g., a DVD), or a semiconductor medium.
  • the semiconductor medium can be a solid-state drive.
  • Figure 14 is a block diagram of an electronic device provided in an embodiment of the present application.
  • the electronic device 1400 may include one or more of the following components: a processor 1401, a memory 1402 coupled to the processor 1401, wherein the memory 1402 may store one or more computer programs, and the one or more computer programs may be configured to implement the methods described in the above embodiments when executed by one or more processors 1401.
  • the electronic device 1400 may be the electronic device 11 in the above embodiments.
  • the processor 1401 may include one or more processing cores.
  • the processor 1401 uses various interfaces and lines to connect various parts of the entire electronic device 1400, and executes various functions and processes data of the electronic device 1400 by running or executing instructions, programs, code sets or instruction sets stored in the memory 1402, and calling data stored in the memory 1402.
  • the processor 1401 can be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), and programmable logic array (Programmable Logic Array, PLA).
  • the processor 1401 can integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), and a modem.
  • the CPU mainly processes the operating system, user interface, and application programs; the GPU is responsible for rendering and drawing display content; and the modem is used to process wireless communications. It is understandable that the above-mentioned modem may not be integrated into the processor 1401, and may be implemented separately through a communication chip.
  • the memory 1402 may include a random access memory (RAM) or a read-only memory (ROM).
  • the memory 1402 may be used to store instructions, programs, codes, code sets or instruction sets.
  • the memory 1402 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, an image playback function, etc.), instructions for implementing the above-mentioned various method embodiments, etc.
  • the data storage area may also store data created by the electronic device 1400 during use, etc.
  • the electronic device 1400 may include more or fewer structural elements than those in the above structural block diagram, which is not limited here.
  • An embodiment of the present application also provides a computer storage medium, on which a computer program/instruction is stored.
  • An embodiment of the present application also provides a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute part or all of the steps of any method recorded in the above method embodiments.
  • the magnitude of the sequence numbers of the above-mentioned processes does not imply their order of execution.
  • the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the disclosed methods, devices and systems can be implemented in other ways.
  • the device embodiments described above are only illustrative; for example, the division of the units is only a logical function division, and there may be other division methods in actual implementation; for example, multiple units or components can be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, which may be electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may be physically included separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit may be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the above-mentioned integrated unit implemented in the form of a software functional unit can be stored in a computer-readable storage medium.
  • the above-mentioned software functional unit is stored in a storage medium, including a number of instructions for a computer device (which can be a personal computer, a server, or a network device, etc.) to perform some steps of the method described in each embodiment of the present invention.
  • the aforementioned storage medium includes: a USB flash drive, a mobile hard disk, a magnetic disk, an optical disk, a volatile memory or a non-volatile memory.
  • the non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM) or a flash memory.
  • the volatile memory can be a random access memory (RAM), which is used as an external cache.
  • RAM random access memory
  • many forms of random access memory (RAM) are available, such as static RAM (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Sync Link DRAM (SLDRAM) and direct Rambus RAM (DR RAM), among other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present application relates to a multi-device-based image material processing method and a related apparatus. The method comprises: in response to a click operation on a target control in a target application, jumping to an image material selection page, the image material selection page containing an image material display area and a highlight moment generation control, the image material display area being used to display multiple image materials from multiple image acquisition devices; in response to a selection operation on at least two image materials among the multiple image materials, marking the at least two selected target image materials in the image material display area; and, in response to a click operation on the highlight moment generation control, displaying a highlight moment generation screen. A user can thus view and select, on a single device and across devices, image materials from multiple devices, which greatly improves processing efficiency and reduces production time, thereby improving the user experience.
PCT/CN2023/119787 2022-11-18 2023-09-19 Procédé de traitement de contenus d'images sur la base de multiples dispositifs et appareil associé WO2024103958A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211445958.2 2022-11-18
CN202211445958.2A CN115904168A (zh) 2022-11-18 2022-11-18 基于多设备的影像素材处理方法及相关装置

Publications (1)

Publication Number Publication Date
WO2024103958A1 true WO2024103958A1 (fr) 2024-05-23

Family

ID=86495877

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/119787 WO2024103958A1 (fr) 2022-11-18 2023-09-19 Procédé de traitement de contenus d'images sur la base de multiples dispositifs et appareil associé

Country Status (2)

Country Link
CN (1) CN115904168A (fr)
WO (1) WO2024103958A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115904168A (zh) * 2022-11-18 2023-04-04 Oppo广东移动通信有限公司 基于多设备的影像素材处理方法及相关装置

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1645902A (zh) * 2004-01-20 2005-07-27 富士施乐株式会社 图像处理设备、图像处理方法及其程序产品
CN111382697A (zh) * 2020-03-09 2020-07-07 中国铁塔股份有限公司 影像数据处理方法及第一电子设备
CN112672200A (zh) * 2020-12-14 2021-04-16 完美世界征奇(上海)多媒体科技有限公司 视频生成方法和装置、电子设备和存储介质
CN112738403A (zh) * 2020-12-30 2021-04-30 维沃移动通信(杭州)有限公司 拍摄方法、拍摄装置、电子设备和介质
CN113259601A (zh) * 2020-02-11 2021-08-13 北京字节跳动网络技术有限公司 视频处理方法、装置、可读介质和电子设备
CN113420247A (zh) * 2021-06-23 2021-09-21 北京字跳网络技术有限公司 页面展示方法、装置、电子设备、存储介质及程序产品
CN113473204A (zh) * 2021-05-31 2021-10-01 北京达佳互联信息技术有限公司 一种信息展示方法、装置、电子设备及存储介质
CN114070993A (zh) * 2020-07-29 2022-02-18 华为技术有限公司 摄像方法、装置、摄像设备及可读存储介质
CN114297437A (zh) * 2021-12-29 2022-04-08 重庆紫光华山智安科技有限公司 一种基于图像聚档的档案展示方法、装置及设备
CN114745505A (zh) * 2022-04-28 2022-07-12 维沃移动通信有限公司 拍摄方法、装置、电子设备和可读存储介质
CN115171014A (zh) * 2022-06-30 2022-10-11 腾讯科技(深圳)有限公司 视频处理方法、装置、电子设备及计算机可读存储介质
CN115904168A (zh) * 2022-11-18 2023-04-04 Oppo广东移动通信有限公司 基于多设备的影像素材处理方法及相关装置

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1645902A (zh) * 2004-01-20 2005-07-27 富士施乐株式会社 图像处理设备、图像处理方法及其程序产品
CN113259601A (zh) * 2020-02-11 2021-08-13 北京字节跳动网络技术有限公司 视频处理方法、装置、可读介质和电子设备
CN111382697A (zh) * 2020-03-09 2020-07-07 中国铁塔股份有限公司 影像数据处理方法及第一电子设备
CN114070993A (zh) * 2020-07-29 2022-02-18 华为技术有限公司 摄像方法、装置、摄像设备及可读存储介质
CN112672200A (zh) * 2020-12-14 2021-04-16 完美世界征奇(上海)多媒体科技有限公司 视频生成方法和装置、电子设备和存储介质
CN112738403A (zh) * 2020-12-30 2021-04-30 维沃移动通信(杭州)有限公司 拍摄方法、拍摄装置、电子设备和介质
CN113473204A (zh) * 2021-05-31 2021-10-01 北京达佳互联信息技术有限公司 一种信息展示方法、装置、电子设备及存储介质
CN113420247A (zh) * 2021-06-23 2021-09-21 北京字跳网络技术有限公司 页面展示方法、装置、电子设备、存储介质及程序产品
CN114297437A (zh) * 2021-12-29 2022-04-08 重庆紫光华山智安科技有限公司 一种基于图像聚档的档案展示方法、装置及设备
CN114745505A (zh) * 2022-04-28 2022-07-12 维沃移动通信有限公司 拍摄方法、装置、电子设备和可读存储介质
CN115171014A (zh) * 2022-06-30 2022-10-11 腾讯科技(深圳)有限公司 视频处理方法、装置、电子设备及计算机可读存储介质
CN115904168A (zh) * 2022-11-18 2023-04-04 Oppo广东移动通信有限公司 基于多设备的影像素材处理方法及相关装置

Also Published As

Publication number Publication date
CN115904168A (zh) 2023-04-04

Similar Documents

Publication Publication Date Title
CN108900902B (zh) 确定视频背景音乐的方法、装置、终端设备及存储介质
WO2020015333A1 (fr) Procédé et appareil de prise de vue vidéo, dispositif terminal et support d'informations
WO2021258821A1 (fr) Procédé et dispositif d'édition vidéo, terminal et support de stockage
WO2017173781A1 (fr) Procédé et dispositif de capture de trame vidéo
US20230168805A1 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
KR102339205B1 (ko) 가상 장면 디스플레이 방법 및 디바이스, 및 저장 매체
US8941615B2 (en) Information processing apparatus, information processing method, and program
WO2024103958A1 (fr) Procédé de traitement de contenus d'images sur la base de multiples dispositifs et appareil associé
WO2020062683A1 (fr) Procédé et dispositif d'acquisition vidéo, terminal et support
CN111935505B (zh) 视频封面生成方法、装置、设备及存储介质
CN107291329A (zh) 一种屏幕呈现动态壁纸的方法和装置
CN109286836B (zh) 多媒体数据处理方法、装置及智能终端、存储介质
CN112887794B (zh) 视频剪辑方法及装置
JP2021506196A (ja) ビデオ処理方法、装置、端末および媒体
US20180005618A1 (en) Information processing method, terminal device and computer storage medium
CN109379623A (zh) 视频内容生成方法、装置、计算机设备和存储介质
CN112083915A (zh) 页面布局方法、装置、电子设备及存储介质
US11941728B2 (en) Previewing method and apparatus for effect application, and device, and storage medium
CN108241672A (zh) 一种在线展示演示文稿的方法和装置
JP6265659B2 (ja) 情報処理装置およびその制御方法およびプログラム
CN111158573B (zh) 基于图片构架的车机交互方法、系统、介质及设备
CN106961624B (zh) 一种视频切换方法、装置及系统
CN108984263B (zh) 视频显示方法和装置
WO2024099280A1 (fr) Procédé et appareil de montage vidéo, dispositif électronique et support d'enregistrement
WO2024041514A1 (fr) Procédé et appareil de lecture vidéo et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23890395

Country of ref document: EP

Kind code of ref document: A1