CN109068063B - Three-dimensional image data processing and displaying method and device and mobile terminal


Info

Publication number
CN109068063B
CN109068063B (granted publication of application CN201811103389.7A; application publication CN109068063A)
Authority
CN
China
Prior art keywords
target material
preview interface
receiving
user
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811103389.7A
Other languages
Chinese (zh)
Other versions
CN109068063A (en)
Inventor
罗桂钊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201811103389.7A priority Critical patent/CN109068063B/en
Publication of CN109068063A publication Critical patent/CN109068063A/en
Application granted granted Critical
Publication of CN109068063B publication Critical patent/CN109068063B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the present application discloses a method and an apparatus for processing and displaying three-dimensional image data, and a mobile terminal. The method for processing three-dimensional image data includes: receiving a selection operation of a user on a target material in a shooting preview interface, the target material being one or more materials in a three-dimensional image displayed in the shooting preview interface; determining a receiving object of the target material; and sending the data of the target material to the receiving object. By this method, real-time matting and picture splicing can be realized, and user experience is improved.

Description

Three-dimensional image data processing and displaying method and device and mobile terminal
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for processing three-dimensional image data, and a mobile terminal.
Background
The development of computer technology has promoted the development of image processing technology. The matting function in image processing is now commonly applied in people's daily life and work: combined with a terminal device, it allows a user to cut out a part of interest from an image or video on the device and reuse it in other images or videos.
Generally, matting is applied to planar images, and the conventional matting method is usually implemented based on color information. To cut out a certain element, a corresponding image first needs to be obtained (for example, by shooting a photo), and static matting is then realized by selecting a fixed region from the obtained image.
However, with the widespread application of the matting technique, the traditional matting method no longer satisfies users' requirements. The above method requires the user to first obtain an image (for example, by shooting one), and the obtained image is stored on the terminal, occupying its storage space. Often the user does not want to keep the image once matting is complete, and deleting it requires additional operations; as a result, a large number of unwanted images remain on the terminal, occupying terminal resources and making the user experience poor. For this reason, a scheme is needed that can realize fast, real-time matting while reducing the resource occupation of the terminal.
Disclosure of Invention
The embodiment of the present application aims to provide a method and an apparatus for processing and displaying three-dimensional image data, and a mobile terminal, so as to solve the problem that planar image matting may leave a large number of images on the terminal and occupy terminal resources.
In order to solve the above technical problem, the embodiment of the present application is implemented as follows:
in a first aspect, a method for processing three-dimensional image data provided in an embodiment of the present application includes:
receiving selection operation of a user on target materials in a shooting preview interface, wherein the target materials are one or more materials in a three-dimensional image displayed in the shooting preview interface;
determining a receiving object of the target material;
and sending the data of the target material to the receiving object.
Optionally, the receiving a selection operation of the user on the target material in the shooting preview interface includes:
receiving the selection operation of a user on the material in a shooting preview interface;
and determining the target material selected by the selection operation according to the color information and the depth information of the pixel point corresponding to the selection operation.
Optionally, the sending the data of the target material to the receiving object includes:
and sending the real-time data stream of the target material captured in the shooting preview interface to the receiving object.
Optionally, after the receiving of the selection operation of the target material by the user in the shooting preview interface, the method further includes:
and reserving the color information of the target material, and carrying out gray processing on the images except the target material in the three-dimensional image.
Optionally, the determining the receiving object of the target material includes:
displaying a sending option;
when receiving the selection operation of the user on the sending option, acquiring a candidate receiving object of the target material;
and determining a receiving object of the target material from the candidate receiving objects.
Optionally, after the sending the data of the target material to the receiving object, the method further includes:
displaying a cancel sending option;
and when the selection operation of the user on the cancel sending option is received, stopping sending the data of the target material to the receiving object.
Optionally, after the displaying the cancel sending option, the method further includes:
and highlighting the cancel sending option correspondingly according to whether the receiving object has received the data of the target material.
In a second aspect, a method for displaying three-dimensional image data provided in an embodiment of the present application is applied to a second terminal, and the method includes:
receiving data of a target material sent by a first terminal;
and under the condition that a shooting preview interface is included in a display interface of the second terminal, displaying the target material in the shooting preview interface of the second terminal.
Optionally, the received data of the target material is a real-time data stream of the target material captured by the first terminal,
the displaying the target material in a shooting preview interface of the second terminal includes:
and displaying the target material in a shooting preview interface of the second terminal based on the real-time data stream of the target material, wherein the target material in the shooting preview interface of the second terminal is synchronously displayed with the target material in the shooting preview interface of the first terminal.
Optionally, the method further comprises:
and when an input operation of a user on the target material is received on a shooting preview interface of the second terminal, processing the target material according to the input operation, wherein the input operation includes a drag operation, an enlarge operation, or a reduce operation.
In a third aspect, an embodiment of the present application provides an apparatus for processing three-dimensional image data, where the apparatus includes:
a receiving module, configured to receive a selection operation of a user on a target material in a shooting preview interface, wherein the target material is one or more materials in a three-dimensional image displayed in the shooting preview interface;
the determining module is used for determining a receiving object of the target material;
and the sending module is used for sending the data of the target material to the receiving object.
Optionally, the receiving module includes:
the receiving unit is used for receiving the selection operation of the user on the material in the shooting preview interface;
and the selecting unit is used for determining the target material selected by the selecting operation according to the color information and the depth information of the pixel point corresponding to the selecting operation.
Optionally, the sending module is configured to send the real-time data stream of the target material captured in the shooting preview interface to the receiving object.
Optionally, the apparatus further comprises:
and the gray processing module is used for reserving the color information of the target material and carrying out gray processing on the images except the target material in the three-dimensional image.
Optionally, the determining module includes:
a display unit for displaying the sending option;
the candidate object determining unit is used for acquiring a candidate receiving object of the target material when the selection operation of the user on the sending option is received;
a receiving object determining unit configured to determine a receiving object of the target material from the candidate receiving objects.
Optionally, the apparatus further comprises:
the second display module is used for displaying a sending canceling option;
and the canceling module is used for stopping sending the data of the target material to the receiving object when receiving the selection operation of the user on the cancel sending option.
Optionally, the apparatus further comprises:
and the highlight display module is configured to highlight the cancel sending option correspondingly according to whether the receiving object has received the data of the target material.
In a fourth aspect, an embodiment of the present application provides an apparatus for displaying three-dimensional image data, the apparatus including:
the acquisition module is used for receiving data of a target material sent by a first terminal;
and the display module is used for displaying the target material in the shooting preview interface of the device under the condition that the display interface of the device comprises the shooting preview interface.
Optionally, the received data of the target material is a real-time data stream of the target material captured by the first terminal, and the display module is configured to display the target material in a shooting preview interface of the device based on the real-time data stream of the target material, where the target material in the shooting preview interface of the device and the target material in the shooting preview interface of the first terminal are displayed synchronously.
Optionally, the apparatus further comprises:
and the management module is configured to, when an input operation of a user on the target material is received on a shooting preview interface of the apparatus, process the target material according to the input operation, wherein the input operation includes a drag operation, an enlarge operation, or a reduce operation.
In a fifth aspect, an embodiment of the present application provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the method for processing three-dimensional image data provided in the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when executed by a processor, the computer program implements the steps of the method for processing three-dimensional image data provided in the first aspect.
In a seventh aspect, an embodiment of the present application provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the method for displaying three-dimensional image data provided in the second aspect.
In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the method for displaying three-dimensional image data provided in the second aspect.
According to the technical solution provided by the embodiment of the present application, a selection operation of a user on a target material in a shooting preview interface is received, the target material being one or more materials in a three-dimensional image displayed in the shooting preview interface; a receiving object of the target material is then determined; and finally the data of the target material is sent to the receiving object. In this way, the user can select the target material to be cut out directly in the shooting preview interface, extract it, and share it, realizing accurate, fast, real-time matting of images. Moreover, because the matting process is performed in the shooting preview interface, no image needs to be captured on the terminal, let alone stored, which reduces the terminal's resource occupation. Meanwhile, after the receiving end receives the target material, it can project the target material into its local shooting preview interface, meeting the user's need for real-time picture splicing and improving the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present specification or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments described in the present specification, and that those skilled in the art can obtain other drawings from these drawings without any creative effort.
FIG. 1 is a flowchart illustrating an embodiment of a method for processing three-dimensional image data according to the present application;
FIG. 2 is a flowchart illustrating an embodiment of a method for displaying three-dimensional image data according to the present disclosure;
FIG. 3 is a flowchart illustrating an embodiment of a method for processing three-dimensional image data according to the present application;
FIG. 4 is a schematic diagram of a three-dimensional image data synchronous display according to the present application;
FIG. 5 is a schematic structural diagram of a three-dimensional image data processing apparatus according to the present application;
FIG. 6 is a schematic structural diagram of a display device for three-dimensional image data according to the present application;
fig. 7 is a schematic structural diagram of a mobile terminal according to the present application.
Detailed Description
The embodiment of the application provides a method and a device for processing and displaying three-dimensional image data and a mobile terminal.
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only a part of the embodiments of the present specification, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
Example one
As shown in fig. 1, the execution body of the method may be a terminal device or a server. The terminal device may be a device such as a personal computer, or a mobile terminal device such as a mobile phone or a tablet computer, and may be the terminal device used by the user. The method can be used to cut out corresponding materials from a three-dimensional image and share them with a receiving object, among other processing. The method may specifically comprise the following steps:
in step S102, a selection operation of a user on a target material in the shooting preview interface is received, the target material being one or more materials in a three-dimensional image displayed in the shooting preview interface.
The three-dimensional image can be an image acquired by a user in real time through a camera when the user uses the 3D camera to shoot, and the terminal equipment can display the three-dimensional image acquired in real time in a shooting preview interface so that the user can preview the three-dimensional image. The selection operation by the user may be selection at any point in the three-dimensional image, or may be determination of a material region to be selected by a plurality of selection operations.
In implementation, the development of computer technology has promoted the development of image processing technology, and the matting function is commonly applied in people's daily life and work: combined with a terminal device, it allows a user to extract a part of interest from an image or video on the device and use it in other images or videos. However, with the widespread use of matting techniques, the demands on them are increasing. Because the color information of a traditional planar image is limited, only static matting can be realized, by selecting a fixed region after the image has been acquired. At the present stage, users have higher requirements for real-time matting and picture splicing; traditional planar images, lacking real-time capability, cannot be applied to real-time social communication applications, fail to meet user needs, and result in a poor user experience. Therefore, the embodiment of the present application provides a technical solution to solve the above problems, as described below.
Taking a user's selection of a material in captured three-dimensional image data as an example: the user can shoot a three-dimensional image with a 3D camera and obtain depth and RGB three-dimensional image data (i.e., the three-dimensional image) with high resolution, high precision, and low latency. While shooting, the terminal device collects the three-dimensional image in real time through the 3D camera and displays it in the shooting preview interface, where the user can select elements of the displayed three-dimensional image through a selection operation. For example, if the three-dimensional image displayed in the shooting preview interface contains three materials, namely a ball A, a cube B, and a cylinder C, the user can select any one or more of them.
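The patent describes the preview frame only in prose. As a minimal illustrative sketch (the type and field names here are hypothetical, not from the patent), a depth-plus-RGB preview frame could be modeled as a grid of pixels, each carrying both color and depth information:

```python
from dataclasses import dataclass


@dataclass
class Pixel:
    r: int        # red component, 0-255
    g: int        # green component, 0-255
    b: int        # blue component, 0-255
    depth: float  # distance from the 3D camera, e.g. in metres


@dataclass
class PreviewFrame:
    width: int
    height: int
    pixels: list  # row-major list of Pixel, len == width * height

    def pixel_at(self, x: int, y: int) -> Pixel:
        # Look up the pixel at column x, row y.
        return self.pixels[y * self.width + x]
```

A selection operation at screen coordinates (x, y) would then give access to both the color and the depth of the tapped pixel, which step S304 below uses to decide which material was selected.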
In step S104, a reception target of the target material is determined.
The receiving object of the target material may be a contact created by the user, or a contact acquired by the user from the terminal device, or a contact in a real-time communication application installed in the terminal device by the user, or a certain sharing page, and the like, which is not limited in the present application.
In implementation, a user may create contact information of one or more receiving objects in advance, and may use the created one or more receiving objects as a default sending object. When the target material selected by the selection operation is determined, the default sending object can be obtained, and the data of the target material can be sent to the receiving object.
In step S106, the data of the target material is transmitted to the receiving object.
In implementation, the data of the target material can be sent directly to the receiving object, or a connection between the first terminal and the second terminal can be established to transmit the data stream of the target material in real time. For example, suppose a first user selects the ball A and the cube B in the shooting interface of the mobile terminal as target materials and needs to send the two materials to a second user. The first user can send all the data of the ball A and the cube B, including the color information and depth information of the two objects, to the second user; alternatively, the first user can initiate a sharing connection and send a real-time data stream of the ball A and the cube B. The specific sending method may differ according to the actual application scenario, which is not limited in this application.
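The application leaves the transport and payload format unspecified. As a hedged sketch (the function names and the JSON layout are assumptions for illustration only), the target material's per-pixel color and depth data might be packaged into a message and handed to whatever transport carries it to the receiving object, whether as a one-shot send or as one frame of a real-time stream:

```python
import json


def serialize_target_material(material_pixels):
    """Pack the selected pixels' color and depth info into a JSON payload.

    material_pixels: list of dicts like
        {"x": 0, "y": 0, "r": 10, "g": 20, "b": 30, "depth": 1.5}
    """
    return json.dumps({"type": "target_material", "pixels": material_pixels})


def send_to_receiver(payload, send_fn):
    # send_fn abstracts the transport (direct message or one frame of a
    # real-time sharing connection); the patent does not fix a transport.
    send_fn(payload)
```

Because each pixel carries depth as well as color, the receiving terminal has enough information to place the material in its own three-dimensional preview rather than as a flat overlay.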
The embodiment of the present application provides a method for processing three-dimensional image data: receiving a selection operation of a user on a target material in a shooting preview interface, the target material being one or more materials in a three-dimensional image displayed in the shooting preview interface; determining a receiving object of the target material; and finally sending the data of the target material to the receiving object. In this way, the user can select the target material to be cut out directly in the shooting preview interface, extract it, and share it, realizing accurate, fast, real-time matting of images. Moreover, because the matting process is performed in the shooting preview interface, no image needs to be captured on the terminal, let alone stored, which reduces the terminal's resource occupation and improves the user experience.
Example two
As shown in fig. 2, an execution body of the method may be a terminal device or a server, where the terminal device may be a device such as a personal computer, or may also be a mobile terminal device such as a mobile phone and a tablet computer, and the terminal device may be a terminal device used by a user. The method can display the target material data in the three-dimensional image in the shooting preview interface after receiving the target material data. In order to distinguish from a first terminal appearing later, the execution subject of the embodiment is described as a second terminal, and the method specifically includes the following steps:
in step S202, data of the target material transmitted by the first terminal is received.
In implementation, when the data of the target material sent by the first terminal is received, a notification message may be generated. The notification message may include the user information of the sender (i.e., the first terminal) and a thumbnail of the target material; alternatively, when the receiver enters the camera application, the thumbnail of the target material is displayed in the shooting preview interface. The user confirms reception of the data of the target material sent by the first terminal through a selection operation on the notification or the thumbnail.
In step S204, in a case where the shooting preview interface is included in the display interface of the second terminal, the target material is displayed in the shooting preview interface of the second terminal.
In implementation, when the data of the target material from the first terminal is received, a prompt box containing accept and reject options can be generated at the second terminal; when a click operation of the user on the accept button is received, the camera application is opened and the target material is displayed in the shooting preview interface of the second terminal. During display, a thumbnail of the target material, or a button labeled "end sharing", can be shown at any of the four corners of the shooting preview interface. When a long-press operation on the thumbnail, or a click on the "end sharing" button, is received, a dialog box asking whether the material shared by the friend is no longer needed, with "yes" and "no" options, can pop up; when a selection operation of the user on the "yes" option is received, the target material stops being displayed in the shooting preview interface of the second terminal.
The embodiment of the present application provides a method for displaying three-dimensional image data: data of a target material sent by a first terminal is displayed in a shooting preview interface of a second terminal, provided the display interface of the second terminal includes a shooting preview interface. In this way, after receiving the target material, the second terminal can project it into its local shooting preview interface, meeting the user's need for real-time picture splicing and improving the user experience.
EXAMPLE III
As shown in fig. 3, the execution body of the method may be a terminal device or a server. The terminal device may be a device such as a personal computer, or a mobile terminal device such as a mobile phone or a tablet computer, and may be the terminal device used by the user. The method can be used to cut out corresponding materials from a three-dimensional image and share them with a receiving object, among other processing. In this embodiment, for convenience of description, the device used by the sender is denoted as the first terminal, and the device used by the receiving object as the second terminal. The method specifically includes the following steps:
in step S302, the first terminal receives a selection operation of a material in the shooting preview interface by the user.
For the specific processing procedure of S302, reference may be made to relevant contents of S102 in the first embodiment, which is not described herein again.
In step S304, the first terminal determines the target material selected by the selection operation according to the color information and the depth information of the pixel point corresponding to the selection operation.
The target material may be composed of a plurality of pixel points, each of which includes corresponding color information and depth information. For example, if material A is composed of 30 pixel points, each pixel point includes its own color information and depth information, and the depth information of the 30 pixel points constitutes the position information of the material.
In implementation, the user can perform a selection operation on the target material in the three-dimensional image, and the target material is determined through the pixel point corresponding to the selection operation. Alternatively, the user can perform a continuous selection operation in the three-dimensional image so that the selection forms a closed figure; the pixel points corresponding to the whole selection operation are extracted, and these pixel points constitute the target material. For example, suppose a three-dimensional image contains a ball A, a cylinder B, and a cube C. When a selection operation of the user on any point of the ball A is received, the color information and depth information of that pixel point, and the color information of the background, are obtained; the selection operation then corresponds to all the pixel points around that pixel point whose color information differs from the background, and the set of these pixel points is the image data of the ball A, so the ball A is the target material selected by the user. Alternatively, the user can trace a continuous selection along the edge of the ball A to form a closed selection track, and the target material, the ball A, is determined by obtaining all the pixel points on and inside the selection track.
It should be noted that when the color information of the background in the three-dimensional image data is uniform and differs strongly from the materials contained in the image data, the materials are easier to extract, and the color information of the extracted target material is more complete.
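Step S304 can be read as a region-growing segmentation seeded at the tapped pixel, where a neighboring pixel joins the target material only when both its color and its depth are close to the seed's. The sketch below is one way this could work; the tolerances, data layout, and function name are illustrative assumptions, not details from the patent:

```python
from collections import deque


def select_material(frame, seed, color_tol=30, depth_tol=0.05):
    """Grow a region from the tapped pixel `seed` = (x, y).

    frame: dict with "width", "height", and "pixels" -- a row-major list
    of (r, g, b, depth) tuples. A 4-connected neighbor joins the region
    when its color and depth are both within tolerance of the seed pixel.
    Returns the list of (x, y) coordinates making up the target material.
    """
    w, h = frame["width"], frame["height"]
    px = frame["pixels"]
    sx, sy = seed
    sr, sg, sb, sd = px[sy * w + sx]

    def similar(p):
        r, g, b, d = p
        return (abs(r - sr) <= color_tol and abs(g - sg) <= color_tol
                and abs(b - sb) <= color_tol and abs(d - sd) <= depth_tol)

    seen = {(sx, sy)}
    queue = deque([(sx, sy)])
    region = []
    while queue:
        x, y = queue.popleft()
        region.append((x, y))
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < w and 0 <= ny < h and (nx, ny) not in seen
                    and similar(px[ny * w + nx])):
                seen.add((nx, ny))
                queue.append((nx, ny))
    return region
```

The depth tolerance is what the planar-image approach lacks: two regions with similar colors but different distances from the camera are kept apart, which is why the patent's selection uses depth information as well as color.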
In step S306, the first terminal retains color information of the target material and performs gradation processing on images other than the target material in the three-dimensional image.
The grayscale processing may be a process of converting a color image into a grayscale image, where the color of each pixel point is determined by a combination of three primary colors, namely red (R), green (G), and blue (B), the component values of the three primary colors are all in the range of 0-255, and the grayscale image may be a special image in which the component values of the three primary colors, namely red (R), green (G), and blue (B), of each pixel point in the image are equal.
In implementation, the color information of the target material may be retained while the rest of the three-dimensional image is subjected to grayscale processing. There are various grayscale methods, such as the component method, the maximum value method, the average value method, and the weighted average method. Taking the maximum value method as an example: suppose the three-dimensional image contains a ball A, a cube B, and a cylinder C, and the ball A is the target material. The color information of the ball A is left unprocessed, while grayscale processing is performed on all other colored content in the three-dimensional image (i.e., the cube B, the cylinder C, the picture background, etc.). That is, the color information of all the pixel points of these elements is obtained, along with the component values of the three primary colors red (R), green (G), and blue (B) in each pixel point; following the maximum value method, the color with the largest component value in each pixel point is selected, and the values of the other two components are set to that largest component value. For example, if the component values of the three primary colors of a pixel point are red (150), green (200), and blue (50), the largest component value is that of green, namely 200, so the component values of red and blue are also set to 200. In this way, corresponding grayscale processing is performed on all color information except that of the ball A.
The above embodiment takes the maximum value method of grayscale processing as an example; in practical applications, different processing methods may be selected for different application scenarios, and this is not specifically limited in this application.
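The maximum value method described above can be sketched as follows. This is a minimal illustration, assuming the image is an H x W x 3 RGB array and the target material is given as a boolean mask; the function and variable names are illustrative, not the patent's API.

```python
import numpy as np

def gray_except_target(image, target_mask):
    """Maximum value method: for every non-target pixel, set R, G, and B
    to the largest of that pixel's three component values."""
    out = image.copy()
    max_component = image.max(axis=2, keepdims=True)  # per-pixel max of R, G, B
    gray = np.repeat(max_component, 3, axis=2)        # equal R = G = B
    out[~target_mask] = gray[~target_mask]            # target keeps its color
    return out

# The example pixel from the text: R=150, G=200, B=50 becomes (200, 200, 200).
img = np.array([[[150, 200, 50]]], dtype=np.uint8)
mask = np.array([[False]])  # not part of the target material
print(gray_except_target(img, mask))  # [[[200 200 200]]]
```

Swapping the body of `gray` for a mean or weighted sum would give the average value or weighted average methods the text also mentions.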
In step S308, the first terminal displays the transmission option.
The sending option may be a button or an icon bearing a "share" label.
In implementation, after the grayscale image preserving only the color information of the target material is obtained in step S306, a sending option is displayed. The sending option may be a button or an icon bearing a "share" or "send" label and may be displayed at any position in the display interface; for example, a button labeled "share" may be displayed in the upper right corner. While the sending option is displayed, if the user selects any point on the grayscale image other than the target material, the color information of all pixel points of the three-dimensional image is restored and the sending option is withdrawn; when a selection operation on a material is received again, the image other than the target material is grayscale-processed and the sending option is displayed once more.
In step S310, when a user selects a sending option, the first terminal obtains a candidate receiving object of the target material.
In implementation, when the user's selection of the sending option is received, the user's contact list is obtained to form the candidate receiving objects of the target material. For example, when a user selects an element in a captured three-dimensional image in a real-time communication application and taps the sending option, the terminal device automatically obtains the user's contact list in that application and displays it on the screen.
In step S312, a reception object of the target material is determined from among the candidate reception objects.
The candidate receiving object may be a contact list created by the user, or a contact list acquired by the user from the terminal device, or a contact in a real-time communication application installed in the terminal device by the user, which is not limited in this application.
In implementation, the user may select one or more of the candidate receiving objects displayed on the screen as the receiving objects. If the user wants to reselect the target material, or the desired receiving object is not among the displayed candidates, the user can tap anywhere on the screen other than the candidate receiving objects to dismiss them. When a selection instruction for a candidate receiving object is received from the user, the corresponding receiving object is obtained.
In step S314, the first terminal transmits data of the target material to the receiving object.
The data of the target material may be data including color information and depth information of all pixel points in the target material, and the number of the receiving objects may be one or more.
In implementation, after the receiving object is determined in step S312, data such as color information and depth information of all pixel points included in the selected target material is sent to the receiving object.
Step S314 may be implemented in various manners. Besides the manner described above, it may, for example, include the following step one.
Step one, sending a real-time data stream of a target material captured in a shooting preview interface to a receiving object.
The real-time data stream of the target material captured in the shooting preview interface can be real-time color information, depth information and the like of a certain material in the preview interface of a three-dimensional image obtained by a 3D camera.
In implementation, when the user captures a target material in the shooting preview interface, the color information and depth information of the target material's pixel points are obtained, along with the color difference between the target material's pixel points and the background pixel points. When the camera position changes, the pixel points belonging to the target material are obtained anew according to that color difference, and the data stream is sent to the receiving object in real time.
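The per-frame streaming described above can be sketched as follows. This is a hedged illustration, assuming each preview frame arrives as an H x W x 3 RGB array plus an H x W depth array; the record layout and the sum-of-absolute-differences color metric are assumptions, not the patent's exact method.

```python
import numpy as np

def target_pixels(frame_rgb, frame_depth, ref_color, color_diff_threshold):
    """Re-identify the target's pixels each frame by comparing every pixel's
    color difference against the reference color captured at selection time."""
    diff = np.abs(frame_rgb.astype(int) - np.asarray(ref_color)).sum(axis=2)
    ys, xs = np.nonzero(diff < color_diff_threshold)
    # One record per target pixel: position, color information, depth information.
    return [((int(y), int(x)), tuple(int(c) for c in frame_rgb[y, x]),
             float(frame_depth[y, x])) for y, x in zip(ys, xs)]

def stream_target(frames, ref_color, threshold, send):
    """Push the target material's pixel records to the receiving object frame by frame."""
    for rgb, depth in frames:
        send(target_pixels(rgb, depth, ref_color, threshold))
```

Here `send` stands in for whatever transport the real-time communication application uses; re-running the segmentation per frame is what lets the stream track the target as the camera moves.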
In step S316, the first terminal displays a cancel transmission option.
The cancel sending option may be a button or an icon bearing a "cancel sharing" label and may be displayed at any position of the display screen.
In step S318, the first terminal performs corresponding highlighting on the cancel transmission option according to whether the receiving object receives the data of the target material.
In implementation, the cancel sending option is highlighted according to whether the receiving object has received the data of the target material. For example, the receiving object may be away from the terminal device, or may not have the real-time communication application open, so that the target material is neither accepted nor rejected; in that case the cancel sending option may be colored yellow to indicate that the receiving object has yet to receive the target material. If the receiving object has received the target material, the cancel sending option is colored green to indicate that the transmission succeeded.
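A small sketch of this status-dependent highlighting: the yellow/green choices follow the example in the text, while the status names and the fallback color are assumptions for illustration only.

```python
# Mapping from the receiving object's status to the cancel sending
# option's highlight color (yellow = waiting, green = received, per the text).
CANCEL_OPTION_COLORS = {
    "waiting":  "yellow",  # receiving object has neither accepted nor rejected
    "received": "green",   # target material received; sending succeeded
}

def cancel_option_color(status):
    """Return the highlight color for the cancel sending option."""
    return CANCEL_OPTION_COLORS.get(status, "gray")  # assumed default for other states
```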
The above embodiments provide an optional and realizable solution, and in a specific implementation process, there may be a plurality of optional and realizable solutions, and the application does not limit the selection of the specific solution.
In step S320, when the first terminal receives a selection operation of the cancel transmission option from the user, the first terminal stops transmitting the data of the target material to the receiving object.
In implementation, after the user's selection of the cancel sending option is received, the transmission of the target material is cancelled, and the process returns to step S306 so that the receiving object may be re-selected.
In step S322, the second terminal receives the data of the target material transmitted by the first terminal.
For the specific processing procedure of S322, reference may be made to the relevant content of S202 in the second embodiment, which is not described herein again.
In step S324, in a case where the shooting preview interface is included in the second terminal display interface, the target material is displayed in the shooting preview interface of the second terminal.
The specific processing procedure of S324 may refer to the related content of S204 in the second embodiment, which is not described herein again.
When the received data of the target material is a real-time data stream of the target material captured by the first terminal, the target material is displayed in the shooting preview interface of the second terminal based on that real-time data stream, and the target material in the shooting preview interface of the second terminal is displayed in synchronization with the target material in the shooting preview interface of the first terminal.
In implementation, when the second terminal acquires the real-time data stream, it displays the target material in its shooting preview interface. When the shooting angle of the first terminal changes, the target material displayed in the shooting preview interface of the second terminal changes accordingly through the transmission of the real-time data stream. For example, as shown in fig. 4, user 1 selects the target material T in the shooting preview interface and sends it to user 2, who displays it in the shooting preview interface after receiving it. When the shooting angle of user 1 changes, the acquired pixel points of the target material T change as well, and through the transmission of the updated real-time data stream, the target material T in user 2's shooting preview interface changes along with the target material T in user 1's shooting preview interface.
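The receiver-side synchronization can be sketched as below: each incoming batch of pixel records fully replaces the previously drawn target, so the second terminal's preview follows the first terminal's changing shooting angle. The record layout is an assumption carried over from the streaming sketch, not the patent's wire format.

```python
def apply_stream_update(preview, records):
    """preview: dict mapping (y, x) -> (color, depth); redrawn on every frame
    so the displayed target material tracks the sender's real-time stream."""
    preview.clear()  # discard the previous frame's target pixels
    for pos, color, depth in records:
        preview[pos] = (color, depth)
    return preview
```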
In step S326, when an input operation on the target material by the user is received at the shooting preview interface of the second terminal, the second terminal processes the target material according to the input operation, where the input operation includes a position-dragging operation and a zoom-in or zoom-out operation.
In implementation, the user can perform input operations on the target material in the shooting preview interface: the user may enlarge the target material by double-tapping it, enlarge or shrink it with a two-finger pinch or spread gesture, or change its position in the shooting preview interface by dragging it with a finger. In practical applications there may be many ways to implement input operations on target materials; the above embodiment provides one optional, realizable operation mode, the specific choice may vary with the application scenario, and the choice of operation mode is not limited in this application.
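The input operations listed above can be illustrated as follows: dragging changes the target material's position, and a two-finger pinch or spread rescales it. The `Material` class and handler signatures are assumptions for illustration, not the patent's API.

```python
from dataclasses import dataclass

@dataclass
class Material:
    x: float = 0.0
    y: float = 0.0
    scale: float = 1.0

def on_drag(material, dx, dy):
    """Position-dragging operation: move the material by the finger delta."""
    material.x += dx
    material.y += dy

def on_pinch(material, start_distance, end_distance):
    """Spreading the fingers (end > start) enlarges; pinching them shrinks."""
    material.scale *= end_distance / start_distance
```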
The embodiment of the application provides a method for processing three-dimensional image data. The method includes receiving a user's selection operation on a target material in a shooting preview interface, the target material being one or more materials in a three-dimensional image displayed in the shooting preview interface; then determining a receiving object of the target material; and finally sending the data of the target material to the receiving object, the target material subsequently being displayed in the shooting preview interface of a second terminal when the display interface of the second terminal includes a shooting preview interface. In this way, the user can select the target material to be extracted directly in the shooting preview interface, and extract (matte) and share it, so that the user quickly and accurately mattes the image in real time. Moreover, because the matting of the image is performed in the shooting preview interface, there is no need to capture an image on the terminal, let alone store the captured image, which reduces the resource occupation of the terminal. Meanwhile, after the receiving end receives the target material, it can project the target material into its local shooting preview interface, meeting the user's need for real-time picture composition and improving user experience.
Example Four
Based on the same idea, and corresponding to the method for processing three-dimensional image data described above, embodiments of the present specification further provide a device for processing three-dimensional image data, as shown in fig. 5.
The processing device of the three-dimensional image data comprises: a receiving module 501, a determining module 502 and a sending module 503, wherein:
the receiving module 501 is configured to receive a selection operation of a user on a target material in a shooting preview interface, where the target material is one or more materials in a three-dimensional image displayed in the shooting preview interface;
a determining module 502, configured to determine a receiving object of the target material;
a sending module 503, configured to send the data of the target material to the receiving object.
In this embodiment of the application, the receiving module 501 includes:
the receiving unit is used for receiving the selection operation of the user on the material in the shooting preview interface;
and the selecting unit is used for determining the target material selected by the selecting operation according to the color information and the depth information of the pixel point corresponding to the selecting operation.
In this embodiment of the application, the sending module 503 is configured to send the real-time data stream of the target material captured in the shooting preview interface to the receiving object.
In an embodiment of the present application, the apparatus further includes:
and the gray processing module is used for reserving the color information of the target material and carrying out gray processing on the images except the target material in the three-dimensional image.
In this embodiment of the application, the determining module 502 includes:
a first display unit for displaying a transmission option;
the candidate object determining unit is used for acquiring a candidate receiving object of the target material when the selection operation of the user on the sending option is received;
a receiving object determining unit configured to determine a receiving object of the target material from the candidate receiving objects.
In an embodiment of the present application, the apparatus further includes:
the second display module is used for displaying a sending canceling option;
and the canceling module is used for stopping sending the data of the target material to the receiving object when receiving the selection operation of the user on the cancel sending option.
In the embodiment of the present application, the apparatus further includes:
and the highlight display module is used for correspondingly highlighting the option of canceling sending according to the condition that whether the receiving object receives the data of the target material.
The embodiment of the application provides a device for processing three-dimensional image data. The device receives a user's selection operation on a target material in a shooting preview interface, determines a receiving object of the target material, and finally sends the data of the target material to the receiving object, the target material being one or more materials in a three-dimensional image displayed in the shooting preview interface. In this way, the user can select the target material to be extracted directly in the shooting preview interface, and extract (matte) and share it, so that the user quickly and accurately mattes the image in real time. Moreover, because the matting of the image is performed in the shooting preview interface, there is no need to capture an image on the terminal, let alone store the captured image, which reduces the resource occupation of the terminal. Meanwhile, after the receiving end receives the target material, it can project the target material into its local shooting preview interface, meeting the user's need for real-time picture composition and improving user experience.
Example Five
Based on the same idea, embodiments of the present specification further provide a display device for three-dimensional image data, as shown in fig. 6.
The display device for three-dimensional image data includes: an obtaining module 601 and a display module 602, wherein:
an obtaining module 601, configured to receive data of a target material sent by a first terminal;
a display module 602, configured to display the target material in a shooting preview interface of the device if the display interface of the device includes the shooting preview interface.
In this embodiment of the application, the received data of the target material is a real-time data stream of the target material captured by the first terminal,
the display module 602 is configured to display the target material in a shooting preview interface of the device based on the real-time data stream of the target material, where the target material in the shooting preview interface of the device and the target material in the shooting preview interface of the first terminal are displayed synchronously.
In an embodiment of the present application, the apparatus further includes:
and the management module is used for processing the target material according to the input operation when the input operation of the target material by a user is received on a shooting preview interface of the device, wherein the input operation comprises a position dragging operation and an enlarging or reducing operation.
The embodiment of the application provides a display device for three-dimensional image data. The device receives data of a target material sent by a first terminal, where the target material is one or more materials in a three-dimensional image selected by a user's selection operation at the first terminal, the selected material being determined from the color information and depth information of the pixel points corresponding to the selection operation. When the display interface includes a shooting preview interface, the target material is displayed in the shooting preview interface. In this way, after receiving the target material, the receiving end can project it into the local shooting preview interface, meeting the user's need for real-time picture composition and improving user experience.
Example Six
Figure 7 is a schematic diagram of a hardware configuration of a mobile terminal implementing various embodiments of the present invention.
the mobile terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 7 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 710 is configured to receive a selection operation of a user on a target material in a shooting preview interface, where the target material is one or more materials in a three-dimensional image displayed in the shooting preview interface;
a processor 710 for determining a receiving object of the target material;
the processor 710 is further configured to send data of the target material to the receiving object.
In addition, the processor 710 is further configured to receive a selection operation of the material in the shooting preview interface by the user;
in addition, the processor 710 is further configured to determine a target material selected by the selection operation according to the color information and the depth information of the pixel point corresponding to the selection operation.
In addition, the processor 710 is further configured to send a real-time data stream of the target material captured in the shooting preview interface to the receiving object.
The processor 710 is further configured to retain color information of the target material, and perform gray processing on images other than the target material in the three-dimensional image.
In addition, the processor 710 is further configured to display a sending option.
In addition, the processor 710 is further configured to obtain a candidate receiving object of the target material when receiving a selection operation of the sending option by the user.
The processor 710 is further configured to determine a receiving object of the target material from the candidate receiving objects.
The processor 710 is further configured to display a cancel send option.
In addition, the processor 710 is further configured to stop sending the data of the target material to the receiving object when a user selects the cancel sending option.
The display unit 706 is configured to highlight the cancel sending option according to whether the receiving object has received the data of the target material.
In addition, the mobile terminal 700 may further perform the following processing procedures:
the processor 710 is further configured to receive data of the target material sent by the first terminal.
In addition, the display unit 706 is further configured to display the target material in the shooting preview interface of the mobile terminal 700 if the shooting preview interface is included in the display interface of the mobile terminal 700.
In addition, the display unit 706 is further configured to display the target material in the shooting preview interface of the mobile terminal 700 based on the real-time data stream of the target material, where the target material in the shooting preview interface of the mobile terminal 700 is displayed in synchronization with the target material in the shooting preview interface of the first terminal.
In addition, the processor 710 is further configured to, when an input operation of the target material by a user is received on the shooting preview interface of the mobile terminal 700, process the target material according to the input operation, where the input operation includes a drag position operation, a zoom-in operation, or a zoom-out operation.
The embodiment of the application provides a mobile terminal. The mobile terminal receives a user's selection operation on a target material in a shooting preview interface, the target material being one or more materials in a three-dimensional image displayed in the shooting preview interface; then determines a receiving object of the target material; and finally sends the data of the target material to the receiving object, the target material being displayed in the shooting preview interface when the subsequent display interface includes a shooting preview interface. In this way, the user can select the target material to be extracted directly in the shooting preview interface, and extract (matte) and share it, so that the user quickly and accurately mattes the image in real time. Moreover, because the matting of the image is performed in the shooting preview interface, there is no need to capture an image on the terminal, let alone store the captured image, which reduces the resource occupation of the terminal. Meanwhile, after the receiving end receives the target material, it can project the target material into its local shooting preview interface, meeting the user's need for real-time picture composition and improving user experience.
It should be understood that, in the embodiment of the present application, the radio frequency unit 701 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 710; in addition, the uplink data is transmitted to the base station. In general, radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 702, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042. The graphics processor 7041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or another storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 may receive sounds and process them into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 701 and output accordingly.
The mobile terminal 700 also includes at least one sensor 705, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or a backlight when the mobile terminal 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 may be overlaid on the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in fig. 7 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 708 is an interface through which an external device is connected to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 700 or may be used to transmit data between the mobile terminal 700 and external devices.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone. Further, the memory 709 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid state storage device.
The processor 710 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby integrally monitoring the mobile terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The mobile terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components. The power supply 711 may be logically coupled to the processor 710 via a power management system, so that charging, discharging, and power consumption are managed through the power management system.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 710, a memory 709, and a computer program stored in the memory 709 and capable of running on the processor 710, where the computer program is executed by the processor 710 to implement each process of the foregoing three-dimensional image data processing method or three-dimensional image data display method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
EXAMPLE SEVEN
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium. When executed by a processor, the computer program implements each process of the foregoing three-dimensional image data processing method or three-dimensional image data display method embodiments and can achieve the same technical effects; to avoid repetition, the details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application provides a computer-readable storage medium. A selection operation of a user on a target material in a shooting preview interface is received, where the target material is one or more materials in a three-dimensional image displayed in the shooting preview interface; a receiving object of the target material is then determined; finally, data of the target material is sent to the receiving object, so that when a display interface of a second terminal subsequently includes a shooting preview interface, the target material is displayed in the shooting preview interface of the second terminal. In this way, the user can determine the corresponding material through the color information and depth information in the three-dimensional image data, and the material can be matted out and shared, so that the user can accurately mat material out of a three-dimensional image.
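The embodiments rely on both color information and depth information to select the target material, but do not specify a segmentation algorithm. The following is a minimal sketch of one way such matting could work, assuming an RGB-D frame and a simple per-pixel threshold test against the pixel the user tapped; the function name, tolerance values, and seed-based selection are illustrative assumptions, not the patented method:

```python
import numpy as np

def extract_target_material(rgb, depth, seed, color_tol=30.0, depth_tol=0.15):
    """Select a target material from an RGB-D frame using the color and
    depth of the pixel the user tapped (a hypothetical threshold-based
    sketch).

    rgb   : (H, W, 3) uint8 color image
    depth : (H, W)    float depth map
    seed  : (row, col) pixel of the user's selection operation
    """
    r, c = seed
    seed_color = rgb[r, c].astype(np.float64)
    seed_depth = depth[r, c]

    # A pixel belongs to the target material if both its color and its
    # depth are close to the selected pixel's values.
    color_dist = np.linalg.norm(rgb.astype(np.float64) - seed_color, axis=2)
    mask = (color_dist < color_tol) & (np.abs(depth - seed_depth) < depth_tol)

    # Retain the target's color; gray-process everything else, as in the
    # preview behavior described for claim 1.
    gray = rgb.mean(axis=2, keepdims=True).astype(np.uint8)
    preview = np.where(mask[..., None], rgb, np.repeat(gray, 3, axis=2))
    return mask, preview
```

The depth threshold is what lets two regions of similar color but different distance from the camera be separated, which color-only matting cannot do.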
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in the form of a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (18)

1. A method of processing three-dimensional image data, the method comprising:
receiving a selection operation of a user on a target material in a shooting preview interface, wherein the target material is one or more materials in a three-dimensional image displayed in the shooting preview interface;
determining a receiving object of the target material;
sending the data of the target material to the receiving object;
after receiving a selection operation of a user on a target material in a shooting preview interface, the method further comprises the following steps:
retaining color information of the target material, and performing gray processing on the portion of the three-dimensional image other than the target material;
the determining the receiving object of the target material comprises:
displaying a sending option;
when receiving the selection operation of the user on the sending option, acquiring a candidate receiving object of the target material;
determining a receiving object of the target material from the candidate receiving objects;
and when receiving a selection operation of the user on any point of the gray-processed portion of the three-dimensional image other than the target material, restoring the color information of all pixel points in the three-dimensional image, and canceling the sending option.
2. The method of claim 1, wherein receiving a selection operation of a user on a target material in a shooting preview interface comprises:
receiving the selection operation of a user on the material in a shooting preview interface;
and determining the target material selected by the selection operation according to the color information and the depth information of the pixel point corresponding to the selection operation.
3. The method of claim 2, wherein said sending data of said target material to said receiving object comprises:
and sending the real-time data stream of the target material captured in the shooting preview interface to the receiving object.
4. The method of claim 1, wherein after transmitting the data of the target material to the receiving object, the method further comprises:
displaying a cancel sending option;
and when the selection operation of the user on the cancel sending option is received, stopping sending the data of the target material to the receiving object.
5. The method of claim 4, wherein after displaying the cancel send option, the method further comprises:
and correspondingly highlighting the cancel sending option according to whether the receiving object has received the data of the target material.
6. A method for displaying three-dimensional image data is applied to a second terminal, and is characterized by comprising the following steps:
receiving data of a target material sent by a first terminal;
under the condition that a display interface of the second terminal comprises a shooting preview interface, displaying the target material in the shooting preview interface of the second terminal; generating a thumbnail of the target material at any one of the four corners of the shooting preview interface, and stopping displaying the target material in the shooting preview interface of the second terminal when a long-press operation of a user on the thumbnail is received;
when receiving data of a target material sent by the first terminal, generating a notification message, wherein the notification message contains user information of the first terminal and a thumbnail of the target material.
7. The method of claim 6, wherein the received data of the target material is a real-time data stream of the target material captured by the first terminal,
the displaying the target material in a shooting preview interface of the second terminal includes:
and displaying the target material in a shooting preview interface of the second terminal based on the real-time data stream of the target material, wherein the target material in the shooting preview interface of the second terminal is synchronously displayed with the target material in the shooting preview interface of the first terminal.
8. The method of claim 6, further comprising:
and when an input operation of a user on the target material is received on the shooting preview interface of the second terminal, processing the target material according to the input operation, wherein the input operation comprises a position-dragging operation, an enlarging operation, or a reducing operation.
9. An apparatus for processing three-dimensional image data, the apparatus comprising:
a receiving module, configured to receive a selection operation of a user on a target material in a shooting preview interface, wherein the target material is one or more materials in a three-dimensional image displayed in the shooting preview interface;
the determining module is used for determining a receiving object of the target material;
the sending module is used for sending the data of the target material to the receiving object;
the gray processing module is used for retaining the color information of the target material and performing gray processing on the portion of the three-dimensional image other than the target material;
the determining module includes:
a display unit for displaying the sending option;
the candidate object determining unit is used for acquiring a candidate receiving object of the target material when the selection operation of the user on the sending option is received;
a receiving object determining unit configured to determine a receiving object of the target material from the candidate receiving objects;
and the color recovery unit is used for restoring the color information of all pixel points in the three-dimensional image and canceling the sending option when receiving a selection operation of the user on any point of the gray-processed portion of the three-dimensional image other than the target material.
10. The apparatus of claim 9, wherein the receiving module comprises:
the receiving unit is used for receiving the selection operation of the user on the material in the shooting preview interface;
and the selecting unit is used for determining the target material selected by the selecting operation according to the color information and the depth information of the pixel point corresponding to the selecting operation.
11. The apparatus of claim 10, wherein the sending module is configured to send the real-time data stream of the target material captured in the capture preview interface to the receiving object.
12. The apparatus of claim 9, further comprising:
the second display module is used for displaying a sending canceling option;
and the canceling module is used for stopping sending the data of the target material to the receiving object when receiving the selection operation of the user on the cancel sending option.
13. The apparatus of claim 12, further comprising:
and the highlight display module is used for correspondingly highlighting the cancel sending option according to whether the receiving object has received the data of the target material.
14. An apparatus for displaying three-dimensional image data, the apparatus comprising:
the acquisition module is used for receiving data of a target material sent by a first terminal;
the display module is used for displaying the target material in a shooting preview interface of the device under the condition that the display interface of the device comprises the shooting preview interface;
the stopping module is used for generating a thumbnail of the target material at any one of the four corners of the shooting preview interface, and stopping displaying the target material in the shooting preview interface of the second terminal when a long-press operation of a user on the thumbnail is received;
and the notification module is used for generating a notification message when receiving the data of the target material sent by the first terminal, wherein the notification message contains the user information of the first terminal and the thumbnail of the target material.
15. The apparatus of claim 14, wherein the received data of the target material is a real-time data stream of the target material captured by the first terminal, and the display module is configured to display the target material in the capture preview interface of the apparatus based on the real-time data stream of the target material, wherein the target material in the capture preview interface of the apparatus is displayed in synchronization with the target material in the capture preview interface of the first terminal.
16. The apparatus of claim 14, further comprising:
and the management module is used for processing the target material according to an input operation when the input operation of a user on the target material is received on the shooting preview interface of the apparatus, wherein the input operation comprises a position-dragging operation and an enlarging or reducing operation.
17. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the method of processing three-dimensional image data according to any one of claims 1 to 5.
18. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on said memory and executable on said processor, said computer program, when executed by said processor, implementing the steps of a method of displaying three-dimensional image data according to any one of claims 6 to 8.
CN201811103389.7A 2018-09-20 2018-09-20 Three-dimensional image data processing and displaying method and device and mobile terminal Active CN109068063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811103389.7A CN109068063B (en) 2018-09-20 2018-09-20 Three-dimensional image data processing and displaying method and device and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811103389.7A CN109068063B (en) 2018-09-20 2018-09-20 Three-dimensional image data processing and displaying method and device and mobile terminal

Publications (2)

Publication Number Publication Date
CN109068063A CN109068063A (en) 2018-12-21
CN109068063B true CN109068063B (en) 2021-01-15

Family

ID=64762308

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811103389.7A Active CN109068063B (en) 2018-09-20 2018-09-20 Three-dimensional image data processing and displaying method and device and mobile terminal

Country Status (1)

Country Link
CN (1) CN109068063B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111163264B (en) * 2019-12-31 2022-02-01 维沃移动通信有限公司 Information display method and electronic equipment
CN113362220B (en) * 2021-05-26 2023-08-18 稿定(厦门)科技有限公司 Multi-equipment matting drawing method
CN114329221A (en) * 2021-12-31 2022-04-12 钻技(上海)信息科技有限公司 Commodity searching method, equipment and storage medium
CN114979495B (en) * 2022-06-28 2024-04-12 北京字跳网络技术有限公司 Method, apparatus, device and storage medium for content shooting

Citations (10)

Publication number Priority date Publication date Assignee Title
CN101137152A (en) * 2007-09-27 2008-03-05 腾讯科技(深圳)有限公司 Method, system and equipment for interacting three-dimensional cartoon in mobile instant communication
CN101577795A (en) * 2009-06-17 2009-11-11 深圳华为通信技术有限公司 Method and device for realizing real-time viewing of panoramic picture
CN103220644A (en) * 2013-03-06 2013-07-24 北京小米科技有限责任公司 Short message sending method, device and equipment
CN103973977A (en) * 2014-04-15 2014-08-06 联想(北京)有限公司 Blurring processing method and device for preview interface and electronic equipment
CN105554364A (en) * 2015-07-30 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN107509031A (en) * 2017-08-31 2017-12-22 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN107592466A (en) * 2017-10-13 2018-01-16 维沃移动通信有限公司 A kind of photographic method and mobile terminal
CN107809580A (en) * 2017-09-21 2018-03-16 努比亚技术有限公司 One kind shooting processing method, terminal and computer-readable recording medium
CN107846566A (en) * 2017-10-31 2018-03-27 努比亚技术有限公司 A kind of information processing method, equipment and computer-readable recording medium
CN108307173A (en) * 2016-08-31 2018-07-20 北京康得新创科技股份有限公司 The processing method of picture receives terminal, sends terminal

Family Cites Families (13)

Publication number Priority date Publication date Assignee Title
US9116596B2 (en) * 2012-06-10 2015-08-25 Apple Inc. Sharing images and comments across different devices
CN103513752B (en) * 2012-06-18 2017-02-22 华为技术有限公司 Gesture operation method, gesture operation device and gesture operation system
TWI520577B (en) * 2012-08-10 2016-02-01 晨星半導體股份有限公司 Stereo image output apparatus and associated stereo image output method
JP6121787B2 (en) * 2013-04-26 2017-04-26 株式会社ソニー・インタラクティブエンタテインメント Imaging apparatus, information processing system, and image data processing method
KR101582726B1 (en) * 2013-12-27 2016-01-06 재단법인대구경북과학기술원 Apparatus and method for recognizing distance of stereo type
CN103824210A (en) * 2014-02-13 2014-05-28 夷希数码科技(上海)有限公司 Plane magazine three-dimensional display method and system
EP3486815A1 (en) * 2014-07-31 2019-05-22 Hewlett-Packard Development Company, L.P. Model data of an object disposed on a movable surface
CN104883494B (en) * 2015-04-30 2016-08-24 努比亚技术有限公司 A kind of method and device of video capture
KR102531117B1 (en) * 2015-10-07 2023-05-10 삼성메디슨 주식회사 Method and apparatus for displaying an image which indicates an object
CN106296805B (en) * 2016-06-06 2019-02-26 厦门铭微科技有限公司 A kind of augmented reality human body positioning navigation method and device based on Real-time Feedback
CN106296574A (en) * 2016-08-02 2017-01-04 乐视控股(北京)有限公司 3-d photographs generates method and apparatus
CN106803921A (en) * 2017-03-20 2017-06-06 深圳市丰巨泰科电子有限公司 Instant audio/video communication means and device based on AR technologies
CN107890671B (en) * 2017-12-05 2020-10-30 腾讯科技(深圳)有限公司 Three-dimensional model rendering method and device for WEB side, computer equipment and storage medium

Patent Citations (10)

Publication number Priority date Publication date Assignee Title
CN101137152A (en) * 2007-09-27 2008-03-05 腾讯科技(深圳)有限公司 Method, system and equipment for interacting three-dimensional cartoon in mobile instant communication
CN101577795A (en) * 2009-06-17 2009-11-11 深圳华为通信技术有限公司 Method and device for realizing real-time viewing of panoramic picture
CN103220644A (en) * 2013-03-06 2013-07-24 北京小米科技有限责任公司 Short message sending method, device and equipment
CN103973977A (en) * 2014-04-15 2014-08-06 联想(北京)有限公司 Blurring processing method and device for preview interface and electronic equipment
CN105554364A (en) * 2015-07-30 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN108307173A (en) * 2016-08-31 2018-07-20 北京康得新创科技股份有限公司 The processing method of picture receives terminal, sends terminal
CN107509031A (en) * 2017-08-31 2017-12-22 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN107809580A (en) * 2017-09-21 2018-03-16 努比亚技术有限公司 One kind shooting processing method, terminal and computer-readable recording medium
CN107592466A (en) * 2017-10-13 2018-01-16 维沃移动通信有限公司 A kind of photographic method and mobile terminal
CN107846566A (en) * 2017-10-31 2018-03-27 努比亚技术有限公司 A kind of information processing method, equipment and computer-readable recording medium

Also Published As

Publication number Publication date
CN109068063A (en) 2018-12-21

Similar Documents

Publication Publication Date Title
US10497097B2 (en) Image processing method and device, computer readable storage medium and electronic device
CN110913132B (en) Object tracking method and electronic equipment
CN107977144B (en) Screen capture processing method and mobile terminal
CN109068063B (en) Three-dimensional image data processing and displaying method and device and mobile terminal
CN108495029B (en) Photographing method and mobile terminal
CN110719402B (en) Image processing method and terminal equipment
CN107977652B (en) Method for extracting screen display content and mobile terminal
CN110365907B (en) Photographing method and device and electronic equipment
CN107230065B (en) Two-dimensional code display method and device and computer readable storage medium
CN109246351B (en) Composition method and terminal equipment
CN111752450A (en) Display method and device and electronic equipment
CN111597370A (en) Shooting method and electronic equipment
CN111124231B (en) Picture generation method and electronic equipment
CN112511741A (en) Image processing method, mobile terminal and computer storage medium
CN110209324B (en) Display method and terminal equipment
CN109639981B (en) Image shooting method and mobile terminal
CN109491964B (en) File sharing method and terminal
CN110086998B (en) Shooting method and terminal
CN109104573B (en) Method for determining focusing point and terminal equipment
CN109005314B (en) Image processing method and terminal
CN110944163A (en) Image processing method and electronic equipment
CN109104564B (en) Shooting prompting method and terminal equipment
CN107734269B (en) Image processing method and mobile terminal
CN109922256B (en) Shooting method and terminal equipment
CN108509126B (en) Picture processing method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant