CN115002440B - AR-based image acquisition method and device, electronic equipment and storage medium
- Publication number: CN115002440B (application CN202210496170.8A)
- Authority
- CN
- China
- Prior art keywords
- acquisition
- layer
- range
- displaying
- guide
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An embodiment of the invention provides an AR-based image acquisition method and device, an electronic device, and a storage medium. The method includes: in response to a range positioning instruction, displaying an AR range positioning layer in the real scene and determining, in the real scene, an acquisition range corresponding to the AR range positioning layer; in response to a guide positioning instruction for the acquisition range, displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction, and determining, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer; and in response to movement of the electronic terminal, controlling the electronic terminal to perform a loop shooting movement along the acquisition guide layer in a preset direction, and acquiring images of a target object located at the center of the acquisition guide layer to obtain a target image corresponding to the target object.
Description
Technical Field
The present invention relates to the field of image acquisition technologies, and in particular, to an AR-based image acquisition method, an AR-based image acquisition apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of VR (Virtual Reality) technology, VR commodity display has become a new, highly interactive display form after pictures and videos, and is beginning to be widely applied in daily life. At present, real objects are mainly recreated in the virtual world, a process that involves acquiring image data, so the efficiency and quality of content acquisition are key factors affecting both the adoption of VR commodity display and the user experience. To acquire images of an object, a user shoots the object from multiple angles with a handheld shooting device. During acquisition, the shooting angle can be positioned by means of a model, a template, or the like. However, these approaches rely on the user to keep the model or template aligned with the object while shooting, and the model or template does not necessarily fit the object's shape exactly, so the alignment is not accurate enough; the user must both find the shooting point and keep the shooting terminal aimed at the object, which makes the operation difficult and costly. In addition, the process depends on the stability and accuracy of the user's hand: when the hand wavers, the final product display effect easily becomes unstable, reducing the quality of image acquisition.
Disclosure of Invention
The embodiment of the invention provides an AR-based image acquisition method, an AR-based image acquisition device, an electronic device and a computer-readable storage medium, so as to solve, at least in part, the problems in the related art of difficult image acquisition operation, high cost, unstable acquisition effect and poor acquired image quality.
The embodiment of the invention discloses an AR-based image acquisition method, wherein content displayed through a graphical user interface of an electronic terminal at least comprises a real scene, and the method comprises the following steps:
in response to a range positioning instruction, displaying an AR range positioning layer in the real scene, and determining, in the real scene, an acquisition range corresponding to the AR range positioning layer;
in response to a guide positioning instruction for the acquisition range, displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction, and determining, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer;
and in response to movement of the electronic terminal, controlling the electronic terminal to perform a loop shooting movement along the acquisition guide layer in a preset direction, and performing image acquisition on a target object located at the center of the acquisition guide layer to obtain a target image corresponding to the target object.
Optionally, the displaying, in response to a range positioning instruction, an AR range positioning layer in the real scene, and determining, in the real scene, an acquisition range corresponding to the AR range positioning layer includes:
displaying an AR range positioning layer in the real scene;
in response to a drag operation on the AR range positioning layer, controlling the AR range positioning layer to move in the real scene, and obtaining a scene image corresponding to the coverage area of the AR range positioning layer in the real scene;
displaying the AR range positioning layer in the real scene according to the scene image in a preset display style, and in response to a range positioning instruction for the AR range positioning layer, taking the coverage area of the AR range positioning layer in the real scene as the acquisition range corresponding to the range positioning instruction.
Optionally, the displaying the AR range positioning layer in the real scene according to the scene image in a preset display style includes:
if the surface of the horizontal plane corresponding to the scene image is flat and free of obstacles, displaying the AR range positioning layer in the real scene in a first display style;
and if the horizontal plane corresponding to the scene image is uneven and/or an obstacle exists, displaying the AR range positioning layer in the real scene in a second display style.
Optionally, the displaying, in response to a guide positioning instruction for the acquisition range, an acquisition indication layer corresponding to the guide positioning instruction in the AR range positioning layer, and determining, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer includes:
in response to a guide positioning instruction for the acquisition range, displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer;
if a target object to be captured exists in the acquisition range of the real scene, in response to movement of the electronic terminal, controlling the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point is located on the target object;
and in the case that the guide line landing point is located on the target object, in response to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying the acquisition guide layer for the target object in the acquisition range.
Optionally, the displaying, in response to the guide positioning instruction for the acquisition range, an acquisition indication layer corresponding to the guide positioning instruction in the AR range positioning layer, and determining, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer further includes:
if no target object exists in the acquisition range of the real scene, in response to movement of the electronic terminal, controlling the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point is located at the center of the AR range positioning layer;
and in the case that the guide line landing point is located at the center of the AR range positioning layer, in response to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying an acquisition guide layer corresponding to the acquisition range.
Optionally, the placing the acquisition indication layer on the AR range positioning layer in response to a guide positioning instruction for the acquisition indication layer includes:
and in response to a touch operation at any position in the graphical user interface other than the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer.
Optionally, the method further comprises:
and in response to a radius adjustment operation for the acquisition guide layer, adjusting the display radius of the acquisition guide layer according to the radius adjustment operation, and simultaneously displaying the radius value of the acquisition guide layer.
Optionally, the method further comprises:
and in response to movement of the acquisition indication layer in the AR range positioning layer, if the acquisition indication layer contacts the edge of the AR range positioning layer, displaying position abnormality prompt information for the acquisition indication layer in the graphical user interface, and setting the acquisition indication layer to a non-placeable state.
Optionally, the method further comprises:
displaying a reset control for the acquisition guide layer in the graphical user interface;
and in response to a touch operation on the reset control, canceling the acquisition guide layer and redisplaying the acquisition guide layer on the AR range positioning layer.
Optionally, the method further comprises:
displaying a confirmation control for the acquisition guide layer in the graphical user interface;
in response to a touch operation on the confirmation control, acquiring the point on the acquisition guide layer closest to the electronic terminal as the initial acquisition point, and acquiring the acquisition mode of the current image acquisition;
and if the acquisition mode is a video frame extraction acquisition mode, displaying, on the acquisition guide layer, a point location identifier and an acquisition direction identifier corresponding to the initial acquisition point, with the initial acquisition point as the starting point.
Optionally, the method further comprises:
displaying, in the graphical user interface, an acquisition pointing identifier pointing to the target object;
in response to movement of the electronic terminal, acquiring, during the loop shooting movement along the acquisition guide layer, acquisition state parameters corresponding to the electronic terminal and calibration parameters corresponding to the acquisition guide layer;
and displaying, in the graphical user interface, acquisition prompt information for the target object according to the acquisition state parameter and the calibration parameter, wherein the acquisition prompt information includes calibration prompt information for prompting the user to adjust the terminal state of the electronic terminal and normal prompt information for prompting the user to keep the terminal state of the electronic terminal.
Optionally, the acquisition state parameter includes a horizontal distance between the electronic terminal and a center point of the acquisition guide layer, the calibration parameter includes a distance threshold range corresponding to the acquisition guide layer, and the displaying, in the graphical user interface, acquisition prompt information for the target object according to the acquisition state parameter and the calibration parameter includes:
if the horizontal distance is outside the distance threshold range, displaying distance calibration prompt information in the graphical user interface and displaying the acquisition pointing identifier in a third display style;
and if the horizontal distance is within the distance threshold range, displaying the acquisition pointing identifier in the graphical user interface in a fourth display style.
Optionally, the acquisition state parameter includes a real-time movement speed, the calibration parameter includes a speed threshold range, and the displaying, in the graphical user interface, acquisition prompt information for the target object according to the acquisition state parameter and the calibration parameter includes:
displaying a real-time speed value corresponding to the acquisition guide layer in the graphical user interface;
if the real-time movement speed is within the speed threshold range, displaying the acquisition pointing identifier in the graphical user interface in a fourth display style;
and if the real-time movement speed is outside the speed threshold range, displaying speed control prompt information in the acquisition interface and displaying the real-time speed value in a fifth display style.
Optionally, the acquisition state parameter includes a real-time pitch angle, the calibration parameter includes a pitch angle range, and the displaying, in the graphical user interface, acquisition prompt information for the target object according to the acquisition state parameter and the calibration parameter includes:
if the real-time pitch angle is outside the pitch angle range, displaying angle calibration prompt information and a calibration sight in the graphical user interface, displaying the acquisition pointing identifier in a third display style, and displaying a real-time angle sight at the head position of the target object pointed to by the acquisition pointing identifier;
and in response to movement of the electronic terminal, controlling the real-time angle sight to move towards the calibration sight, and if the coincidence degree between the real-time angle sight and the calibration sight reaches a preset threshold, hiding the real-time angle sight and the calibration sight and displaying the acquisition pointing identifier in a fourth display style.
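For illustration only, the three calibration branches above amount to range checks on the acquisition state parameters. The following minimal Python sketch shows one possible reading; the threshold values are assumptions, since the embodiment does not fix them:

    from dataclasses import dataclass

    @dataclass
    class CalibrationParams:
        # Illustrative thresholds; the embodiment leaves concrete values open.
        distance_range: tuple = (1.8, 2.2)   # metres from the guide layer centre
        speed_range: tuple = (0.1, 0.5)      # metres per second
        pitch_range: tuple = (-10.0, 10.0)   # degrees

    def check_acquisition_state(distance, speed, pitch, p=CalibrationParams()):
        """Return the calibration prompts raised by the current terminal state;
        an empty list corresponds to the normal 'keep this state' prompt."""
        prompts = []
        if not (p.distance_range[0] <= distance <= p.distance_range[1]):
            prompts.append("distance")  # show distance calibration prompt info
        if not (p.speed_range[0] <= speed <= p.speed_range[1]):
            prompts.append("speed")     # show speed control prompt info
        if not (p.pitch_range[0] <= pitch <= p.pitch_range[1]):
            prompts.append("pitch")     # show angle calibration prompt info
        return prompts

    print(check_acquisition_state(distance=2.5, speed=0.3, pitch=15.0))
    # ['distance', 'pitch']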
Optionally, the method further comprises:
if at least two pieces of calibration prompt information exist, respectively acquiring the prompt priority corresponding to each piece of calibration prompt information, and displaying the calibration prompt information with the highest prompt priority;
and if the terminal state corresponding to the calibration prompt information with the highest prompt priority meets the condition and calibration prompt information of the next prompt priority exists, displaying the calibration prompt information of the next prompt priority.
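For illustration only, this two-step rule can be read as always surfacing the highest-priority outstanding calibration prompt, with the next one appearing automatically once the first condition is met. A minimal Python sketch, in which the priority ordering itself is an assumption:

    # Hypothetical priority table: lower number = higher prompt priority.
    PROMPT_PRIORITY = {"pitch": 0, "distance": 1, "speed": 2}

    def current_prompt(active_prompts):
        """Display only the calibration prompt with the highest priority;
        when its terminal-state condition is satisfied and it is removed,
        the next-priority prompt is displayed instead."""
        if not active_prompts:
            return None  # fall back to the normal prompt information
        return min(active_prompts, key=PROMPT_PRIORITY.__getitem__)

    print(current_prompt({"speed", "pitch"}))  # 'pitch' is shown first
    print(current_prompt({"speed"}))           # then 'speed' once pitch is fixed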
Optionally, the method further comprises:
and in response to movement of the electronic terminal, if the current position of the electronic terminal is located outside the acquisition guide layer and within the acquisition range, displaying the overall acquisition progress for the target object on the acquisition guide layer.
Optionally, the controlling, in response to movement of the electronic terminal, the electronic terminal to perform a loop shooting movement along the acquisition guide layer in a preset direction, and performing image acquisition on a target object located at the center of the acquisition guide layer to obtain a target image corresponding to the target object includes:
and in response to movement of the electronic terminal, controlling the electronic terminal to circle at least once, in the direction corresponding to the acquisition direction identifier, around the target object located at the center of the acquisition guide layer as the central axis, and performing image acquisition on the target object to obtain a target image corresponding to the target object.
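For illustration only, one way to detect that the terminal has circled the central axis at least once is to accumulate the signed change of its polar angle around the centre of the acquisition guide layer. A minimal Python sketch under that assumption:

    import math

    class LoopProgress:
        """Accumulate the terminal's angular travel around the guide layer
        centre; a full loop is reached when |accumulated angle| >= 2*pi."""
        def __init__(self, cx, cy):
            self.cx, self.cy = cx, cy
            self.prev = None
            self.total = 0.0

        def update(self, x, y):
            angle = math.atan2(y - self.cy, x - self.cx)
            if self.prev is not None:
                d = angle - self.prev
                if d > math.pi:      # unwrap across the -pi/pi seam
                    d -= 2 * math.pi
                elif d < -math.pi:
                    d += 2 * math.pi
                self.total += d
            self.prev = angle
            return abs(self.total) >= 2 * math.pi

    progress = LoopProgress(0.0, 0.0)
    for deg in range(0, 380, 10):  # simulated positions on a 2 m circle
        done = progress.update(2 * math.cos(math.radians(deg)),
                               2 * math.sin(math.radians(deg)))
    print(done)  # True once a full revolution has been completed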
The embodiment of the invention also discloses an AR-based image acquisition device, wherein the content displayed through the graphical user interface of the electronic terminal at least comprises a real scene, and the device comprises:
The acquisition range determining module is used for responding to a range positioning instruction, displaying an AR range positioning layer in the real scene and determining an acquisition range corresponding to the AR range positioning layer in the real scene;
The acquisition guide layer determining module is used for displaying, in response to a guide positioning instruction for the acquisition range, an acquisition indication layer corresponding to the guide positioning instruction in the AR range positioning layer, and determining, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer;
and the target image acquisition module is used for controlling, in response to movement of the electronic terminal, the electronic terminal to perform a loop shooting movement along the acquisition guide layer in a preset direction, and performing image acquisition on a target object located at the center of the acquisition guide layer to obtain a target image corresponding to the target object.
Optionally, the acquisition range determining module is specifically configured to:
displaying an AR range positioning layer in the real scene;
in response to a drag operation on the AR range positioning layer, controlling the AR range positioning layer to move in the real scene, and obtaining a scene image corresponding to the coverage area of the AR range positioning layer in the real scene;
displaying the AR range positioning layer in the real scene according to the scene image in a preset display style, and in response to a range positioning instruction for the AR range positioning layer, taking the coverage area of the AR range positioning layer in the real scene as the acquisition range corresponding to the range positioning instruction.
Optionally, the acquisition range determining module is specifically configured to:
if the surface of the horizontal plane corresponding to the scene image is flat and free of obstacles, displaying the AR range positioning layer in the real scene in a first display style;
and if the horizontal plane corresponding to the scene image is uneven and/or an obstacle exists, displaying the AR range positioning layer in the real scene in a second display style.
Optionally, the acquisition guide layer determining module is specifically configured to:
in response to a guide positioning instruction for the acquisition range, displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer;
if a target object to be captured exists in the acquisition range of the real scene, in response to movement of the electronic terminal, controlling the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point is located on the target object;
and in the case that the guide line landing point is located on the target object, in response to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying the acquisition guide layer for the target object in the acquisition range.
Optionally, the acquisition guide layer determining module is specifically further configured to:
if no target object exists in the acquisition range of the real scene, in response to movement of the electronic terminal, controlling the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point is located at the center of the AR range positioning layer;
and in the case that the guide line landing point is located at the center of the AR range positioning layer, in response to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying an acquisition guide layer corresponding to the acquisition range.
Optionally, the acquisition guide layer determining module is specifically configured to:
and in response to a touch operation at any position in the graphical user interface other than the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer.
Optionally, the device further comprises:
and the radius adjustment module is used for adjusting, in response to a radius adjustment operation for the acquisition guide layer, the display radius of the acquisition guide layer according to the radius adjustment operation, and simultaneously displaying the radius value of the acquisition guide layer.
Optionally, the device further comprises:
and the position abnormality prompt module is used for displaying, in response to movement of the acquisition indication layer in the AR range positioning layer, position abnormality prompt information for the acquisition indication layer in the graphical user interface if the acquisition indication layer contacts the edge of the AR range positioning layer, and setting the acquisition indication layer to a non-placeable state.
Optionally, the device further comprises:
a reset control display module for displaying a reset control for the acquisition guide layer in the graphical user interface;
and the reset module is used for canceling, in response to a touch operation on the reset control, the acquisition guide layer, and redisplaying the acquisition guide layer on the AR range positioning layer.
Optionally, the device further comprises:
the confirmation control display module is used for displaying a confirmation control aiming at the acquisition guide layer in the graphical user interface;
The acquisition mode determining module is used for acquiring, in response to a touch operation on the confirmation control, the point on the acquisition guide layer closest to the electronic terminal as the initial acquisition point, and acquiring the acquisition mode of the current image acquisition;
and the identifier display module is used for displaying, if the acquisition mode is a video frame extraction acquisition mode, a point location identifier and an acquisition direction identifier corresponding to the initial acquisition point on the acquisition guide layer, with the initial acquisition point as the starting point.
Optionally, the device further comprises:
the collection pointing identification display module is used for displaying collection pointing identification of a pointing target object in the graphical user interface;
the parameter acquisition module is used for acquiring, in response to movement of the electronic terminal, acquisition state parameters corresponding to the electronic terminal and calibration parameters corresponding to the acquisition guide layer during the loop shooting movement along the acquisition guide layer;
and the prompt information display module is used for displaying, in the graphical user interface, the acquisition prompt information for the target object according to the acquisition state parameter and the calibration parameter, wherein the acquisition prompt information includes the calibration prompt information for prompting the user to adjust the terminal state of the electronic terminal and the normal prompt information for prompting the user to keep the terminal state of the electronic terminal.
Optionally, the acquisition state parameter includes a horizontal distance between the electronic terminal and a center point of the acquisition guide layer, the calibration parameter includes a distance threshold range corresponding to the acquisition guide layer, and the prompt information display module is specifically configured to:
if the horizontal distance is outside the distance threshold range, display distance calibration prompt information in the graphical user interface and display the acquisition pointing identifier in a third display style;
and if the horizontal distance is within the distance threshold range, display the acquisition pointing identifier in the graphical user interface in a fourth display style.
Optionally, the acquisition state parameter includes a real-time movement speed, the calibration parameter includes a speed threshold range, and the prompt information display module is specifically configured to:
displaying a real-time speed value corresponding to the acquisition guide layer in the graphical user interface;
if the real-time movement speed is within the speed threshold range, display the acquisition pointing identifier in the graphical user interface in a fourth display style;
and if the real-time movement speed is outside the speed threshold range, display speed control prompt information in the acquisition interface and display the real-time speed value in a fifth display style.
Optionally, the acquisition state parameter includes a real-time pitch angle, the calibration parameter includes a pitch angle range, and the prompt information display module is specifically configured to:
if the real-time pitch angle is outside the pitch angle range, display angle calibration prompt information and a calibration sight in the graphical user interface, display the acquisition pointing identifier in a third display style, and display a real-time angle sight at the head position of the target object pointed to by the acquisition pointing identifier;
and in response to movement of the electronic terminal, control the real-time angle sight to move towards the calibration sight, and if the coincidence degree between the real-time angle sight and the calibration sight reaches a preset threshold, hide the real-time angle sight and the calibration sight and display the acquisition pointing identifier in a fourth display style.
Optionally, the device further comprises:
the priority acquisition module is used for respectively acquiring prompt priorities corresponding to the calibration prompt messages if at least two calibration prompt messages exist, and displaying the calibration prompt message with the highest prompt priority;
and the hierarchical display module is used for displaying the calibration prompt information of the next prompt priority if the terminal state corresponding to the calibration prompt information with the highest prompt priority meets the condition and calibration prompt information of the next prompt priority exists.
Optionally, the device further comprises:
and the acquisition progress display module is used for displaying, in response to movement of the electronic terminal, the overall acquisition progress for the target object on the acquisition guide layer if the current position of the electronic terminal is located outside the acquisition guide layer and within the acquisition range.
Optionally, the target image acquisition module is specifically configured to:
and in response to movement of the electronic terminal, controlling the electronic terminal to circle at least once, in the direction corresponding to the acquisition direction identifier, around the target object located at the center of the acquisition guide layer as the central axis, and performing image acquisition on the target object to obtain a target image corresponding to the target object.
The embodiment of the invention also discloses an electronic device, including a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
The memory is used for storing a computer program;
the processor is configured to implement the method according to the embodiment of the present invention when executing the program stored in the memory.
Embodiments of the present invention also disclose a computer-readable storage medium having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the method according to the embodiments of the present invention.
The embodiment of the invention has the following advantages:
In the embodiment of the invention, during image acquisition, the real scene to be captured can be displayed through the graphical user interface of the terminal. Before acquisition, the terminal can respond to a range positioning instruction by displaying an AR range positioning layer in the displayed real scene and determining, in the real scene, the acquisition range corresponding to the AR range positioning layer; it can then respond to a guide positioning instruction for the acquisition range by displaying an acquisition indication layer in the AR range positioning layer and determining, in the acquisition range, the acquisition guide layer corresponding to the acquisition indication layer. On the one hand, the terminal can determine the acquisition range in the real scene based on AR technology, so that the user performs image acquisition within a fixed acquisition range, which effectively improves the effectiveness of user operations in the subsequent acquisition process. On the other hand, the acquisition guide layer is determined within the acquisition range through the cooperation between the AR range positioning layer and the acquisition indication layer, and guiding the user's image acquisition through the acquisition guide layer effectively reduces the difficulty of the acquisition operation. During acquisition, the terminal can respond to movement of the electronic terminal by controlling it to perform a loop shooting movement along the acquisition guide layer in a preset direction and acquiring images of the target object located at the center of the acquisition guide layer, so that the target object always remains centered during shooting, which ensures both the stability and the quality of image acquisition.
Drawings
FIG. 1 is a flow chart of steps of an AR-based image acquisition method provided in an embodiment of the present invention;
FIG. 2 is a schematic diagram of an AR range positioning layer provided in an embodiment of the present invention;
FIG. 3 is a schematic diagram of an AR range positioning layer provided in an embodiment of the present invention;
FIG. 4 is a schematic diagram of an acquisition indicator layer provided in an embodiment of the present invention;
FIG. 5 is a schematic diagram of an acquisition indicator layer provided in an embodiment of the present invention;
FIG. 6 is a schematic diagram of an acquisition indicator layer provided in an embodiment of the present invention;
FIG. 7 is a schematic view of an acquisition guide layer provided in an embodiment of the present invention;
FIG. 8 is a schematic view of an acquisition guide layer provided in an embodiment of the present invention;
FIG. 9 is a schematic diagram of video snapshot acquisition provided in an embodiment of the present invention;
FIG. 10 is a schematic diagram of distance calibration provided in an embodiment of the present invention;
FIG. 11 is a schematic illustration of pitch angle calibration provided in an embodiment of the present invention;
FIG. 12 is a schematic diagram of a velocity calibration provided in an embodiment of the present invention;
FIG. 13 is a schematic diagram of an acquisition process provided in an embodiment of the present invention;
FIG. 14 is a block diagram of an AR based image capturing device provided in an embodiment of the present invention;
fig. 15 is a block diagram of an electronic device provided in an embodiment of the invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
As an example, for image acquisition of a VR commodity, a user may center the object to be acquired, capture images of it from multiple angles with a handheld shooting device, and then synthesize a corresponding VR image from the acquired images, thereby realizing a VR panoramic display of the related product. For image acquisition of the object to be acquired, the shooting angle can be positioned by means of a model, a template, or the like. However, these approaches rely on the user to keep the model or template aligned with the object while shooting, and the model or template does not necessarily fit the object's shape exactly, so the alignment is not accurate enough; the user must both find the shooting point and keep the shooting terminal aimed at the object, which makes the operation difficult and costly. In addition, the process depends on the stability and accuracy of the user's hand: when the hand wavers, the final product display effect easily becomes unstable, reducing the quality of image acquisition.
In view of this, one of the core ideas of the invention is to establish, before image acquisition, a three-dimensional spatial guidance system based on AR technology with the position of the object to be acquired as the center point. This comprises displaying an AR range positioning layer in the displayed real scene in response to a range positioning instruction and determining, in the real scene, the acquisition range corresponding to the AR range positioning layer; then displaying an acquisition indication layer in the AR range positioning layer in response to a guide positioning instruction for the acquisition range, through which the user can determine, in the acquisition range, the acquisition guide layer corresponding to the acquisition indication layer. On the one hand, the terminal can first determine the acquisition range in the real scene through the AR range positioning layer; by fixing the acquisition range, the user can perform image acquisition within it, which effectively improves the effectiveness of user operations in the subsequent acquisition process. On the other hand, the acquisition guide layer is determined within the acquisition range through the cooperation between the AR range positioning layer and the acquisition indication layer, and guiding the user's image acquisition through the acquisition guide layer effectively reduces the difficulty of the acquisition operation. In addition, the terminal can control the electronic terminal to perform a loop shooting movement along the acquisition guide layer in a preset direction while acquiring images of the target object located at the center of the acquisition guide layer, which not only keeps the target object centered throughout shooting, ensuring acquisition stability and quality, but also avoids the muscle fatigue that holding the terminal in alignment with acquisition models, templates and the like imposes on users.
Specifically, referring to fig. 1, which shows a flowchart of the steps of an AR-based image acquisition method provided in an embodiment of the present invention, where the content displayed through the graphical user interface of an electronic terminal includes at least a real scene, the method may specifically include the following steps:
Step 101: in response to a range positioning instruction, displaying an AR range positioning layer in the real scene, and determining, in the real scene, an acquisition range corresponding to the AR range positioning layer;
Optionally, the electronic terminal (hereinafter referred to as the terminal) may be a camera, or a mobile terminal with a shooting function on which an application program implementing the shooting function can run; through this application, panoramic image acquisition, processing, management and the like can be performed. In addition, the mobile terminal and the camera can be connected to realize collaborative shooting. For ease of understanding and description, the embodiment of the present invention takes a mobile terminal as an example; it will be understood that the present invention is not limited thereto.
In the embodiment of the invention, the terminal can run a corresponding application program (such as an image acquisition program) and display a corresponding task creation interface in the graphical user interface, so that the user can input a task creation instruction there. The terminal can then respond to the task creation instruction by displaying a corresponding acquisition interface in the graphical user interface, showing in it the image stream captured in real time by the camera so as to display the real scene in which the terminal is located, and then display an AR range positioning layer in the real scene based on AR technology, so that the user can position the acquisition range of the acquisition task through the AR range positioning layer. The acquisition range limits the region in which the user performs image acquisition and in which the acquisition guide layer is set. It may be a region of fixed shape, such as a circle or a rectangle, whose radius (or side length) can be adjusted according to the type of the object to be acquired: for example, when the object to be acquired is a vehicle, the radius of the acquisition range corresponding to the AR range positioning layer may be 8 meters; when the object to be acquired is a small object, the radius may be 2 meters, and so on. The principle for a rectangle is similar and is not repeated here.
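For illustration only, the range-sizing rule above reduces to a lookup keyed by object type. A minimal Python sketch; only the 8-metre and 2-metre values come from the example above, and the default is an assumption:

    # Radius (metres) of the acquisition range per object type.
    ACQUISITION_RANGE_RADIUS = {"vehicle": 8.0, "small_object": 2.0}

    def range_radius(object_type, default=4.0):
        return ACQUISITION_RANGE_RADIUS.get(object_type, default)

    print(range_radius("vehicle"))       # 8.0
    print(range_radius("small_object"))  # 2.0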
It should be noted that, since it is difficult for the user to intuitively judge the radius or side length of the AR range positioning layer through the graphical user interface, the user may hold the terminal and step back from the object to be acquired in order to position the acquisition range in the real scene through the AR range positioning layer. Positioning can also be done in other ways, such as displaying a plan view corresponding to the real scene and marking on it the area covered by the AR range positioning layer, so that the user can position the acquisition range by adjusting the position of the AR range positioning layer in the real scene.
In a specific implementation, the terminal may display the AR range positioning layer in the real scene. After the user triggers its display through a range setting instruction, the user may drag the AR range positioning layer to position the acquisition range in the real scene. The terminal may therefore control the AR range positioning layer to move in the real scene in response to a drag operation on it, obtain a scene image corresponding to the coverage area of the AR range positioning layer in the real scene, display the AR range positioning layer in the real scene in a preset display style according to the scene image, and, in response to a range positioning instruction for the AR range positioning layer, take the coverage area of the AR range positioning layer in the real scene as the acquisition range corresponding to the range positioning instruction.
Before image acquisition, in order to ensure that the target object to be acquired sits in a flat, obstacle-free environment during acquisition, when the user adjusts the AR range positioning layer to position the acquisition range, the terminal can obtain a scene image corresponding to the coverage area of the AR range positioning layer in the real scene. The scene image can be a live view of the horizontal plane corresponding to a certain area of the real scene, on which the terminal can perform image recognition: if the surface of the horizontal plane corresponding to the scene image is flat and free of obstacles, the AR range positioning layer is displayed in the real scene in a first display style; if the horizontal plane corresponding to the scene image is uneven and/or an obstacle exists, the AR range positioning layer is displayed in the real scene in a second display style.
Optionally, the first display style and the second display style may be display colors of the AR range positioning layer. For example, as the user drags the AR range positioning layer through the real scene, the terminal determines by image recognition whether the scene image corresponding to the covered area meets the condition. If the corresponding horizontal plane is flat and free of obstacles, the AR range positioning layer may be displayed in blue in the graphical user interface, together with a toast-style prompt lasting a short period (such as 3 seconds) guiding the user to place the AR range positioning layer and thereby position the acquisition range. If the corresponding horizontal plane is uneven and/or an obstacle exists, the AR range positioning layer may be displayed in red, with a resident abnormality prompt asking the user to adjust its position. Displaying the AR range positioning layer in different display styles thus effectively signals whether its placement position meets the acquisition condition, which in turn ensures that the target object can be placed in a flat, obstacle-free environment during subsequent image acquisition and improves the quality of image acquisition.
In an example, referring to fig. 2 and 3, which show schematic diagrams of the AR range positioning layer provided in an embodiment of the present invention: the AR range positioning layer 210 may be displayed in the style (possibly blue; the color is not shown in the figure) indicating that the corresponding horizontal plane is flat and free of obstacles, while the corresponding prompt message "Please tap the screen to establish a plane" may be displayed in the graphical user interface, guiding the user to position the AR range positioning layer and thereby the acquisition range. The AR range positioning layer 310 shows the display style (possibly red; the color is not shown in the figure) used when the corresponding horizontal plane is uneven and/or an obstacle exists (i.e. the plane has faults, slopes, undulations and the like), while the corresponding prompt message "Uneven space: a plane cannot be established" may be displayed in the graphical user interface, guiding the user to reposition. By displaying the corresponding prompts, the user is guided to position the acquisition range of the current acquisition task in the real scene based on AR technology; fixing the acquisition range lets the user acquire images within it and effectively improves the effectiveness of user operations in the subsequent image acquisition process.
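For illustration only, the choice between the first and second display styles can be sketched as a flatness-plus-obstacle test over the scene image. The following minimal Python sketch reduces the image recognition to a height-spread check over sampled surface points, with an assumed 5 cm tolerance:

    def pick_display_style(surface_heights, has_obstacle, flatness_tol=0.05):
        """Choose the display style of the AR range positioning layer from
        sampled heights (metres) of the covered horizontal plane."""
        is_flat = max(surface_heights) - min(surface_heights) <= flatness_tol
        if is_flat and not has_obstacle:
            # first display style: placeable (e.g. blue) + transient prompt
            return "style_1", "Please tap the screen to establish a plane"
        # second display style: not placeable (e.g. red) + resident warning
        return "style_2", "Uneven space: a plane cannot be established"

    print(pick_display_style([0.00, 0.01, 0.02], has_obstacle=False))
    print(pick_display_style([0.00, 0.12, 0.30], has_obstacle=False))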
When the user has determined the acquisition range in the real scene through the AR range positioning layer, the acquisition guide layer may then be determined within that range so that image acquisition can be guided by it. The terminal can display the AR range positioning layer according to the terminal's position: when the terminal is outside the acquisition range, the AR range positioning layer can be displayed completely; when the terminal is inside the acquisition range, only part of it can be displayed, so that the user, holding the terminal, can step back to see where the acquisition range lies and then set the acquisition guide layer within it. While setting the acquisition guide layer, an acquisition indication layer can be displayed on the AR range positioning layer based on AR technology, and the user can adjust the position of the finally determined acquisition guide layer on the AR range positioning layer by dragging the acquisition indication layer.
In a specific implementation, the terminal may respond to the guide positioning instruction for the acquisition range by displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer. If a target object to be captured exists in the acquisition range of the real scene, the terminal responds to movement of the electronic terminal by controlling the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point falls on the target object, and then, in response to a guide positioning instruction for the acquisition indication layer, places the acquisition indication layer on the AR range positioning layer and displays the acquisition guide layer for the target object in the acquisition range.
If no target object exists in the acquisition range of the real scene, the terminal responds to movement of the electronic terminal by controlling the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point falls at the center of the AR range positioning layer; then, with the guide line landing point at the center of the AR range positioning layer, it places the acquisition indication layer on the AR range positioning layer in response to a guide positioning instruction for the acquisition indication layer, and displays the acquisition guide layer corresponding to the acquisition range.
Optionally, after the user has determined, through the acquisition indication layer, the position of the acquisition guide layer in the AR range positioning layer, the acquisition indication layer may be placed on the AR range positioning layer by touching any position in the graphical user interface other than the acquisition indication layer.
The placement of the acquisition guide layer thus covers two cases. In one, a target object to be captured already exists in the real scene, for example an object that is fixed in place and inconvenient to move. In the other, no target object is present yet; the user can first position the acquisition and then put the object to be acquired in the corresponding place, for example an object that moves flexibly or is small in volume. The user can therefore choose a suitable way to set the position of the acquisition guide layer according to the actual situation of the real scene, which effectively improves the operational flexibility of image acquisition.
Since image acquisition is performed around the object to be acquired, the center point of the acquisition guide layer must substantially coincide with the object. Displaying the guide line landing point corresponding to the center of the acquisition indication layer alongside the acquisition indication layer lets the user see intuitively, while dragging it, the target position to which its center maps on the AR range positioning layer. Through the cooperation between the AR range positioning layer and the acquisition indication layer, the user can thus position the acquisition effectively, so that the subsequent image acquisition process always keeps the object to be acquired at the center, which both reduces the difficulty of image acquisition and improves its accuracy.
Specifically, when a target object to be acquired exists in the acquisition range of the real scene, the target position to which the center point of the acquisition indication layer maps on the AR range positioning layer moves as the user moves the acquisition indication layer, so the user can bring the guide line landing point onto the target object by moving the acquisition indication layer; in the ideal state, the landing point coincides with the target object's center point. When no target object to be acquired exists in the acquisition range, any area in the AR range positioning layer meets the acquisition condition: the user can move the guide line landing point to the center point of the AR range positioning layer by moving the acquisition indication layer (the corresponding center point can be displayed together with the AR range positioning layer), or select any other position that meets the condition (for example, an area that does not exceed the acquisition range). After determining the placement position, the user can place the acquisition indication layer on the AR range positioning layer by touching any position other than the acquisition indication layer, whereupon the terminal can display the corresponding acquisition guide layer in the graphical user interface for the user to configure further.
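For illustration only, the two placement cases reduce to checking that the guide line landing point (near-)coincides with the relevant anchor. A minimal Python sketch, assuming 2-D coordinates on the positioning layer's plane and a 10 cm coincidence tolerance:

    import math

    def landing_point_ok(landing, target_center=None,
                         range_center=(0.0, 0.0), tol=0.10):
        """With a target object present, the landing point must coincide with
        the object's centre; otherwise with the centre of the AR range
        positioning layer (or any other position meeting the condition)."""
        anchor = target_center if target_center is not None else range_center
        return math.dist(landing, anchor) <= tol

    print(landing_point_ok((0.05, 0.02)))                            # True
    print(landing_point_ok((1.0, 1.0), target_center=(1.02, 0.98)))  # True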
While the position of the acquisition guide layer is being adjusted through the acquisition indication layer, the terminal can respond to movement of the acquisition indication layer on the AR range positioning layer: if the acquisition indication layer contacts the edge of the AR range positioning layer, position abnormality prompt information for the acquisition indication layer is displayed in the graphical user interface and the acquisition indication layer is set to a non-placeable state, prompting the user that it cannot be placed in the current state and must be readjusted.
In an example, referring to fig. 4 and 5, which show schematic diagrams of the acquisition indication layer provided in an embodiment of the present invention: fig. 4 may correspond to the case where no target object to be acquired exists in the acquisition range of the real scene, so the user may place the acquisition indication layer 420 at any position in the AR range positioning layer 410 that meets the condition in order to determine the acquisition guide layer. In this case the corresponding prompt message "Adjust the position to ensure the circle center coincides with the center of the object, then tap to place" may be displayed, and since there is no target object, the user can choose to place the acquisition indication layer 420 with the guide line landing point 430 coinciding with the center point of the AR range positioning layer 410.
Fig. 5 may correspond to the case where a target object to be acquired exists in the acquisition range of the real scene: the corresponding target object 520 may be displayed in the AR range positioning layer 510, so the user may drag the acquisition indication layer 530 until the guide line landing point 540 coincides with the center point of the target object 520, and then place the acquisition indication layer 530 to obtain the acquisition guide layer for the target object 520. Meanwhile, while the acquisition indication layer 530 is being moved, the terminal can display the corresponding prompt message "Adjust the position to ensure the circle center coincides with the center of the object, then tap to place".
Throughout this process, movement of the acquisition indication layer is confined to the acquisition range. Referring to fig. 6, which shows a schematic diagram of the acquisition indication layer provided in an embodiment of the present invention: as the user drags the acquisition indication layer, when it contacts the edge of the AR range positioning layer, that is, when it is about to exceed the acquisition range, the terminal may display corresponding position abnormality prompt information in the graphical user interface (such as "Please do not place beyond the area edge") and set the acquisition indication layer to a non-placeable state; at the same time the area outside the AR range positioning layer may be shown in a warning color such as red, and/or the acquisition indication layer itself shown in red, and a resident toast prompt may also be displayed. Through the cooperation between the AR range positioning layer and the acquisition indication layer, the user can position the acquisition effectively, so that the subsequent image acquisition process always keeps the object to be acquired at the center, which both reduces the difficulty of image acquisition and improves its accuracy.
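For illustration only, the edge-contact test can be sketched geometrically: assuming both layers are circular, the acquisition indication layer touches the edge of the AR range positioning layer once the distance between their centres plus the indication layer's own radius reaches the range radius. A minimal Python sketch with assumed sizes:

    import math

    def indicator_placeable(indicator_center, indicator_radius,
                            range_center=(0.0, 0.0), range_radius=8.0):
        """Return (placeable, warning): non-placeable with a position
        abnormality prompt once the indication layer contacts the edge."""
        touches_edge = (math.dist(indicator_center, range_center)
                        + indicator_radius >= range_radius)
        if touches_edge:
            return False, "Please do not place beyond the area edge"
        return True, None

    print(indicator_placeable((6.5, 0.0), indicator_radius=2.0))  # not placeable
    print(indicator_placeable((3.0, 0.0), indicator_radius=2.0))  # placeable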
In addition, after the acquisition guide layer is displayed, the terminal can display its radius value in the graphical user interface and, in response to a radius adjustment operation for the acquisition guide layer, adjust the display radius of the acquisition guide layer accordingly while displaying the radius value. A reset control for the acquisition guide layer can also be displayed in the graphical user interface: if the user is not satisfied with the current position of the acquisition guide layer, it can be repositioned through the reset control, with the terminal responding to a touch operation on the reset control by canceling the acquisition guide layer and redisplaying it on the AR range positioning layer.
For example, referring to fig. 7 and fig. 8, which show schematic diagrams of the acquisition guide layer provided in an embodiment of the present invention: in fig. 7, a radius-adjustable control indication may be displayed around the acquisition guide layer 710, and the user may then drag any point of the acquisition guide layer 710 to scale its radius. In fig. 8, the radius value may change in real time during the radius adjustment.
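A minimal sketch of the radius adjustment in figs. 7 and 8, assuming the drag point and the guide-layer center share ground-plane coordinates; the clamping bounds are illustrative assumptions:

```kotlin
import kotlin.math.hypot

class GuideLayerRadius(
    private val centerX: Double,
    private val centerY: Double,
    private val minRadius: Double = 0.3,  // assumed smallest usable loop path
    private val maxRadius: Double = 5.0   // assumed bound from the AR range positioning layer
) {
    var radius: Double = 1.0
        private set

    // Rescales the radius from the current drag position and returns the value
    // that the graphical user interface displays in real time.
    fun onDrag(touchX: Double, touchY: Double): Double {
        radius = hypot(touchX - centerX, touchY - centerY).coerceIn(minRadius, maxRadius)
        return radius
    }
}
```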
In the embodiment of the invention, after the user determines the acquisition guide layer, the terminal may also display a confirmation control for the acquisition guide layer in the graphical user interface; through this control the user confirms the position, radius and the like of the acquisition guide layer, and subsequent image acquisition is performed based on it. When the user touches the confirmation control, the terminal may respond to the touch operation by taking the point on the acquisition guide layer closest to the electronic terminal as the initial acquisition point and obtaining the acquisition mode of the current image acquisition; if the acquisition mode is the video frame extraction acquisition mode, the initial acquisition point is taken as the starting point, and a point location identifier and an acquisition direction identifier corresponding to the starting point are displayed on the acquisition guide layer. Alternatively, the acquisition mode may be a multi-point acquisition mode, in which a plurality of ring shooting points may be displayed on the acquisition guide layer; the present invention is not limited in this respect.
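A minimal sketch of selecting the initial acquisition point as described above: the point on the guide circle closest to the terminal lies on the ray from the circle center through the terminal's ground-plane position. Names are illustrative assumptions:

```kotlin
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

data class Point(val x: Double, val y: Double)

fun initialAcquisitionPoint(center: Point, radius: Double, terminal: Point): Point {
    val angle = atan2(terminal.y - center.y, terminal.x - center.x)
    return Point(center.x + radius * cos(angle), center.y + radius * sin(angle))
}

fun main() {
    // Terminal standing east of the circle: the initial point is the easternmost ring point.
    println(initialAcquisitionPoint(Point(0.0, 0.0), 2.0, Point(6.0, 0.0)))  // Point(x=2.0, y=0.0)
}
```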
Optionally, in the video frame extraction acquisition mode, after video acquisition of the target object is completed, corresponding image frames are extracted from the obtained video in time order, and a panoramic image corresponding to the target object is then synthesized from these frames. Frame extraction may be completed by the terminal, or the video may be sent to a server and the terminal then receives the panoramic image returned by the server, which reduces the performance consumption of the terminal. In addition, while continuously acquiring images of the target object, the terminal may search the edge portions of two adjacent pictures and overlap the regions whose imaging effects are closest, so as to combine while shooting and obtain the panoramic image corresponding to the object to be acquired.
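A minimal sketch of time-ordered frame extraction, assuming frames are sampled at a fixed interval; the interval value and the decoding step (platform-specific and omitted here) are assumptions, not from the patent:

```kotlin
fun frameTimestampsMicros(durationMicros: Long, intervalMicros: Long = 200_000L): List<Long> {
    require(intervalMicros > 0) { "interval must be positive" }
    // Sampling points in time order, to be decoded and handed to the panorama synthesis step.
    return (0L..durationMicros step intervalMicros).toList()
}

fun main() {
    // A 2 s clip sampled every 0.2 s yields 11 extraction points in time order.
    println(frameTimestampsMicros(2_000_000L).size)  // 11
}
```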
As for the acquisition mode, when creating an image acquisition task the user may select the mode in which images are to be acquired. In the video frame extraction mode, referring to fig. 9, which shows a schematic diagram of video frame extraction acquisition provided in the embodiment of the present invention: after the user touches the confirmation control 910, the terminal may take the point on the acquisition guide layer 920 closest to the user's current position as the initial acquisition point 930 and highlight it, so that the user can start image acquisition of the target object from the initial acquisition point and perform ring shooting along the acquisition guide layer. This effectively ensures the smoothness of image acquisition, spares the user from repeatedly stopping to shoot at different acquisition points, acquisition models and acquisition masks, improves the convenience of user operation and lowers the acquisition threshold.
After the initial acquisition point is displayed, the terminal can be regarded as having entered the acquisition mode, in which the user can start acquiring images of the target object. The terminal may display, in the graphical user interface, an acquisition pointing identifier that points to the target object; this identifier always moves with the position of the terminal and always keeps the camera pointing vertically forward. During image acquisition, the terminal may respond to the movement of the electronic terminal held by the user by obtaining, while the terminal performs the ring shooting motion along the acquisition guide layer, the acquisition state parameters corresponding to the electronic terminal and the calibration parameters corresponding to the acquisition guide layer. According to these parameters, acquisition prompt information for the target object may then be displayed in the graphical user interface, including calibration prompt information that prompts the user to adjust the terminal state of the electronic terminal and normal prompt information that prompts the user to keep that state. In this way the terminal can output acquisition prompt information in real time during image acquisition, guide the user's acquisition behavior through the calibration prompt information and give positive feedback through the normal prompt information, thereby improving the accuracy of image acquisition of the object to be acquired and effectively ensuring both the smoothness of the acquisition flow and the quality of the acquired images.
During acquisition, the terminal can detect its acquisition state in real time from the acquired state parameters and the calibration parameters. If the acquisition state is normal, normal prompt information for the object to be acquired may be displayed in the acquisition interface and/or the acquisition guide region; if the acquisition state is abnormal, calibration prompt information for the object to be acquired may be displayed there instead. Ring shooting along the acquisition guide layer thus effectively ensures the smoothness of image acquisition, spares the user from repeatedly stopping to shoot at different acquisition points, acquisition models and acquisition masks, improves the convenience of operation and lowers the acquisition threshold; meanwhile, when image acquisition precision must be guaranteed, the user can calibrate the movement of the terminal according to the calibration prompt information output by the terminal, thereby ensuring the quality of the acquired images.
The acquisition state parameters may include the horizontal distance between the terminal and the center point of the acquisition guide layer, the real-time acquisition angle, the real-time pitch angle, the real-time motion speed, and the like; the calibration parameters corresponding to the acquisition guide layer may include a distance threshold range, a standard acquisition angle, a speed threshold range, a pitch angle range, and the like.
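A minimal sketch of the two parameter groups just listed; the field names and units are illustrative assumptions:

```kotlin
data class AcquisitionState(
    val horizontalDistance: Double,  // terminal to guide-layer center, in metres
    val acquisitionAngle: Double,    // real-time acquisition angle, in degrees
    val pitchAngle: Double,          // real-time pitch angle, in degrees
    val speed: Double                // real-time motion speed, in metres per second
)

data class CalibrationParams(
    val distanceRange: ClosedFloatingPointRange<Double>,  // distance threshold range
    val standardAcquisitionAngle: Double,
    val speedRange: ClosedFloatingPointRange<Double>,     // speed threshold range
    val pitchRange: ClosedFloatingPointRange<Double>      // pitch angle range
)
```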
In a specific implementation, consider first the horizontal distance. Because the image acquisition task is centered on the target object, the terminal may obtain the horizontal distance between itself and the center point of the acquisition guide layer. If the horizontal distance is outside the distance threshold range, distance calibration prompt information is displayed in the graphical user interface and the acquisition pointing identifier is displayed in a third display style; if the horizontal distance is within the distance threshold range, the acquisition pointing identifier is displayed in a fourth display style. By displaying the corresponding prompt information when the horizontal distance does not satisfy the condition, the user is prompted to perform position calibration; when it does satisfy the condition, the current terminal state is kept for image acquisition. Distance calibration thus ensures that, throughout acquisition, the terminal always keeps essentially the same distance from the target object, effectively guaranteeing the accuracy and quality of image acquisition.
It should be noted that the acquisition pointing identifier may be an indication arrow formed by a plurality of triangles, and the third and fourth display styles may correspond to different display colors of that identifier, so that by displaying the identifier in different colors the user can be intuitively and effectively informed of whether the current acquisition state of the terminal satisfies the acquisition conditions.
In an example, referring to fig. 10, which shows a schematic diagram of distance calibration provided in an embodiment of the present invention: during image acquisition, the terminal may obtain the horizontal distance between its current position and the center point of the acquisition guide layer. If the horizontal distance falls outside the preset distance threshold range, the terminal is not currently located on the corresponding ring shooting point or on the acquisition guide layer and position calibration is required, so the terminal may display the acquisition pointing identifier 1010 (not shown in the figure) in red while displaying corresponding distance calibration prompt information such as "the distance is too close, please stand at the blue point to shoot" or "the distance is too far, please stand at the blue point to shoot". Distance calibration thus ensures that the terminal always keeps essentially the same distance from the target object during acquisition, effectively guaranteeing the accuracy and quality of image acquisition.
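A minimal sketch of the distance check in fig. 10; the prompt strings are paraphrased placeholders, and a null result means the identifier keeps the fourth (normal) display style:

```kotlin
fun distancePrompt(distance: Double, range: ClosedFloatingPointRange<Double>): String? = when {
    distance < range.start -> "the distance is too close, please stand at the blue point to shoot"
    distance > range.endInclusive -> "the distance is too far, please stand at the blue point to shoot"
    else -> null  // within the distance threshold range: no calibration prompt
}
```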
As for the pitch angle of the terminal relative to the target object during acquisition: to ensure that the shooting angles of all points at the same height are consistent, the terminal may detect whether the included angle between the aiming point of its camera and the height layer being shot (namely the pitch angle) satisfies the pitch angle range in the calibration parameters, so as to feed back and calibrate the user's acquisition behavior. Specifically, the terminal may obtain its real-time pitch angle during acquisition. If the real-time pitch angle is outside the pitch angle range, angle calibration prompt information and a calibration sight are displayed in the graphical user interface, the acquisition pointing identifier is displayed in the third display style, and a real-time angle sight is displayed at the head position of the acquisition pointing identifier pointing to the target object. If the user then moves the terminal to calibrate, the terminal may respond to the movement of the electronic terminal by making the real-time angle sight move toward the calibration sight; once the coincidence degree between the real-time angle sight and the calibration sight reaches a preset threshold, both sights are hidden and the acquisition pointing identifier is displayed in the fourth display style. In addition, if the real-time pitch angle is within the pitch angle range, the acquisition pointing identifier is displayed in the fourth display style. In this way, when the pitch angle of the terminal camera relative to the target object does not satisfy the calibration condition during acquisition, the user is, on the one hand, prompted through the displayed calibration prompt information to perform pitch angle calibration and, on the other hand, given the real-time angle sight and the calibration sight so that the pitch angle can be calibrated conveniently, effectively ensuring that the terminal always keeps essentially the same height while acquiring images of the object to be acquired and further guaranteeing the accuracy and quality of image acquisition.
Optionally, the real-time angle sight may move along with the movement of the terminal, while the calibration sight may be a fixed sight displayed on the target object that does not change with the terminal's movement. When pitch angle calibration is required, the interaction between the moving real-time angle sight and the fixed calibration sight can thus effectively guide the user to calibrate the angle, so that the terminal always keeps essentially the same height when acquiring images of the target object.
The pitch angle range may be the angle corresponding to the optimal shooting height of the target object; at the same time, to ensure the smoothness of acquisition, an error value corresponding to this angle is set, which allows flexible handling during acquisition. In addition, for strong reminder feedback, the terminal may display pitch angle calibration prompt information to prompt the user to calibrate, and may interrupt the image acquisition of the object to be acquired when the real-time pitch angle (in the video frame extraction acquisition mode) does not satisfy the pitch angle range, resuming acquisition after the user has finished calibrating the pitch angle of the terminal.
In an example, referring to fig. 11, which shows a schematic diagram of pitch angle calibration provided in an embodiment of the present invention: during image acquisition the terminal may obtain the real-time pitch angle and, if it satisfies the condition, prompt the user to continue shooting. If the real-time pitch angle is outside the pitch angle range, the terminal may display the acquisition pointing identifier 1110 (not shown in the figure) in red, display the pitch angle calibration prompt information 1120 such as "adjust the shooting angle and keep the same height", display the real-time angle sight 1130 at the head position of the acquisition pointing identifier 1110 and display the calibration sight 1140 on the target object. The user can then move the terminal until the coincidence degree between the real-time angle sight 1130 and the calibration sight 1140 reaches the preset threshold, completing the pitch angle calibration. Calibrating the pitch angle thus effectively ensures that the terminal always keeps essentially the same height while acquiring images of the object to be acquired, further guaranteeing the accuracy and quality of image acquisition.
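A minimal sketch of the pitch calibration condition in fig. 11, assuming the overlap between the real-time angle sight and the calibration sight is reported as a ratio in [0, 1]; the threshold value is an assumption:

```kotlin
fun pitchCalibrationDone(
    realTimePitchDeg: Double,
    pitchRange: ClosedFloatingPointRange<Double>,
    sightOverlap: Double,
    overlapThreshold: Double = 0.9
): Boolean {
    if (realTimePitchDeg in pitchRange) return true  // angle already satisfies the condition
    // Outside the range: both sights are shown, and calibration completes once
    // their coincidence degree reaches the preset threshold.
    return sightOverlap >= overlapThreshold
}
```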
Besides the calibration prompts for the horizontal distance and the real-time pitch angle, the movement speed of the terminal is another important factor affecting acquisition quality. The terminal may therefore respond to the movement of the handheld terminal by obtaining the real-time movement speed and displaying a real-time speed value corresponding to the acquisition guide layer in the graphical user interface. If the real-time movement speed is within the speed threshold range, the acquisition pointing identifier is displayed in the fourth display style; if it is outside that range, speed control prompt information is displayed in the acquisition interface and the real-time speed value is displayed in a fifth display style. When the speed is too high, the camera tends to fail to focus effectively, so the user can be reminded to control the movement speed; when the speed is too low, the acquisition process tends to lengthen and acquisition efficiency drops, so the user can be reminded to increase the movement speed appropriately. Detecting the movement speed of the terminal can thus further improve the quality of the acquired images.
In an example, referring to fig. 12, which shows a schematic diagram of speed calibration provided in an embodiment of the present invention: during image acquisition, when the speed at which the user moves or rotates the terminal does not satisfy the speed threshold range, the terminal may set the displayed real-time speed value 1210 to red and display the corresponding speed control prompt information 1220 to remind the user to control the movement speed, thereby ensuring the accuracy and quality of image acquisition. In addition, in the video frame extraction acquisition mode the corresponding calibration items further include the pitch angle, the horizontal distance and the like; when the horizontal distance does not satisfy the acquisition condition, the terminal may give abnormal feedback on the acquisition direction identifier and the completed acquisition progress, for example displaying them in green under normal conditions and in red under abnormal conditions to prompt the user to calibrate. The horizontal distance calibration process is the same as described above and is not repeated here.
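A minimal sketch of the speed check in fig. 12; the prompt texts are placeholders, and a null result means the real-time speed value keeps its normal display style:

```kotlin
fun speedPrompt(speed: Double, range: ClosedFloatingPointRange<Double>): String? = when {
    speed > range.endInclusive -> "too fast, please slow down so the camera can focus"
    speed < range.start -> "too slow, please speed up to keep acquisition efficient"
    else -> null  // within the speed threshold range: no control prompt
}
```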
It should be noted that the above description takes the calibration prompt processes of the multi-point acquisition mode and the video frame extraction acquisition mode as examples. It will be understood that the above calibration prompt manners may apply to, but are not limited to, any acquisition mode, and in practice the above processes may be combined reasonably to achieve a better calibration prompt effect, which the present invention does not limit.
The calibration prompt process above involves only single-parameter calibration prompts, while in practice at least two pieces of calibration prompt information may exist at the same time. In that case the prompt priorities corresponding to the pieces of calibration prompt information are obtained respectively, and the calibration prompt information with the highest prompt priority is displayed; if the terminal state corresponding to that information then satisfies the condition and calibration prompt information of the next prompt priority exists, the latter is displayed. Displaying the calibration prompt information in this graded manner lets the user calibrate item by item, which effectively ensures calibration accuracy, reduces the user's anxiety about image acquisition and improves the user experience of image acquisition.
Optionally, as to the prompt priority, when several parameters need calibration the pitch angle calibration prompt information may be displayed first, then the distance calibration prompt information, then the direction calibration prompt information; if the acquisition mode is the video frame extraction acquisition mode, the speed calibration prompt information may be displayed last.
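A minimal sketch of this graded prompt order, assuming pending calibration items are tracked as a set; only the highest-priority pending prompt is shown at a time:

```kotlin
// Declared in descending priority: pitch, then distance, then direction, then speed.
enum class CalibrationItem { PITCH, DISTANCE, DIRECTION, SPEED }

fun nextPrompt(pending: Set<CalibrationItem>): CalibrationItem? =
    CalibrationItem.values().firstOrNull { it in pending }

fun main() {
    val pending = setOf(CalibrationItem.SPEED, CalibrationItem.DISTANCE)
    println(nextPrompt(pending))  // DISTANCE is shown first; SPEED follows once distance is fixed
}
```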
In addition, because the user performs ring shooting along the acquisition guide layer with the handheld terminal, a user who wants to check the acquisition progress during image acquisition can move the terminal away from the target object. The terminal may respond to the movement of the electronic terminal and, if the current position of the electronic terminal is outside the acquisition guide layer but within the acquisition range, display the overall acquisition progress for the target object on the acquisition guide layer. For example, referring to fig. 13, which shows a schematic diagram of the acquisition progress provided in an embodiment of the present invention and corresponds to the video frame extraction acquisition mode: when the camera of the terminal is outside the acquisition guide layer and within the acquisition range, the terminal may display the already-acquired portion and the not-yet-acquired portion of the acquisition guide layer in different display styles. Based on the cooperation between the acquisition guide layer and the AR range positioning layer, the user can thus view the current acquisition progress simply by holding the terminal outside the range corresponding to the acquisition guide layer while staying within the acquisition range, which on the one hand improves the convenience of image acquisition and on the other hand reduces the user's anxiety by displaying the acquisition progress.
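A minimal sketch of the visibility condition in fig. 13, assuming concentric circles on the ground plane; names are illustrative:

```kotlin
import kotlin.math.hypot

// The overall acquisition progress is shown only while the terminal stands
// outside the guide circle but still inside the acquisition range.
fun shouldShowProgress(
    terminalX: Double, terminalY: Double,
    centerX: Double, centerY: Double,
    guideRadius: Double, rangeRadius: Double
): Boolean {
    val d = hypot(terminalX - centerX, terminalY - centerY)
    return d > guideRadius && d <= rangeRadius
}
```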
And step 103, responding to the movement of the electronic terminal, controlling the electronic terminal to perform circular shooting movement on the acquisition guide layer along a preset direction, and performing image acquisition on a target object positioned in the center of the acquisition guide layer to obtain a target image corresponding to the target object.
For image acquisition, in the multi-point acquisition mode the terminal may respond to the movement of the electronic terminal by controlling it to acquire images of the target object located at the center of the acquisition guide layer at each ring shooting point, obtaining the target image corresponding to the target object. In the video frame extraction acquisition mode, the terminal may respond to the movement of the electronic terminal by controlling it to circle at least once, in the direction corresponding to the acquisition direction identifier, around the target object located at the center of the acquisition guide layer as the central axis, while acquiring images of the target object to obtain the corresponding target image. Positioning and guiding image acquisition through the determined acquisition guide layer thus not only effectively ensures the stability of the image acquisition process but also guarantees the quality of the acquired images.
Optionally, upon entering the shooting mode, a corresponding image acquisition control may be displayed in the graphical user interface. In the video frame extraction mode, the user may first tap the image acquisition control to trigger video recording and then hold the terminal to ring-shoot the target object based on the acquisition guide layer; tapping the image acquisition control again pauses the recording. Alternatively, recording may be performed while the image acquisition control is long-pressed and paused when the finger leaves the graphical user interface; long-pressing the control again continues recording the target object from the ring shooting point recorded at the interruption.
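A minimal sketch of the two trigger styles just described (tap to toggle, long-press to hold); storing the ring-shooting angle for resuming is an illustrative assumption:

```kotlin
class RecordingController {
    var recording = false
        private set
    var resumeAngleDeg: Double? = null  // ring shooting point recorded at interruption
        private set

    fun onTap(currentAngleDeg: Double) {
        if (recording) pause(currentAngleDeg) else recording = true
    }

    fun onLongPressDown() { recording = true }                            // finger pressed: record
    fun onLongPressUp(currentAngleDeg: Double) = pause(currentAngleDeg)   // finger lifted: pause

    private fun pause(currentAngleDeg: Double) {
        recording = false
        resumeAngleDeg = currentAngleDeg
    }
}
```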
It should be noted that the embodiments of the present invention include, but are not limited to, the foregoing examples; it will be understood that those skilled in the art may also configure them according to actual requirements under the guidance of the concepts of the embodiments of the present invention, and the present invention is not limited thereto.
In the embodiment of the invention, during image acquisition the real scene to be acquired can be displayed through the graphical user interface of the terminal. Before acquisition, the terminal may respond to the range positioning instruction by displaying the AR range positioning layer in the displayed real scene and determining, in the real scene, the acquisition range corresponding to the AR range positioning layer; it may then respond to the guide positioning instruction for the acquisition range by displaying the acquisition indication layer corresponding to that instruction in the AR range positioning layer and determining the acquisition guide layer corresponding to the acquisition indication layer within the acquisition range. On the one hand, the terminal can determine the acquisition range in the real scene based on AR technology, so that the user performs image acquisition within a fixed acquisition range, which effectively improves the effectiveness of user operation in the subsequent acquisition process; on the other hand, determining the acquisition guide layer within the acquisition range through the cooperation between the AR range positioning layer and the acquisition indication layer effectively reduces the difficulty of the user's acquisition operation. During acquisition, the terminal can respond to the movement of the electronic terminal by controlling it to perform the ring shooting motion along the acquisition guide layer in the preset direction and acquiring images of the target object located at the center of the acquisition guide layer to obtain the corresponding target image, which ensures both the stability of image acquisition and the quality of the acquired images.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the invention.
Referring to fig. 14, a block diagram of an AR-based image capturing device according to an embodiment of the present invention is shown; the content displayed through the graphical user interface of the electronic terminal includes at least a real scene, and the device may specifically include the following modules:
an acquisition range determining module 1401, configured to display an AR range positioning layer in the real scene in response to a range positioning instruction, and determine an acquisition range corresponding to the AR range positioning layer in the real scene;
an acquisition guide layer determination module 1402, configured to display, in response to a guide positioning instruction for the acquisition range, an acquisition indication layer corresponding to the guide positioning instruction in the AR range positioning layer, and determine an acquisition guide layer corresponding to the acquisition indication layer within the acquisition range;
The target image acquisition module 1403 is configured to control the electronic terminal to perform loop shooting motion on the acquisition guide layer along a preset direction in response to the motion of the electronic terminal, and perform image acquisition on a target object located at the center of the acquisition guide layer, so as to obtain a target image corresponding to the target object.
In an alternative embodiment, the acquisition range determining module 1401 is specifically configured to:
displaying an AR range positioning layer in the real scene;
responding to a dragging operation aiming at the AR range positioning layer, controlling the AR range positioning layer to move in the real scene, and acquiring a scene image corresponding to a coverage area of the AR range positioning layer in the real scene;
displaying the AR range positioning layer in the real scene according to the scene image in a preset display mode, responding to a range positioning instruction aiming at the AR range positioning layer, and taking the coverage area of the AR range positioning layer in the real scene as an acquisition range corresponding to the range positioning instruction.
In an alternative embodiment, the acquisition range determining module 1401 is specifically configured to:
if the surface of the horizontal plane corresponding to the scene image is flat and has no obstacle, displaying the AR range positioning layer on the real scene in a first display mode;
And if the horizontal plane corresponding to the scene image is uneven and/or an obstacle exists, displaying the AR range positioning layer in the real scene in a second display mode.
In an alternative embodiment, the acquisition guide layer determining module 1402 is specifically configured to:
responding to a guide positioning instruction aiming at the acquisition range, and displaying an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer in the AR range positioning layer;
if a target object to be subjected to image acquisition exists in the acquisition range of the real scene, responding to the motion of the electronic terminal, and controlling the acquisition indication layer to move on the AR range positioning layer so that the landing point of the guide line is positioned on the target object;
and in the case that the guide line landing point is positioned on the target object, in response to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying the acquisition guide layer for the target object in the acquisition range.
In an alternative embodiment, the acquisition guide layer determining module 1402 is further specifically configured to:
If the target object does not exist in the acquisition range of the real scene, responding to the motion of the electronic terminal, and controlling the acquisition indication layer to move on the AR range positioning layer so that the landing point of the guide line is positioned at the center of the AR range positioning layer;
and when the guide line landing point is positioned at the center of the AR range positioning layer, responding to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying the acquisition guide layer corresponding to the acquisition range.
In an alternative embodiment, the acquisition guide layer determining module 1402 is specifically configured to:
and responding to touch operation of any position except the acquisition indication layer in the graphical user interface, and placing the acquisition indication layer on the AR range positioning layer.
In an alternative embodiment, further comprising:
and the radius adjusting module is used for responding to the radius adjusting operation for the acquisition guiding area, adjusting the display radius of the acquisition guiding layer according to the radius adjusting operation and simultaneously displaying the radius value of the acquisition guiding layer.
In an alternative embodiment, further comprising:
And the position abnormality prompting module is used for responding to the movement of the acquisition indication layer in the AR range positioning layer, displaying position abnormality prompting information aiming at the acquisition indication layer in the graphical user interface if the acquisition indication layer contacts with the edge of the AR range positioning layer, and setting the acquisition indication layer in a non-placeable state.
In an alternative embodiment, further comprising:
a reset control display module for displaying a reset control for the acquisition guide layer in the graphical user interface;
and the resetting module is used for responding to the touch operation of the resetting control, canceling the acquisition guiding layer and redisplaying the acquisition guiding layer on the AR range positioning layer.
In an alternative embodiment, further comprising:
the confirmation control display module is used for displaying a confirmation control aiming at the acquisition guide layer in the graphical user interface;
the acquisition mode determining module is used for responding to the touch operation of the determining control, acquiring a point closest to the electronic terminal from the acquisition guide layer as an initial acquisition point, and acquiring an acquisition mode of the current image acquisition;
And the identification display module is used for displaying a point location identification and an acquisition direction identification corresponding to the initial acquisition point on the acquisition guide layer by taking the initial acquisition point as the initial point if the acquisition mode is a video frame extraction acquisition mode.
In an alternative embodiment, further comprising:
the collection pointing identification display module is used for displaying collection pointing identification of a pointing target object in the graphical user interface;
the parameter acquisition module is used for responding to the movement of the electronic terminal and acquiring acquisition state parameters corresponding to the electronic terminal and calibration parameters corresponding to the acquisition guide layer in the process of performing circular shooting movement along the acquisition guide layer;
and the prompt information display module is used for displaying the acquisition prompt information aiming at the target object in the graphical user interface according to the acquisition state parameter and the calibration parameter, wherein the acquisition prompt information comprises the calibration prompt information for prompting a user to adjust the terminal state of the electronic terminal and the normal prompt information for prompting the user to continuously keep the terminal state of the electronic terminal.
In an optional embodiment, the acquisition state parameter includes a horizontal distance between the electronic terminal and a center point of the acquisition guiding layer, the calibration parameter includes a distance threshold range corresponding to the acquisition guiding layer, and the prompt information display module is specifically configured to:
If the horizontal distance is outside the distance threshold range, displaying distance calibration prompt information in the graphical user interface and displaying the acquisition pointing identifier in a third display mode;
and if the horizontal distance is within the distance threshold range, displaying the acquisition pointing identifier in the graphical user interface in a fourth display mode.
In an alternative embodiment, the acquisition state parameter includes a real-time movement speed, the calibration parameter includes a speed threshold range, and the prompt information display module is specifically configured to:
displaying a real-time speed value corresponding to the acquisition guide layer in the graphical user interface;
if the real-time movement speed is within the speed threshold range, displaying the acquisition pointing identifier in the graphical user interface by adopting a fourth display mode;
and if the real-time movement speed is out of the speed threshold range, displaying speed control prompt information in the acquisition interface and displaying the real-time speed value by adopting a fifth display mode.
In an optional embodiment, the acquisition state parameter includes a real-time pitch angle, the calibration parameter includes a pitch angle range, and the prompt information display module is specifically configured to:
if the real-time pitch angle is outside the pitch angle range, displaying angle calibration prompt information and a calibration sight in the graphical user interface, displaying the acquisition pointing identifier in a third display style, and displaying the real-time angle sight at the head position of the acquisition pointing identifier pointing to the target object;
and responding to the movement of the electronic terminal, controlling the movement of the electronic terminal so that the real-time angle sight moves towards the calibration sight; if the coincidence degree between the real-time angle sight and the calibration sight reaches a preset threshold, hiding the real-time angle sight and the calibration sight, and displaying the acquisition pointing identifier in a fourth display style.
In an alternative embodiment, further comprising:
the priority acquisition module is used for respectively acquiring prompt priorities corresponding to the calibration prompt messages if at least two calibration prompt messages exist, and displaying the calibration prompt message with the highest prompt priority;
the hierarchical display module is used for displaying the calibration prompt information of the next-stage prompt priority if the terminal state corresponding to the calibration prompt information with the highest prompt priority meets the conditions and the calibration prompt information of the next-stage prompt priority exists.
In an alternative embodiment, further comprising:
and the acquisition progress display module is used for responding to the movement of the electronic terminal, and displaying the whole acquisition progress aiming at the target object on the acquisition guide layer if the current position of the electronic terminal is positioned outside the acquisition guide layer and within the acquisition range.
In an alternative embodiment, the target image acquisition module 1403 is specifically configured to:
and responding to the movement of the electronic terminal, controlling the electronic terminal to surround at least one circle along the direction corresponding to the acquisition direction mark by taking a target object positioned in the center of the acquisition guide layer as a central axis, and acquiring an image of the target object to obtain a target image corresponding to the target object.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
In addition, the embodiment of the invention also provides an electronic device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing each process of the above AR-based image acquisition method embodiment and achieving the same technical effects; to avoid repetition, details are not repeated here.
The embodiment of the invention also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements each process of the above AR-based image acquisition method embodiment and achieves the same technical effects, which are not repeated here to avoid repetition. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk.
Fig. 15 is a schematic hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 150 includes, but is not limited to: a radio frequency unit 151, a network module 152, an audio output unit 153, an input unit 154, a sensor 155, a display unit 156, a user input unit 157, an interface unit 158, a memory 159, a processor 1510, and a power supply 1511. Those skilled in the art will appreciate that the electronic device structure shown in fig. 15 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than shown, or may combine certain components, or may have a different arrangement of components. In the embodiment of the invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 151 may be used for receiving and transmitting signals during information transmission and reception or during a call; specifically, it receives downlink data from a base station and delivers it to the processor 1510 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 151 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 151 may also communicate with networks and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 152, such as helping the user to send and receive e-mail, browse web pages, and access streaming media, etc.
The audio output unit 153 may convert audio data received by the radio frequency unit 151 or the network module 152 or stored in the memory 159 into an audio signal and output as sound. Also, the audio output unit 153 may also provide audio outputs (e.g., call signal receiving sounds, message receiving sounds, etc.) related to specific functions performed by the electronic device 150. The audio output unit 153 includes a speaker, a buzzer, a receiver, and the like.
The input unit 154 is for receiving an audio or video signal. The input unit 154 may include a graphics processor (Graphics Processing Unit, GPU) 1541 and a microphone 1542, the graphics processor 1541 processing image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 156. The image frames processed by the graphics processor 1541 may be stored in the memory 159 (or other storage medium) or transmitted via the radio frequency unit 151 or the network module 152. The microphone 1542 may receive sound and be capable of processing such sound into audio data. The processed audio data may be converted into a format output that can be transmitted to the mobile communication base station via the radio frequency unit 151 in the case of a phone call mode.
The electronic device 150 also includes at least one sensor 155, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1561 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1561 and/or the backlight when the electronic device 150 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for recognizing the gesture of the electronic equipment (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and knocking), and the like; the sensor 155 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described herein.
The display unit 156 is used to display information input by a user or information provided to the user. The display unit 156 may include a display panel 1561, and the display panel 1561 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 157 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. In particular, user input unit 157 includes touch panel 1571 and other input devices 1572. Touch panel 1571, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on touch panel 1571 or thereabout using any suitable object or accessory such as a finger, stylus, or the like). Touch panel 1571 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device and converts it into touch point coordinates, which are then sent to the processor 1510, and receives commands from the processor 1510 for execution. In addition, touch panel 1571 may be implemented using various types of resistive, capacitive, infrared, surface acoustic wave, and the like. In addition to touch panel 1571, user input unit 157 may also include other input devices 1572. In particular, other input devices 1572 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
Further, touch panel 1571 may be overlaid on display panel 1561; when touch panel 1571 detects a touch operation on or near it, the operation is transmitted to processor 1510 to determine the type of touch event, whereupon processor 1510 provides a corresponding visual output on display panel 1561 based on the type of touch event. Although in fig. 15 touch panel 1571 and display panel 1561 are shown as two separate components to implement the input and output functions of the electronic device, in some embodiments touch panel 1571 may be integrated with display panel 1561 to implement those functions, which is not limited herein.
The interface unit 158 is an interface for connecting an external device to the electronic apparatus 150. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 158 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 150 or may be used to transmit data between the electronic apparatus 150 and an external device.
The memory 159 may be used to store software programs and various data. The memory 159 may mainly include a storage program area that may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and a storage data area; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, the memory 159 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 1510 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 159, and calling data stored in the memory 159, thereby performing overall monitoring of the electronic device. The processor 1510 may include one or more processing units; preferably, the processor 1510 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 1510.
The electronic device 150 may also include a power supply 1511 (e.g., a battery) for powering the various components, and preferably the power supply 1511 may be logically connected to the processor 1510 via a power management system that performs functions such as managing charging, discharging, and power consumption.
In addition, the electronic device 150 includes some functional modules, which are not shown, and will not be described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, or of course by means of hardware, though in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied, essentially or in the part contributing to the prior art, in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present invention and the scope of the claims, which are to be protected by the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.
Claims (20)
1. An AR-based image acquisition method, characterized in that content displayed through a graphical user interface of an electronic terminal comprises at least a real scene, the method comprising:
responding to a range positioning instruction, displaying an AR range positioning layer in the real scene, and determining an acquisition range corresponding to the AR range positioning layer in the real scene;
in response to a guiding and positioning instruction for the acquisition range, displaying an acquisition indication layer corresponding to the guiding and positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer in the AR range positioning layer, and determining an acquisition guide layer corresponding to the acquisition indication layer in the acquisition range, wherein the guide line landing point is configured to be positioned on the center of the AR range positioning layer or on a target object in the acquisition range when the acquisition indication layer is controlled to move in the AR range positioning layer;
and responding to the movement of the electronic terminal, controlling the electronic terminal to perform circular shooting movement on the acquisition guide layer along a preset direction, and performing image acquisition on a target object positioned in the center of the acquisition guide layer to obtain a target image corresponding to the target object.
2. The method of claim 1, wherein the displaying an AR range location layer in the real scene and determining an acquisition range corresponding to the AR range location layer in the real scene in response to a range location instruction comprises:
displaying an AR range positioning layer in the real scene;
responding to a dragging operation aiming at the AR range positioning layer, controlling the AR range positioning layer to move in the real scene, and acquiring a scene image corresponding to a coverage area of the AR range positioning layer in the real scene;
displaying the AR range positioning layer in the real scene according to the scene image in a preset display mode, responding to a range positioning instruction aiming at the AR range positioning layer, and taking the coverage area of the AR range positioning layer in the real scene as an acquisition range corresponding to the range positioning instruction.
3. The method of claim 2, wherein the displaying the AR range positioning layer in the real scene in a preset display mode according to the scene image comprises:
if the horizontal plane corresponding to the scene image is flat and free of obstacles, displaying the AR range positioning layer in the real scene in a first display mode; and
if the horizontal plane corresponding to the scene image is uneven and/or an obstacle is present, displaying the AR range positioning layer in the real scene in a second display mode (this branch is sketched after the claims).
4. The method of claim 1, wherein the displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer in response to the guide positioning instruction for the acquisition range, and determining, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer, comprises:
in response to a guide positioning instruction for the acquisition range, displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer;
if a target object to be subjected to image acquisition exists within the acquisition range in the real scene, controlling, in response to movement of the electronic terminal, the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point is positioned on the target object; and
when the guide line landing point is positioned on the target object, in response to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying the acquisition guide layer for the target object in the acquisition range.
5. The method of claim 4, wherein the displaying, in the AR range positioning layer, an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer in response to the guide positioning instruction for the acquisition range, and determining, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer, further comprises:
if no target object exists within the acquisition range in the real scene, controlling, in response to movement of the electronic terminal, the acquisition indication layer to move on the AR range positioning layer so that the guide line landing point is positioned at the center of the AR range positioning layer; and
when the guide line landing point is positioned at the center of the AR range positioning layer, in response to a guide positioning instruction for the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer, and displaying an acquisition guide layer corresponding to the acquisition range.
6. The method of claim 4 or 5, wherein the placing the acquisition indication layer on the AR range positioning layer in response to a guide positioning instruction for the acquisition indication layer comprises:
in response to a touch operation at any position in the graphical user interface other than the acquisition indication layer, placing the acquisition indication layer on the AR range positioning layer.
7. The method of claim 1, 4 or 5, further comprising:
in response to a radius adjustment operation for the acquisition guide layer, adjusting the display radius of the acquisition guide layer according to the radius adjustment operation, and simultaneously displaying the radius value of the acquisition guide layer.
8. The method of claim 1, 4 or 5, further comprising:
in response to movement of the acquisition indication layer in the AR range positioning layer, if the acquisition indication layer contacts the edge of the AR range positioning layer, displaying position abnormality prompt information for the acquisition indication layer in the graphical user interface, and setting the acquisition indication layer to a non-placeable state.
9. The method of claim 1, 4 or 5, further comprising:
displaying a reset control for the acquisition guide layer in the graphical user interface; and
in response to a touch operation on the reset control, canceling the acquisition guide layer and redisplaying the acquisition indication layer on the AR range positioning layer.
10. The method of claim 1, 4 or 5, further comprising:
displaying a determination control for the acquisition guide layer in the graphical user interface;
in response to a touch operation on the determination control, taking the point on the acquisition guide layer closest to the electronic terminal as an initial acquisition point, and acquiring an acquisition mode of the current image acquisition (the closest-point computation is sketched after the claims); and
if the acquisition mode is a video frame extraction acquisition mode, displaying, on the acquisition guide layer, a point position identifier and an acquisition direction identifier corresponding to the initial acquisition point, with the initial acquisition point as the starting point.
11. The method of claim 10, further comprising:
displaying, in the graphical user interface, an acquisition pointing identifier that points to the target object;
in response to movement of the electronic terminal, acquiring, during the circular shooting movement along the acquisition guide layer, acquisition state parameters corresponding to the electronic terminal and calibration parameters corresponding to the acquisition guide layer; and
displaying, in the graphical user interface, acquisition prompt information for the target object according to the acquisition state parameters and the calibration parameters, wherein the acquisition prompt information comprises calibration prompt information for prompting a user to adjust the terminal state of the electronic terminal and normal prompt information for prompting the user to maintain the terminal state of the electronic terminal.
12. The method of claim 11, wherein the acquisition state parameters comprise a horizontal distance between the electronic terminal and the center point of the acquisition guide layer, the calibration parameters comprise a distance threshold range corresponding to the acquisition guide layer, and the displaying acquisition prompt information for the target object in the graphical user interface according to the acquisition state parameters and the calibration parameters comprises:
if the horizontal distance is outside the distance threshold range, displaying distance calibration prompt information in the graphical user interface, and displaying the acquisition pointing identifier in a third display mode; and
if the horizontal distance is within the distance threshold range, displaying the acquisition pointing identifier in the graphical user interface in a fourth display mode (the threshold checks of claims 12-14 are sketched after the claims).
13. The method of claim 11, wherein the acquisition state parameters comprise a real-time movement speed, the calibration parameters comprise a speed threshold range, and the displaying acquisition prompt information for the target object in the graphical user interface according to the acquisition state parameters and the calibration parameters comprises:
displaying, in the graphical user interface, a real-time speed value corresponding to the acquisition guide layer;
if the real-time movement speed is within the speed threshold range, displaying the acquisition pointing identifier in the graphical user interface in the fourth display mode; and
if the real-time movement speed is outside the speed threshold range, displaying speed control prompt information in the graphical user interface, and displaying the real-time speed value in a fifth display mode.
14. The method of claim 11, wherein the acquisition state parameters comprise a real-time pitch angle, the calibration parameters comprise a pitch angle range, and the displaying acquisition prompt information for the target object in the graphical user interface according to the acquisition state parameters and the calibration parameters comprises:
if the real-time pitch angle is outside the pitch angle range, displaying angle calibration prompt information and a calibration sight in the graphical user interface, displaying the acquisition pointing identifier in the third display mode, and displaying a real-time angle sight at the head position of the acquisition pointing identifier that points to the target object; and
in response to movement of the electronic terminal, moving the real-time angle sight toward the calibration sight, and, if the coincidence degree between the real-time angle sight and the calibration sight reaches a preset threshold, hiding the real-time angle sight and the calibration sight, and displaying the acquisition pointing identifier in the fourth display mode (a coincidence-degree sketch follows the claims).
15. The method of claim 11, further comprising:
if at least two pieces of calibration prompt information exist, respectively acquiring the prompt priority corresponding to each piece of calibration prompt information, and displaying the calibration prompt information with the highest prompt priority; and
if the terminal state corresponding to the calibration prompt information with the highest prompt priority satisfies its condition and calibration prompt information of the next prompt priority exists, displaying the calibration prompt information of the next prompt priority (the priority selection is sketched after the claims).
16. The method of claim 1, 4 or 5, further comprising:
in response to movement of the electronic terminal, if the current position of the electronic terminal is outside the acquisition guide layer but within the acquisition range, displaying, on the acquisition guide layer, the overall acquisition progress for the target object.
17. The method of claim 10, wherein the controlling, in response to movement of the electronic terminal, the electronic terminal to perform a circular shooting movement along the acquisition guide layer in a preset direction, and performing image acquisition on a target object located at the center of the acquisition guide layer to obtain a target image corresponding to the target object, comprises:
in response to movement of the electronic terminal, controlling the electronic terminal to circle at least once, in the direction corresponding to the acquisition direction identifier, around the target object located at the center of the acquisition guide layer as the central axis, and performing image acquisition on the target object to obtain a target image corresponding to the target object.
18. An AR-based image acquisition device, wherein content displayed through a graphical user interface of an electronic terminal comprises at least a real scene, the device comprising:
an acquisition range determining module, configured to display, in response to a range positioning instruction, an AR range positioning layer in the real scene, and determine, in the real scene, an acquisition range corresponding to the AR range positioning layer;
an acquisition guide layer determining module, configured to display, in response to a guide positioning instruction for the acquisition range, an acquisition indication layer corresponding to the guide positioning instruction and a guide line landing point corresponding to the center of the acquisition indication layer in the AR range positioning layer, and determine, in the acquisition range, an acquisition guide layer corresponding to the acquisition indication layer, wherein the guide line landing point is configured to land on the center of the AR range positioning layer or on a target object in the acquisition range while the acquisition indication layer is controlled to move in the AR range positioning layer; and
a target image acquisition module, configured to control, in response to movement of the electronic terminal, the electronic terminal to perform a circular shooting movement along the acquisition guide layer in a preset direction, and perform image acquisition on a target object located at the center of the acquisition guide layer to obtain a target image corresponding to the target object.
19. An electronic device comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other via the communication bus;
the memory is configured to store a computer program; and
the processor is configured to implement the method of any one of claims 1-17 when executing the program stored in the memory.
20. A computer-readable storage medium having instructions stored thereon which, when executed by one or more processors, cause the one or more processors to perform the method of any one of claims 1-17.
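For orientation, claims 1 and 17 amount to sampling camera poses on a circle (the acquisition guide layer) whose central axis passes through the target object. Below is a minimal sketch of that geometry in Python; the function and parameter names are illustrative and do not appear in the patent, and a real terminal would follow the user's physical movement rather than precomputed waypoints.

```python
import math

def loop_waypoints(center, radius, n=36, clockwise=True):
    """Sample n camera poses on the acquisition guide circle.

    center    -- (x, z) ground-plane position of the target object
    radius    -- display radius of the acquisition guide layer, in metres
    clockwise -- the preset circular shooting direction
    Each pose faces the circle's centre, so the target object stays
    centred in frame while the terminal circles it at least once.
    """
    sign = -1.0 if clockwise else 1.0
    poses = []
    for i in range(n):
        theta = sign * 2.0 * math.pi * i / n
        x = center[0] + radius * math.cos(theta)
        z = center[1] + radius * math.sin(theta)
        yaw = math.atan2(center[1] - z, center[0] - x)  # look-at angle toward the target
        poses.append((x, z, yaw))
    return poses
```

A frame captured near each pose, or frames extracted from video at such angular steps (compare the video frame extraction acquisition mode of claim 10), would together cover the full circle around the target.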
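Claim 3's placement feedback is a two-way branch on the scene-image analysis. A trivial sketch with hypothetical names, assuming the flatness/obstacle detection is done elsewhere (e.g. by an AR framework's plane estimation):

```python
def range_layer_display_mode(plane_is_flat: bool, has_obstacle: bool) -> str:
    """Choose the display mode of the AR range positioning layer from
    the scene image covered by the layer (claim 3)."""
    if plane_is_flat and not has_obstacle:
        return "first_display_mode"   # e.g. placement allowed
    return "second_display_mode"      # uneven surface and/or obstacle present
```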
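Claim 10 takes the point on the acquisition guide layer closest to the electronic terminal as the initial acquisition point. Geometrically this is the projection of the terminal's ground position onto the guide circle; a sketch under that assumption (names hypothetical):

```python
import math

def initial_acquisition_point(center, radius, terminal):
    """Closest point on the acquisition guide circle to the terminal.

    center, terminal -- (x, z) ground-plane positions
    radius           -- radius of the acquisition guide layer
    """
    dx, dz = terminal[0] - center[0], terminal[1] - center[1]
    d = math.hypot(dx, dz)
    if d == 0.0:
        # Terminal exactly above the centre: every circle point is equally
        # close; pick theta = 0 by convention.
        return (center[0] + radius, center[1])
    return (center[0] + radius * dx / d, center[1] + radius * dz / d)
```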
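Claims 12-14 all follow the same pattern: compare one acquisition state parameter against a threshold range from the calibration parameters and emit the matching calibration prompt. A combined sketch; the numeric ranges are placeholders, since the patent leaves the concrete values open:

```python
from dataclasses import dataclass

@dataclass
class CalibrationParams:
    # Placeholder ranges; the patent does not fix concrete values.
    distance_range: tuple = (1.8, 2.2)   # metres from the centre point (claim 12)
    speed_range: tuple = (0.1, 0.5)      # metres per second (claim 13)
    pitch_range: tuple = (-10.0, 10.0)   # degrees (claim 14)

def calibration_prompts(distance, speed, pitch, p=CalibrationParams()):
    """Return the active calibration prompts; an empty list means the
    terminal state is fine and normal prompt information can be shown."""
    prompts = []
    if not (p.distance_range[0] <= distance <= p.distance_range[1]):
        prompts.append("distance_calibration")
    if not (p.speed_range[0] <= speed <= p.speed_range[1]):
        prompts.append("speed_control")
    if not (p.pitch_range[0] <= pitch <= p.pitch_range[1]):
        prompts.append("angle_calibration")
    return prompts
```

An out-of-range parameter switches the acquisition pointing identifier to the third display mode; once the parameter returns to its range, the fourth display mode is restored.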
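Claim 14 hides the sights once the coincidence degree between the real-time angle sight and the calibration sight reaches a preset threshold. The patent does not define this metric; one plausible reading, sketched here, models both sights as equal-radius circles whose coincidence grows linearly as their centres approach:

```python
import math

def sights_coincide(real_sight, calibration_sight, sight_radius, threshold=0.9):
    """Assumed coincidence metric: 1.0 when the sight centres coincide,
    0.0 once they are two radii apart (no overlap)."""
    d = math.hypot(real_sight[0] - calibration_sight[0],
                   real_sight[1] - calibration_sight[1])
    coincidence = max(0.0, 1.0 - d / (2.0 * sight_radius))
    return coincidence >= threshold  # True: hide both sights (claim 14)
```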
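Claim 15 shows only one calibration prompt at a time, highest priority first. A sketch with a hypothetical priority order (the patent does not rank the prompt types):

```python
# Hypothetical ranking; larger number = higher prompt priority.
PROMPT_PRIORITY = {"angle_calibration": 3, "distance_calibration": 2, "speed_control": 1}

def prompt_to_show(active_prompts):
    """Pick the calibration prompt with the highest prompt priority.
    Once its terminal-state condition is satisfied it leaves
    active_prompts, and the next call returns the next-priority prompt."""
    if not active_prompts:
        return None
    return max(active_prompts, key=lambda name: PROMPT_PRIORITY.get(name, 0))

# e.g. prompt_to_show(["speed_control", "angle_calibration"]) -> "angle_calibration"
```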
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210496170.8A CN115002440B (en) | 2022-05-09 | 2022-05-09 | AR-based image acquisition method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115002440A CN115002440A (en) | 2022-09-02 |
CN115002440B true CN115002440B (en) | 2023-06-09 |
Family
ID=83024351
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210496170.8A Active CN115002440B (en) | 2022-05-09 | 2022-05-09 | AR-based image acquisition method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115002440B (en) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108037863B (en) * | 2017-12-12 | 2021-03-30 | 北京小米移动软件有限公司 | Method and device for displaying image |
CN107911621B (en) * | 2017-12-28 | 2020-04-07 | 深圳市酷开网络科技有限公司 | Panoramic image shooting method, terminal equipment and storage medium |
CN112346798A (en) * | 2020-10-10 | 2021-02-09 | 北京城市网邻信息技术有限公司 | Information acquisition method and device |
CN114202640A (en) * | 2021-12-10 | 2022-03-18 | 浙江商汤科技开发有限公司 | Data acquisition method and device, computer equipment and storage medium |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101833896A (en) * | 2010-04-23 | 2010-09-15 | 西安电子科技大学 | Geographic information guide method and system based on augment reality |
CN113467600A (en) * | 2020-03-31 | 2021-10-01 | 深圳光峰科技股份有限公司 | Information display method, system and device based on augmented reality and projection equipment |
WO2021197189A1 (en) * | 2020-03-31 | 2021-10-07 | 深圳光峰科技股份有限公司 | Augmented reality-based information display method, system and apparatus, and projection device |
CN111737518A (en) * | 2020-06-16 | 2020-10-02 | 浙江大华技术股份有限公司 | Image display method and device based on three-dimensional scene model and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN115002440A (en) | 2022-09-02 |
Similar Documents
Publication | Title
---|---
CN109639970B (en) | Shooting method and terminal equipment
CN111010510B (en) | Shooting control method and device and electronic equipment
CN109381165B (en) | Skin detection method and mobile terminal
CN111182205B (en) | Photographing method, electronic device, and medium
CN109600550B (en) | Shooting prompting method and terminal equipment
CN108989672B (en) | Shooting method and mobile terminal
CN107707817B (en) | Video shooting method and mobile terminal
CN108848313B (en) | Multi-person photographing method, terminal and storage medium
WO2021197121A1 (en) | Image photographing method and electronic device
CN110970003A (en) | Screen brightness adjusting method and device, electronic equipment and storage medium
CN108924412B (en) | Shooting method and terminal equipment
CN108628515B (en) | Multimedia content operation method and mobile terminal
CN108683850B (en) | Shooting prompting method and mobile terminal
CN108132749B (en) | Image editing method and mobile terminal
CN109413333B (en) | Display control method and terminal
CN111314616A (en) | Image acquisition method, electronic device, medium and wearable device
US20220141390A1 (en) | Photographing method, device, and system, and computer-readable storage medium
CN109559280B (en) | Image processing method and terminal
CN111031246A (en) | Shooting method and electronic equipment
CN115002443B (en) | Image acquisition processing method and device, electronic equipment and storage medium
CN111246105B (en) | Photographing method, electronic device, and computer-readable storage medium
CN110913133B (en) | Shooting method and electronic equipment
CN115002440B (en) | AR-based image acquisition method and device, electronic equipment and storage medium
CN108696638B (en) | Control method of mobile terminal and mobile terminal
CN110955378A (en) | Control method and electronic equipment
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant