CN110553639A - Method and apparatus for generating location information


Info

Publication number: CN110553639A (application CN201810561544.3A; granted publication CN110553639B)
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 梁耀端
Original assignee / applicant: Beijing Baidu Netcom Science and Technology Co Ltd
Current assignees: Baidu Online Network Technology Beijing Co Ltd; Beijing Baidu Netcom Science and Technology Co Ltd
Prior art keywords: target, target point, image, position information, distance
Legal status: Granted; Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G01C 21/00: Navigation; navigational instruments not provided for in groups G01C 1/00 - G01C 19/00

Abstract

The embodiments of the present application disclose a method and apparatus for generating position information. One embodiment of the method includes: receiving a target image sent by a mobile terminal, the target image being obtained by the mobile terminal photographing a target point; recognizing the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image falls within a preset range; in response to determining that this distance does not fall within the preset range, sending a movement instruction to the mobile terminal so that it moves to a target position; and acquiring position information of the target position and generating position information of the target point based on it. This embodiment enriches the ways in which position information can be generated and improves the accuracy of the generated information.

Description

Method and apparatus for generating location information
Technical Field
The embodiments of the present application relate to the field of computer technology, and in particular to a method and apparatus for generating position information.
Background
Currently, positioning technology is widely used in daily life, for example in mobile phone positioning and vehicle navigation.
In the prior art, a target is generally located by means of the Global Positioning System (GPS). This requires placing a GPS device at the target's location in order to position the target.
Disclosure of Invention
The embodiments of the present application provide a method and apparatus for generating position information.
In a first aspect, an embodiment of the present application provides a method for generating location information, the method including: receiving a target image sent by a mobile terminal, the target image being obtained by the mobile terminal photographing a target point; recognizing the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image falls within a preset range; in response to determining that this distance does not fall within the preset range, sending a movement instruction to the mobile terminal so that it moves to a target position, where the target position is the position to which the mobile terminal moves so that the distance between the display position of the target point in a captured image and the center of that image falls within the preset range; and acquiring position information of the target position and generating position information of the target point based on the acquired position information.
In some embodiments, after recognizing the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image falls within the preset range, the method further includes: in response to determining that this distance falls within the preset range, acquiring position information of the mobile terminal's current position and generating the position information of the target point based on that position information.
In some embodiments, sending a movement instruction to the mobile terminal to move it to the target position includes: sending a movement instruction to the mobile terminal so that it moves, and acquiring the distance value between the display position of the target point in each captured image and the center of that image; in response to determining that an acquired distance value falls within the preset range, determining whether all distance values acquired within a preset time period before the current moment fall within the preset range; and determining that the mobile terminal has moved to the target position in response to determining that they all do.
In some embodiments, acquiring position information of the target position includes: acquiring position information of the mobile terminal's position once every preset time period until a preset number of pieces of position information have been obtained; and generating the position information of the target position based on the obtained pieces of position information.
In some embodiments, generating the position information of the target point based on the acquired position information includes: determining an elevation value of the target position; determining an elevation value of the target point based on the elevation value of the target position; and determining the elevation value of the target point as position information of the target point.
In some embodiments, generating the position information of the target point based on the acquired position information further includes: determining plane coordinates of the target position; determining plane coordinates of the target point based on the plane coordinates of the target position; and determining the plane coordinates of the target point as position information of the target point.
In a second aspect, an embodiment of the present application provides an apparatus for generating location information, the apparatus including: an image receiving unit configured to receive a target image sent by a mobile terminal, the target image being obtained by the mobile terminal photographing a target point; an image recognition unit configured to recognize the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image falls within a preset range; an instruction sending unit configured to send, in response to determining that this distance does not fall within the preset range, a movement instruction to the mobile terminal so that it moves to a target position, where the target position is the position to which the mobile terminal moves so that the distance between the display position of the target point in a captured image and the center of that image falls within the preset range; and a first generating unit configured to acquire position information of the target position and generate position information of the target point based on the acquired position information.
In some embodiments, the apparatus further includes: a second generating unit configured to acquire, in response to determining that the distance between the display position of the target point in the target image and the center of the target image falls within the preset range, position information of the mobile terminal's position, and to generate the position information of the target point based on that position information.
In some embodiments, the instruction sending unit includes: a distance acquisition module configured to send a movement instruction to the mobile terminal so that it moves, and to acquire the distance value between the display position of the target point in each captured image and the center of that image; a distance determination module configured to determine, in response to determining that an acquired distance value falls within the preset range, whether all distance values acquired within a preset time period before the current moment fall within the preset range; and a position determination module configured to determine that the mobile terminal has moved to the target position in response to determining that they all do.
In some embodiments, the first generating unit includes: an information acquisition module configured to acquire position information of the mobile terminal's position once every preset time period until a preset number of pieces of position information have been obtained; and an information generating module configured to generate the position information of the target position based on the obtained pieces of position information.
In some embodiments, the first generating unit is further configured to: determine an elevation value of the target position; determine an elevation value of the target point based on the elevation value of the target position; and determine the elevation value of the target point as position information of the target point.
In some embodiments, the first generating unit is further configured to: determine plane coordinates of the target position; determine plane coordinates of the target point based on the plane coordinates of the target position; and determine the plane coordinates of the target point as position information of the target point.
In a third aspect, an embodiment of the present application provides a server, including: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method of any of the embodiments of the method for generating location information described above.
In a fourth aspect, an embodiment of the present application provides a computer-readable medium on which a computer program is stored, where the program, when executed by a processor, implements the method of any of the above embodiments of the method for generating location information.
In the method and apparatus for generating location information provided by the embodiments of the present application, a target image sent by a mobile terminal is received, the target image being obtained by the mobile terminal photographing a target point. The target image is then recognized to determine whether the distance between the display position of the target point in the target image and the center of the target image falls within a preset range. In response to determining that it does not, a movement instruction is sent to the mobile terminal to move it to a target position, i.e. the position at which the distance between the target point's display position in a captured image and that image's center falls within the preset range. Finally, position information of the target position is acquired and the position information of the target point is generated based on it. The position information of the target point is thereby acquired effectively by means of a mobile terminal, which enriches the ways in which position information can be generated, reduces the influence of the target point's surroundings during information acquisition, and improves the accuracy of the generated information.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for generating location information according to the present application;
FIG. 3 is a schematic illustration of an application scenario of a method for generating location information according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of a method for generating location information in accordance with the present application;
FIG. 5 is a schematic diagram illustrating the structure of one embodiment of an apparatus for generating location information according to the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments. It is to be understood that the specific embodiments described here merely illustrate the relevant invention and do not limit it. It should also be noted that, for convenience of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that the embodiments in the present application, and the features of those embodiments, may be combined with each other as long as they do not conflict. The present application will be described in detail below with reference to the drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the present method for generating location information, or of the present apparatus for generating location information, may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. Various devices for positioning, such as a global positioning system, a laser range finder, and the like, may be mounted on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be hardware or software. When they are hardware, they may be any electronic devices capable of positioning themselves, moving, and transferring information, including but not limited to drones, artificial earth satellites, intelligent robots, and the like. When they are software, they may be installed in the electronic devices listed above, implemented either as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
The server 105 may be a server that provides various services, such as an information processing server that locates a target point displayed in an image transmitted by the terminal apparatuses 101, 102, 103. The information processing server may perform processing such as analysis on the received data such as the target image, and generate a processing result (e.g., position information of the target point).
It should be noted that the method for generating the location information provided in the embodiment of the present application is generally performed by the server 105, and accordingly, the apparatus for generating the information is generally disposed in the server 105.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed cluster of multiple servers or as a single server. When the server is software, it may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for generating location information in accordance with the present application is shown. The method for generating location information includes the steps of:
Step 201, receiving the target image sent by the mobile terminal.
In the present embodiment, the execution subject of the method for generating location information (e.g., the server shown in Fig. 1) may receive the target image from the mobile terminal (e.g., a terminal device shown in Fig. 1) through a wired or wireless connection. The target image is an image obtained by the mobile terminal photographing the target point, and the target point is a point whose position information is to be determined, for example a point on the ground, on a building, or on a natural landscape.
Here, the mobile terminal may obtain the target image in various ways. As an example, while the mobile terminal is moving, the execution subject may continuously receive the images it captures and present them to a technician; when the technician observes that a captured image contains the target point, that image may be taken as the target image. Alternatively, the mobile terminal may first move to an area suitable for photographing the target point and then photograph it to obtain the target image.
The images captured by the mobile terminal (including the target image) may have any shape that possesses a center (e.g., a circle, rectangle, or square). In addition, while photographing the target point, the mobile terminal moves in a direction perpendicular to its shooting direction.
Step 202, identifying the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range.
In this embodiment, based on the target image obtained in step 201, the executing entity may identify the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range. The preset range may be a distance range preset by a technician, and when the distance between the display position of the target point in the target image and the center of the target image belongs to the preset range, it may be approximately determined that the display position of the target point in the target image coincides with the center of the target image.
Here, the execution subject may determine the distance between the display position of the target point in the target image and the center of the target image. Specifically, it may use image recognition to determine position information (e.g., position coordinates) of the target point's display position in the target image and compute the distance to the image center from that information; alternatively, it may use image recognition to locate the target point in the target image, connect it to the image center, and take the length of the resulting line segment as the distance. Image recognition is a well-known and widely applied technology and is not described further here.
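As a minimal sketch of the distance check just described: assuming the recognition step has already produced pixel coordinates for the target point's display position (the function names, coordinate convention, and the 10-pixel threshold below are illustrative assumptions, not part of the disclosure), comparing against the preset range reduces to a Euclidean distance test.

```python
import math

def distance_to_center(point_xy, image_size):
    """Euclidean distance (in pixels) between the detected target
    point and the center of the captured image."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    return math.hypot(point_xy[0] - cx, point_xy[1] - cy)

def within_preset_range(point_xy, image_size, max_offset_px=10.0):
    """True when the target point is close enough to the image center
    to treat the display position and the center as coincident."""
    return distance_to_center(point_xy, image_size) <= max_offset_px
```

For a 640x480 image, for instance, a point detected at (325, 240) lies 5 pixels from the center and would be treated as coinciding with it under this threshold.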
Step 203, in response to determining that the distance between the display position of the target point in the target image and the center of the target image does not fall within the preset range, sending a movement instruction to the mobile terminal to move it to the target position.
In this embodiment, the executing body may send a moving instruction to the moving end to move the moving end to the target position in response to determining that the distance between the display position of the target point in the target image and the center of the target image does not belong to the preset range. The moving instruction may be used to instruct the moving terminal to move. The target position may be a position to which the moving end moves after receiving the moving instruction so that a distance between a display position of the target point in the captured image and the center of the captured image falls within a preset range.
It can be understood that, while moving, the mobile terminal may continuously photograph the target point and send the captured images to the execution subject, which recognizes each captured image, determines the distance between the target point's display position in it and its center, and decides from that distance whether the mobile terminal has reached the target position. As an example, the execution subject may determine that the mobile terminal has reached the target position in response to determining that this distance falls within the preset range. In response to that determination, the execution subject may send a stationary command to the mobile terminal so that it holds still at the target position.
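The feedback loop described in this step, combined with the stability criterion mentioned earlier (treating the terminal as arrived only once every distance sample in a recent time window falls within the preset range), might be sketched as follows. All names, the 10-pixel threshold, and the 2-second window are illustrative assumptions rather than part of the disclosure.

```python
from collections import deque

def guide_to_target(capture, send_move, send_still,
                    max_offset_px=10.0, window_s=2.0):
    """Adjust the mobile terminal until the target point has stayed
    centered for roughly `window_s` seconds, then command it to hold.

    capture()         -> (offset_px, timestamp) for the newest frame
    send_move(offset) -> movement instruction to the terminal
    send_still()      -> stationary command to the terminal
    """
    recent = deque()  # (timestamp, offset) samples inside the window
    while True:
        offset, ts = capture()
        recent.append((ts, offset))
        # drop samples older than the observation window
        while recent and ts - recent[0][0] > window_s:
            recent.popleft()
        spans_window = ts - recent[0][0] >= window_s * 0.9
        if spans_window and all(d <= max_offset_px for _, d in recent):
            send_still()  # terminal has settled at the target position
            return
        send_move(offset)  # keep adjusting
```

The window requirement guards against declaring arrival on a single frame that happens to be centered while the terminal is still drifting.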
Step 204, position information of the target position is acquired, and position information of the target point is generated based on the acquired position information.
In the present embodiment, the execution body described above may acquire position information of a target position, and generate position information of the target point based on the acquired position information. Wherein the location information may include, but is not limited to, at least one of: numbers, words, symbols, pictures. Specifically, the executing body may obtain the position information of the target position sent by the mobile terminal, or may locate the target position after determining the target position, so as to obtain the position information of the target position. It should be noted that, here, the mobile terminal may obtain the location information of its own location in real time through a preset positioning device. The positioning device may be software or hardware.
In some optional implementations of this embodiment, the executing body may obtain the position information of the target position by: first, every preset time period, the execution main body may obtain the location information of the location where the mobile terminal is located, so as to obtain a preset number of location information. Then, the execution body may generate the position information of the target position based on the obtained preset number of position information. For example, the execution subject may determine a position closest to the target point from among positions corresponding to a preset number of pieces of position information, and determine position information of the determined position as position information of the target position; or, the position information is a position coordinate of a position where the mobile terminal is located, and the execution main body may perform an average calculation on coordinate values of a preset number of obtained position coordinates, and determine a result coordinate obtained by the calculation as the position information of the target position.
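Of the two aggregation strategies just described, the coordinate-averaging one can be sketched with a hypothetical helper; the disclosure does not fix a coordinate format, so plain (x, y) pairs are assumed here.

```python
def average_position(fixes):
    """Average a preset number of position fixes collected at the
    target position to smooth out per-fix positioning noise."""
    n = len(fixes)
    return (sum(x for x, _ in fixes) / n,
            sum(y for _, y in fixes) / n)
```

Averaging several fixes taken over successive preset time periods reduces the effect of jitter in any single positioning reading.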
It can be understood that, since the moving direction of the moving end is perpendicular to the shooting direction, when the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range (approximately, the display position and the center coincide), that is, when the moving end moves to the target position, the moving end and the target point have the same position information in the direction perpendicular to the shooting direction, and the execution main body can determine the position information of the target point based on the position information of the target position.
In some optional implementations of the embodiment, the executing entity may generate the position information of the target point by: first, the execution body may determine an elevation value of the target location. Then, the execution subject may determine the elevation values of the target points based on the elevation values of the target positions, and determine the elevation values of the target points as position information of the target points.
As an example, the moving direction of the moving end is a horizontal direction, and the photographing direction is perpendicular to the moving direction. When the mobile terminal moves to the target position, the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range (approximately the display position is coincident with the center), at this time, the execution main body can acquire the elevation value of the target position, determine the distance value between the target position and the target point, then calculate the difference between the acquired elevation value and the determined distance value, acquire the elevation value of the target point, and determine the acquired elevation value as the position information of the target point.
As another example, the shooting direction of the mobile terminal is horizontal and its moving direction is perpendicular to the shooting direction. When the mobile terminal has moved to the target position, the distance between the target point's display position in the target image and the image center falls within the preset range (the two approximately coincide), and the elevation value of the target position equals that of the target point. The execution subject may therefore acquire the elevation value of the target position, take it as the elevation value of the target point, and determine that elevation value as the position information of the target point.
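The two elevation cases above can be condensed into one hedged sketch. The function and parameter names are illustrative; `range_to_target` would come from a distance measurement such as the laser range finder mentioned among the terminal devices' equipment.

```python
def target_elevation(terminal_elev, range_to_target, shooting="vertical"):
    """Elevation of the target point, derived from the terminal's
    elevation at the target position.

    "vertical":   camera points straight down while the terminal moves
                  horizontally; the target lies `range_to_target` below.
    "horizontal": camera points sideways while the terminal moves
                  vertically; terminal and target share an elevation.
    """
    if shooting == "vertical":
        return terminal_elev - range_to_target
    return terminal_elev
```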
In some optional implementations of this embodiment, the executing body may further generate the position information of the target point by: first, the execution body may determine plane coordinates of the target position. Then, the execution body may determine the plane coordinates of the target point based on the plane coordinates of the target position, and determine the plane coordinates of the target point as the position information of the target point.
As an example, the moving direction of the mobile terminal is horizontal and its shooting direction is perpendicular to the moving direction. When the mobile terminal has moved to the target position, the distance between the target point's display position in the target image and the image center falls within the preset range (the two approximately coincide), and the plane coordinates of the target position are the same as those of the target point. The execution subject may therefore acquire the plane coordinates of the target position, take them as the plane coordinates of the target point, and determine those coordinates as the position information of the target point.
As another example, the shooting direction of the moving end is the horizontal direction, and the moving direction is perpendicular to the shooting direction. When the mobile terminal moves to the target position, the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range (approximately the display position coincides with the center), at this time, the execution main body can acquire the plane coordinate of the target position, determine the distance value between the target position and the target point, then determine the plane coordinate of the target point based on the acquired plane coordinate and the determined distance value, and further determine the plane coordinate of the determined target point as the position information of the target point.
Specifically, when the shooting direction coincides with the x-axis of the coordinate system in which the plane coordinates are located, the coordinate value in the y-axis direction in the plane coordinates of the target point is the same as the coordinate value in the y-axis direction in the plane coordinates of the target position. For the coordinate value in the x-axis direction in the plane coordinate of the target point, the execution subject may perform a difference between the coordinate value in the x-axis direction in the obtained plane coordinate and the determined distance value to obtain the coordinate value in the x-axis direction in the plane coordinate of the target point, that is, obtain the plane coordinate of the target point; when the photographing direction coincides with the y-axis of the coordinate system in which the plane coordinates are located, the coordinate value of the x-axis direction in the plane coordinates of the target point is the same as the coordinate value of the x-axis direction in the plane coordinates of the target position. For the coordinate value in the y axis direction in the plane coordinate of the target point, the executing entity may perform a difference between the coordinate value in the y axis direction in the obtained plane coordinate and the determined distance value, obtain the coordinate value in the y axis direction in the plane coordinate of the target point, and further obtain the plane coordinate of the target point.
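The plane-coordinate cases, including the axis-aligned offsets just described, can be sketched similarly (again with illustrative names; `shooting_axis=None` stands for the downward-looking case in which the coordinates simply coincide).

```python
def target_plane_coords(terminal_xy, range_to_target, shooting_axis=None):
    """Plane coordinates of the target point, derived from the
    terminal's plane coordinates at the target position.

    None : camera looks straight down; the coordinates coincide.
    "x"  : camera looks along the x-axis; offset the x coordinate.
    "y"  : camera looks along the y-axis; offset the y coordinate.
    """
    x, y = terminal_xy
    if shooting_axis == "x":
        return (x - range_to_target, y)
    if shooting_axis == "y":
        return (x, y - range_to_target)
    return (x, y)
```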
In some optional implementations of this embodiment, in response to determining that the distance between the display position of the target point in the target image and the center of the target image belongs to the preset range (i.e., the display position approximately coincides with the center), the execution subject may further acquire the position information of the position where the moving end is located, and generate the position information of the target point based on that position information. In this implementation, the position where the moving end is located is equivalent to the target position in step 204; the execution subject may acquire its position information and generate the position information of the target point using the methods described for the target position in step 204.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for generating location information according to the present embodiment. In the application scenario of fig. 3, the server 301 first receives a target image 303 sent by the drone 302, where the target image 303 is an image obtained by shooting a target point 304 by the drone. Next, the server 301 may identify the target image 303 to determine whether a distance between a display position of the target point 304 in the target image 303 and the center of the target image 303 belongs to a preset range. Then, in response to determining that the distance between the display position of the target point 304 in the target image 303 and the center of the target image 303 does not belong to the preset range, the server 301 may send a movement instruction 305 to the drone 302 to move the drone 302 to the target position at which the distance between the display position of the target point 304 in the captured image 306 and the center of the captured image 306 belongs to the preset range. Next, the server 301 may acquire the position information 307 of the target position transmitted by the drone 302, and generate the position information 308 of the target point 304 based on the acquired position information 307.
The method provided by the above embodiment of the present application acquires the position information of the target point by means of the mobile terminal, which enriches the ways in which position information can be generated; moreover, acquiring the position information of the target point through the mobile terminal reduces the influence of the environment surrounding the target point during information acquisition, thereby improving the accuracy of the generated information.
With further reference to fig. 4, a flow 400 of yet another embodiment of a method for generating location information is shown. The flow 400 of the method for generating location information comprises the steps of:
Step 401, receiving a target image sent by a mobile terminal.
In the present embodiment, the execution subject of the method for generating location information (e.g., the server shown in fig. 1) may receive the target image from the mobile terminal (e.g., the terminal device shown in fig. 1) through a wired or wireless connection. The target image may be an image obtained by shooting the target point by the moving end. The target point may be a point whose position information is to be determined.
Step 402, identifying the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range.
In this embodiment, based on the target image obtained in step 401, the executing entity may identify the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range. The preset range may be a distance range preset by a technician, and when the distance between the display position of the target point in the target image and the center of the target image belongs to the preset range, it may be approximately determined that the display position of the target point in the target image coincides with the center of the target image.
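The check in step 402 can be sketched as follows, assuming the display position of the target point has already been located by image recognition (the function name, pixel-coordinate representation, and use of Euclidean distance are illustrative assumptions, not specified by the patent):

```python
import math

def within_preset_range(target_display_pos, image_size, preset_range):
    """Check whether the target point's display position is close enough to
    the image center, per the preset distance range.

    target_display_pos: (u, v) pixel coordinates of the target point
                        in the captured image.
    image_size: (width, height) of the captured image in pixels.
    preset_range: maximum allowed pixel distance from the image center.
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    # Euclidean distance between the display position and the image center.
    dist = math.hypot(target_display_pos[0] - cx, target_display_pos[1] - cy)
    return dist <= preset_range
```

When this returns True, the display position may be approximately regarded as coinciding with the center, as the embodiment describes.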
Step 403, in response to determining that the distance between the display position of the target point in the target image and the center of the target image does not belong to the preset range, sending a movement instruction to the moving end to make the moving end move, and acquiring a distance value between the display position of the target point in the captured image and the center of the captured image.
In this embodiment, in response to determining that the distance between the display position of the target point in the target image and the center of the target image does not belong to the preset range, the execution subject may send a movement instruction to the moving end to make it move, and acquire the distance value between the display position of the target point in the captured image and the center of the captured image. The movement instruction may be used to instruct the moving end to move. It can be understood that, during the movement, the moving end may continuously capture the target point and send the captured images to the execution subject, and the execution subject may continuously determine the distance value between the display position of the target point in each captured image and the center of that image.
Step 404, in response to determining that the currently acquired distance value belongs to the preset range, determining whether the distance values acquired within a preset time period from the current time all belong to the preset range.
In this embodiment, the executing body may determine whether the distance values acquired within the preset time period from the current time all belong to the preset range in response to determining that the currently acquired distance value belongs to the preset range. The preset time period may be any time period preset by a technician, for example, 10 minutes.
Step 405, in response to determining that the distance values acquired within the preset time period from the current time all belong to the preset range, determining that the mobile terminal moves to the target position.
In this embodiment, the execution subject may determine that the mobile terminal has moved to the target position in response to determining that the distance values acquired within the preset time period from the current time all belong to the preset range.
Specifically, in response to determining that the mobile terminal has moved to the target position, the execution subject may send a stationary instruction to the mobile terminal so that it remains stationary at the target position.
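The stability criterion of steps 403 to 405 (declaring that the moving end has reached the target position only when every distance value sampled within the preset time period falls inside the preset range) can be sketched as follows; the timestamped-sample representation is an illustrative assumption:

```python
def reached_target_position(samples, now, window, preset_range):
    """Decide whether the moving end has moved to the target position.

    samples: list of (timestamp, distance) pairs, where distance is the
             value between the target point's display position and the
             captured image's center; most recent sample last.
    now: current time (same units as the timestamps).
    window: the preset time period looking back from `now`.
    preset_range: maximum allowed distance from the image center.
    """
    # Keep only the distance values acquired within the preset time period.
    recent = [d for (t, d) in samples if now - t <= window]
    # All recent samples must belong to the preset range; an empty
    # window means the criterion is not yet satisfied.
    return bool(recent) and all(d <= preset_range for d in recent)
```

This captures the embodiment's point that a single in-range sample is not enough; the moving end's state must stay in range over the whole window before the position is taken as the target position.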
Step 406, position information of the target position is acquired, and position information of the target point is generated based on the acquired position information.
In the present embodiment, the execution subject may acquire the position information of the target position and generate the position information of the target point based on the acquired position information. The position information may include, but is not limited to, at least one of: numbers, words, symbols, pictures. Specifically, the execution subject may obtain the position information of the target position sent by the mobile terminal, or may locate the target position after determining it, so as to obtain its position information. It should be noted that the mobile terminal may obtain the position information of its own location in real time through a preset positioning device, which may be software or hardware.
It can be understood that, since the moving direction of the moving end is perpendicular to the shooting direction, when the display position of the target point in the target image coincides with the center of the target image, that is, when the moving end has moved to the target position, the moving end and the target point have the same position information in the direction perpendicular to the shooting direction, so the execution subject can determine the position information of the target point based on the position information of the target position.
Step 401, step 402, and step 406 are respectively the same as step 201, step 202, and step 204 in the foregoing embodiment, and the above description for step 201, step 202, and step 204 also applies to step 401, step 402, and step 406, which is not described herein again.
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the flow 400 of the method for generating position information in this embodiment highlights the step of determining whether the moving end has moved to the target position based on the distance values, within a preset time period, between the display position of the target point in the captured images and the centers of those images. The scheme described in this embodiment can therefore take as the target position a position at which the moving end's movement state is more stable, so that more accurate position information can be generated.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an apparatus for generating location information, which corresponds to the method embodiment shown in fig. 2 and is particularly applicable to various electronic devices.
As shown in fig. 5, the apparatus 500 for generating location information of the present embodiment includes: an image receiving unit 501, an image recognizing unit 502, an instruction transmitting unit 503, and a first generating unit 504. The image receiving unit 501 is configured to receive a target image sent by a mobile terminal, where the target image is an image obtained by shooting a target point by the mobile terminal; the image recognition unit 502 is configured to recognize the target image to determine whether a distance between a display position of the target point in the target image and a center of the target image belongs to a preset range; the instruction transmitting unit 503 is configured to transmit a movement instruction to the moving terminal to move the moving terminal to the target position in response to a determination that the distance between the display position of the target point in the target image and the center of the target image does not belong to the preset range, wherein the target position is a position to which the moving terminal is moved such that the distance between the display position of the target point in the captured image and the center of the captured image belongs to the preset range; the first generation unit 504 is configured to acquire position information of a target position, and generate position information of the target point based on the acquired position information.
In this embodiment, the image receiving unit 501 of the apparatus 500 for generating location information may receive the target image from a mobile terminal (e.g., a terminal device shown in fig. 1) through a wired connection manner or a wireless connection manner. The target image may be an image obtained by shooting the target point by the moving end. The target point may be a point whose position information is to be determined.
Here, the moving end may obtain the target image in various ways. Specifically, as an example, while the moving end is moving, the image receiving unit 501 may continuously obtain the images it captures for a technician to observe; when the technician observes that a captured image contains the target point, that image may be taken as the target image containing the target point. Alternatively, the moving end may first move to an area for shooting the target point, and then shoot the target point to obtain the target image.
The images (including the target image) captured by the moving end may be images of various shapes (e.g., a circle, a rectangle, a square, etc.) having a center. In addition, the moving direction of the moving end is perpendicular to the shooting direction in the process of shooting the target point.
In this embodiment, based on the target image obtained by the image receiving unit 501, the image recognition unit 502 may recognize the target image to determine whether a distance between a display position of the target point in the target image and a center of the target image belongs to a preset range. The preset range may be a distance range preset by a technician, and when the distance between the display position of the target point in the target image and the center of the target image belongs to the preset range, it may be approximately determined that the display position of the target point in the target image coincides with the center of the target image.
Here, the image recognition unit 502 may determine a distance between the display position of the target point in the target image and the center of the target image. Specifically, the image recognition unit 502 may recognize the target image by using an image recognition technology to determine position information (e.g., position coordinates) of a display position of the target point in the target image, and then determine a distance between the display position and the center of the target image by using the position information; alternatively, the image recognition unit 502 may recognize the target image by using an image recognition technology to determine the target point in the target image, and then connect the target point in the target image with the center of the target image, and determine the length of the line segment obtained by the connection as the distance between the display position of the target point in the target image and the center of the target image. It should be noted that the image recognition technology is a well-known technology that is widely researched and applied at present, and is not described herein again.
In this embodiment, the instruction transmitting unit 503 may transmit a movement instruction to the moving end to move the moving end to the target position in response to determining that the distance between the display position of the target point in the target image and the center of the target image does not belong to the preset range. The movement instruction may be used to instruct the moving end to move. The target position may be a position to which the moving end moves after receiving the movement instruction, such that the distance between the display position of the target point in the captured image and the center of the captured image falls within the preset range.
In the present embodiment, the first generation unit 504 may acquire position information of the target position, and generate position information of the target point based on the acquired position information. Wherein the location information may include, but is not limited to, at least one of: numbers, words, symbols, pictures. Specifically, the first generating unit 504 may obtain the position information of the target position sent by the mobile terminal, or may locate the target position after determining the target position, so as to obtain the position information of the target position. It should be noted that, here, the mobile terminal may obtain the location information of its own location in real time through a preset positioning device. The positioning device may be software or hardware.
It is to be understood that, since the moving direction of the moving end is perpendicular to the shooting direction, when the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range (approximately, the display position coincides with the center), that is, when the moving end moves to the target position, the moving end and the target point have the same position information in the direction perpendicular to the shooting direction, and the first generating unit 504 can determine the position information of the target point based on the position information of the target position.
In some optional implementations of this embodiment, the apparatus 500 may further include: a second generating unit configured to acquire position information of the position where the moving end is located in response to determining that the distance between the display position of the target point in the target image and the center of the target image belongs to the preset range, and to generate the position information of the target point based on that position information.
In some optional implementations of this embodiment, the instruction sending unit 503 may include: a distance acquisition module configured to send a movement instruction to the moving end to make it move, and to acquire a distance value between the display position of the target point in the captured image and the center of the captured image; a distance determination module configured to determine, in response to determining that the acquired distance value belongs to the preset range, whether the distance values acquired within a preset time period from the current time all belong to the preset range; and a position determination module configured to determine, in response to determining that the distance values acquired within the preset time period from the current time all belong to the preset range, that the moving end has moved to the target position.
In some optional implementations of the present embodiment, the first generating unit 504 may include: an information acquisition module configured to acquire the position information of the position where the moving end is located once every preset time period, so as to obtain a preset number of pieces of position information; and an information generating module configured to generate the position information of the target position based on the obtained preset number of pieces of position information.
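The two modules above can be sketched as follows. Averaging the sampled fixes is one plausible aggregation, offered only as an assumption, since the patent does not specify how the preset number of position samples are combined; all names here are illustrative:

```python
def estimate_target_position(fixes):
    """Generate the position information of the target position from a
    preset number of periodically sampled location fixes.

    fixes: non-empty list of (x, y) plane coordinates reported by the
           moving end's positioning device, one per preset time period.
    """
    n = len(fixes)
    # Averaging damps per-sample positioning noise; other aggregations
    # (e.g., the median) would fit the patent's wording equally well.
    return (sum(x for x, _ in fixes) / n, sum(y for _, y in fixes) / n)
```

For instance, two fixes (1.0, 1.0) and (3.0, 3.0) would yield the estimate (2.0, 2.0) for the target position.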
In some optional implementations of the present embodiment, the first generating unit 504 may be further configured to: determining an elevation value of a target position; and determining the elevation value of the target point based on the elevation value of the target position, and determining the elevation value of the target point as the position information of the target point.
In some optional implementations of the present embodiment, the first generating unit 504 may be further configured to: determining plane coordinates of a target position; determining planar coordinates of the target point based on the planar coordinates of the target position, and determining the planar coordinates of the target point as position information of the target point.
The apparatus 500 provided in the above embodiment of the present application works as follows. The image receiving unit 501 receives a target image sent by the moving end, where the target image is an image obtained by the moving end shooting a target point. The image recognition unit 502 then identifies the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range. In response to determining that the distance does not belong to the preset range, the instruction transmitting unit 503 transmits a movement instruction to the moving end to move it to the target position. Finally, the first generating unit 504 acquires the position information of the target position and generates the position information of the target point based on it. The apparatus thus effectively uses the moving end to acquire the position information of the target point, which enriches the ways of generating position information; moreover, acquiring the position information through the moving end reduces the influence of the environment surrounding the target point during information acquisition, improving the accuracy of the generated information.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a server according to embodiments of the present application. The server shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 610 as necessary, so that a computer program read out therefrom is installed in the storage section 608 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes an image receiving unit, an image recognition unit, an instruction transmitting unit, and a first generating unit. Here, the names of these units do not in some cases constitute a limitation of the unit itself; for example, the image receiving unit may also be described as "a unit that receives a target image sent by a moving end".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the server described in the above embodiments, or may exist separately without being assembled into the server. The computer readable medium carries one or more programs which, when executed by the server, cause the server to: receive a target image sent by a moving end, wherein the target image is an image obtained by shooting a target point by the moving end; identify the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range; in response to determining that the distance between the display position of the target point in the target image and the center of the target image does not belong to the preset range, send a movement instruction to the moving end to move the moving end to a target position, wherein the target position is a position to which the moving end moves such that the distance between the display position of the target point in the captured image and the center of the captured image belongs to the preset range; and acquire position information of the target position, and generate position information of the target point based on the acquired position information.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (14)

1. A method for generating location information, comprising:
Receiving a target image sent by a mobile terminal, wherein the target image is an image obtained by shooting a target point by the mobile terminal;
Identifying the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range;
In response to determining that the distance between the display position of the target point in the target image and the center of the target image does not belong to a preset range, sending a moving instruction to the moving end to move the moving end to a target position, wherein the target position is a position to which the moving end moves to enable the distance between the display position of the target point in the captured image and the center of the captured image to belong to the preset range;
Position information of the target position is acquired, and position information of the target point is generated based on the acquired position information.
2. The method of claim 1, wherein after identifying the target image to determine whether the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range, the method further comprises:
And in response to determining that the distance between the display position of the target point in the target image and the center of the target image belongs to a preset range, acquiring the position information of the position of the mobile terminal, and generating the position information of the target point based on the position information of the position of the mobile terminal.
3. The method of claim 1, wherein the sending a move command to the mobile terminal to move the mobile terminal to a target location comprises:
Sending a moving instruction to the moving end to make the moving end move, and acquiring a distance value between the display position of the target point in the captured image and the center of the captured image;
In response to determining that the acquired distance values belong to the preset range, determining whether the acquired distance values within a preset time period from the current time all belong to the preset range;
And determining that the mobile terminal moves to the target position in response to the fact that the distance values acquired within the preset time period from the current moment all belong to the preset range.
4. The method of claim 1, wherein the obtaining location information of the target location comprises:
Acquiring position information of the position where the moving end is located once every preset time period, to obtain a preset number of pieces of position information;
And generating the position information of the target position based on the obtained preset number of pieces of position information.
5. The method according to one of claims 1 to 4, wherein the generating of the position information of the target point based on the acquired position information comprises:
Determining an elevation value of the target position;
And determining the elevation value of the target point based on the elevation value of the target position, and determining the elevation value of the target point as the position information of the target point.
6. The method according to any one of claims 1-4, wherein the generating the position information of the target point based on the acquired position information further comprises:
determining plane coordinates of the target position; and
determining plane coordinates of the target point based on the plane coordinates of the target position, and determining the plane coordinates of the target point as the position information of the target point.
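Claims 5 and 6 derive the target point's elevation and plane coordinates from those of the target position. A minimal sketch, assuming the relation is a fixed geometric offset between the mobile terminal at the target position and the target point (the offset parameters are hypothetical, not values given in the claims):

```python
def target_point_position(target_pos, height_offset=0.0, plane_offset=(0.0, 0.0)):
    """Derive the target point's plane coordinates and elevation value
    from the target position. The offsets model a fixed geometric
    relation between the mobile terminal and the target point; both
    are hypothetical parameters."""
    x, y, z = target_pos
    return (x + plane_offset[0], y + plane_offset[1], z - height_offset)

# E.g. a camera hovering 30 m directly above the target point.
print(target_point_position((10.0, 20.0, 35.0), height_offset=30.0))  # (10.0, 20.0, 5.0)
```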
7. An apparatus for generating location information, comprising:
an image receiving unit configured to receive a target image sent by a mobile terminal, wherein the target image is an image obtained by the mobile terminal photographing a target point;
an image recognition unit configured to recognize the target image to determine whether a distance between a display position of the target point in the target image and a center of the target image belongs to a preset range;
an instruction transmitting unit configured to transmit a movement instruction to the mobile terminal to move the mobile terminal to a target position in response to determining that the distance between the display position of the target point in the target image and the center of the target image does not belong to the preset range, wherein the target position is a position to which the mobile terminal is moved such that the distance between the display position of the target point in a captured image and the center of the captured image belongs to the preset range; and
a first generating unit configured to acquire position information of the target position and generate position information of the target point based on the acquired position information.
8. The apparatus of claim 7, wherein the apparatus further comprises:
a second generating unit configured to acquire position information of the position where the mobile terminal is located in response to determining that the distance between the display position of the target point in the target image and the center of the target image belongs to the preset range, and to generate the position information of the target point based on the position information of the position where the mobile terminal is located.
9. The apparatus of claim 7, wherein the instruction transmitting unit comprises:
a distance acquisition module configured to send a movement instruction to the mobile terminal so that the mobile terminal moves, and to acquire a distance value between the display position of the target point in the captured image and the center of the captured image;
a distance determination module configured to determine, in response to determining that an acquired distance value belongs to the preset range, whether all distance values acquired within a preset time period before the current time belong to the preset range; and
a position determination module configured to determine that the mobile terminal has moved to the target position in response to determining that all distance values acquired within the preset time period before the current time belong to the preset range.
10. The apparatus of claim 7, wherein the first generating unit comprises:
an information acquisition module configured to acquire position information of the position where the mobile terminal is located at every preset interval, so as to obtain a preset number of pieces of position information; and
an information generation module configured to generate the position information of the target position based on the obtained preset number of pieces of position information.
11. The apparatus according to any one of claims 7-10, wherein the first generating unit is further configured to:
determine an elevation value of the target position; and
determine an elevation value of the target point based on the elevation value of the target position, and determine the elevation value of the target point as the position information of the target point.
12. The apparatus according to any one of claims 7-10, wherein the first generating unit is further configured to:
determine plane coordinates of the target position; and
determine plane coordinates of the target point based on the plane coordinates of the target position, and determine the plane coordinates of the target point as the position information of the target point.
13. A server, comprising:
one or more processors; and
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-6.
14. A computer-readable medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1-6.
CN201810561544.3A 2018-06-04 2018-06-04 Method and apparatus for generating location information Active CN110553639B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810561544.3A CN110553639B (en) 2018-06-04 2018-06-04 Method and apparatus for generating location information


Publications (2)

Publication Number Publication Date
CN110553639A true CN110553639A (en) 2019-12-10
CN110553639B CN110553639B (en) 2021-04-27

Family

ID=68734620

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810561544.3A Active CN110553639B (en) 2018-06-04 2018-06-04 Method and apparatus for generating location information

Country Status (1)

Country Link
CN (1) CN110553639B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101346603A (en) * 2006-05-17 2009-01-14 Toyota Motor Corporation Object recognition device
CN104197901A (en) * 2014-09-19 2014-12-10 成都翼比特科技有限责任公司 Image distance measurement method based on marker
CN104883497A (en) * 2015-04-30 2015-09-02 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Positioning shooting method and mobile terminal
CN105588543A (en) * 2014-10-22 2016-05-18 ZTE Corporation Camera-based positioning method, device and positioning system
CN105865451A (en) * 2016-04-19 2016-08-17 深圳市神州云海智能科技有限公司 Method and device applied to indoor location of mobile robot
US20170301104A1 (en) * 2015-12-16 2017-10-19 Objectvideo, Inc. Profile matching of buildings and urban structures
CN107702714A (en) * 2017-07-31 2018-02-16 广州维绅科技有限公司 Localization method, apparatus and system


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116147648A (en) * 2022-12-31 2023-05-23 珠海泰坦新动力电子有限公司 Positioning adjustment method, positioning tool, device, equipment and storage medium
CN116147648B (en) * 2022-12-31 2024-04-05 珠海泰坦新动力电子有限公司 Positioning adjustment method, positioning tool, device, equipment and storage medium

Also Published As

Publication number Publication date
CN110553639B (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN109191514B (en) Method and apparatus for generating a depth detection model
CN110427917B (en) Method and device for detecting key points
CN108492364B (en) Method and apparatus for generating image generation model
CN109255337B (en) Face key point detection method and device
CN107941226B (en) Method and device for generating a direction guideline for a vehicle
CN110033423B (en) Method and apparatus for processing image
CN110059623B (en) Method and apparatus for generating information
CN113255619B (en) Lane line recognition and positioning method, electronic device, and computer-readable medium
CN110225400B (en) Motion capture method and device, mobile terminal and storage medium
CN110555876B (en) Method and apparatus for determining position
CN109029466A (en) indoor navigation method and device
CN110673717A (en) Method and apparatus for controlling output device
CN109034214B (en) Method and apparatus for generating a mark
CN111340015A (en) Positioning method and device
CN110553639B (en) Method and apparatus for generating location information
CN109840059B (en) Method and apparatus for displaying image
CN109816791B (en) Method and apparatus for generating information
CN112102417A (en) Method and device for determining world coordinates and external reference calibration method for vehicle-road cooperative roadside camera
CN107942692B (en) Information display method and device
CN111586295B (en) Image generation method and device and electronic equipment
CN111385460A (en) Image processing method and device
CN115170395A (en) Panoramic image stitching method, panoramic image stitching device, electronic equipment, panoramic image stitching medium and program product
CN111768443A (en) Image processing method and device based on mobile camera
CN112070903A (en) Virtual object display method and device, electronic equipment and computer storage medium
CN109977784B (en) Method and device for acquiring information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant