CN110418049B - Position information processing method and device, mobile terminal and storage medium - Google Patents

Position information processing method and device, mobile terminal and storage medium

Info

Publication number
CN110418049B
Authority
CN
China
Prior art keywords
terminal
camera
position information
environment image
camera module
Prior art date
Legal status
Active
Application number
CN201810386192.2A
Other languages
Chinese (zh)
Other versions
CN110418049A (en)
Inventor
张正山
范晓宇
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810386192.2A
Publication of CN110418049A
Application granted
Publication of CN110418049B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/02 Reservations, e.g. for tickets, services or events
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information

Abstract

The application relates to a position information processing method and device, a mobile terminal, and a computer-readable storage medium. The method includes: receiving a position information acquisition request; controlling a camera, according to the position information acquisition request, to rotate and capture an environment image of the position where a first terminal is located; and sending the environment image to a second terminal, where the environment image is used to instruct the second terminal to determine the position of the first terminal. Because the second terminal can accurately determine the position of the first terminal from the environment image captured by controlling the camera to rotate, the accuracy of the position information is improved.

Description

Position information processing method and device, mobile terminal and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for processing location information, a mobile terminal, and a computer-readable storage medium.
Background
With the development of computer technology, booking a ride through application software has become increasingly common. When a passenger books a vehicle through application software, the passenger's current position is automatically located through the Global Positioning System (GPS), and the position information of that position is acquired and sent to the driver, who can then learn where the passenger is from the received position information.
However, the position information provided by the conventional technology is limited; when the passenger is in a crowded area or at an intersection, the driver cannot accurately determine the passenger's specific location.
Disclosure of Invention
The embodiment of the application provides a position information processing method and device, a mobile terminal and a computer readable storage medium, which can improve the accuracy of position information.
A location information processing method, comprising:
receiving a position information acquisition request;
controlling a camera to rotate to acquire an environment image of the position of the first terminal according to the position information acquisition request;
and sending the environment image to a second terminal, wherein the environment image is used for indicating the second terminal to determine the position of the first terminal.
A positional information processing apparatus comprising:
the request receiving module is used for receiving a position information acquisition request;
the image acquisition module is used for controlling the camera to rotate according to the position information acquisition request so as to acquire an environment image of the position of the first terminal;
and the sending module is used for sending the environment image to a second terminal, and the environment image is used for indicating the second terminal to determine the position of the first terminal.
A mobile terminal comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of:
receiving a position information acquisition request;
controlling a camera to rotate to acquire an environment image of the position of the first terminal according to the position information acquisition request;
and sending the environment image to a second terminal, wherein the environment image is used for indicating the second terminal to determine the position of the first terminal.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
receiving a position information acquisition request;
controlling a camera to rotate to acquire an environment image of the position of the first terminal according to the position information acquisition request;
and sending the environment image to a second terminal, wherein the environment image is used for indicating the second terminal to determine the position of the first terminal.
According to the position information processing method and device, mobile terminal, and computer-readable storage medium described above, a position information acquisition request is received, the camera is controlled according to the request to rotate and capture an environment image of the position where the first terminal is located, and the environment image is sent to the second terminal, where it is used to instruct the second terminal to determine the position of the first terminal. Because the second terminal can determine the specific position of the first terminal from the environment image that the first terminal captured by rotation, the accuracy of the position information is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a diagram of an application environment of a location information processing method in one embodiment;
FIG. 2 is a flow diagram of a method for location information processing in one embodiment;
fig. 3 is a flowchart of a location information processing method in another embodiment;
FIG. 4 is a schematic diagram illustrating movement of a camera to a second position in one embodiment;
FIG. 5 is a schematic diagram of the camera moving to a second position in another embodiment;
FIG. 6 is a flow diagram of a method for location information processing in one embodiment;
FIG. 7 is a diagram illustrating the first terminal shown in FIG. 5 adjusting a photographing angle according to an embodiment;
fig. 8 is a flowchart of a location information processing method in still another embodiment;
FIG. 9 is a flow diagram of a method for location information processing in one embodiment;
FIG. 10 is a block diagram showing the configuration of a position information processing apparatus according to an embodiment;
fig. 11 is a block diagram of a partial structure of a cellular phone in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like used herein may describe various elements, but these elements are not limited by these terms; the terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a schematic diagram of an application environment of a location information processing method in one embodiment. As shown in fig. 1, the application environment includes a first terminal 120 and a second terminal 140. The first terminal 120 and the second terminal 140 are connected through a network. The camera module in the first terminal 120 can be controlled by the driving component. The first terminal 120 receives the position information obtaining request, controls the camera to rotationally collect an environment image of the position of the first terminal 120 according to the position information obtaining request, and sends the environment image to the second terminal 140, wherein the environment image is used for indicating the second terminal 140 to determine the position of the first terminal 120. The first terminal 120 may be, but is not limited to, a mobile phone, a tablet computer, a personal digital assistant, a wearable device, and the like. The second terminal 140 may be, but is not limited to, a vehicle-mounted device, a handheld device such as a mobile phone, a tablet computer, a personal digital assistant, a wearable device, and the like.
Fig. 2 is a flowchart of a position information processing method in one embodiment. The position information processing method in this embodiment is described by taking the method running on the first terminal of fig. 1 as an example. As shown in fig. 2, the position information processing method includes steps 202 to 206.
Step 202, receiving a position information acquisition request.
The location information is information describing the location where the first terminal is located. Specifically, the location information may be, but is not limited to, azimuth information, street information, and the like. For example, it may be the city, street, or building where the first terminal is located, or the longitude and latitude of the first terminal. The position information acquisition request may be generated by the user clicking a button on the display screen of the first terminal, or by the user pressing a control on the touch screen of the first terminal.
And 204, controlling the camera to rotate to acquire the environment image of the position of the first terminal according to the position information acquisition request.
The first terminal may include a camera module and a driving assembly that drives the camera module. The camera module may include one or more cameras. A camera may include a lens, a base, a sensor, and a circuit board assembled together in this order. When there are two cameras, one may be configured as a front camera and the other as a rear camera, or both may be rear cameras. When there are three or more cameras, one may be configured as a front camera and the others as rear cameras, or two as front cameras and the others as rear cameras. The camera may be built in or external. The environment image is an image, captured by the camera, of the environment surrounding the first terminal. Specifically, the environment image may include objects such as buildings, roads, people, flowers, plants, and trees. The angle through which the camera rotates can be preset according to the user's requirements; specifically, it may be, but is not limited to, 180 degrees, 270 degrees, or 360 degrees.
After the first terminal receives the position information acquisition request, the driving assembly can drive the camera to rotationally acquire an environment image around the position of the first terminal.
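As a concrete illustration of this step, a minimal Kotlin sketch of the rotational capture is given below. The CameraDrive and RotatableCamera interfaces are hypothetical stand-ins for the terminal's proprietary driving-assembly and camera interfaces, which the patent does not specify, and the 360-degree sweep with a 90-degree capture step is an assumption rather than a requirement of the method.

    // Hypothetical hardware abstractions; the patent names the driving assembly
    // and camera module but does not define a programming interface for them.
    interface CameraDrive {
        fun rotateTo(angleDegrees: Int)   // rotate the camera module about its shaft
    }
    interface RotatableCamera {
        fun captureFrame(): ByteArray     // capture one still image
    }

    // Rotate the camera through a preset sweep and collect one environment image
    // per step; called after a position information acquisition request arrives.
    fun captureEnvironmentImages(
        drive: CameraDrive,
        camera: RotatableCamera,
        sweepDegrees: Int = 360,          // preset rotation angle (180, 270, 360, ...)
        stepDegrees: Int = 90             // assumed capture interval
    ): List<ByteArray> {
        val frames = mutableListOf<ByteArray>()
        var angle = 0
        while (angle < sweepDegrees) {
            drive.rotateTo(angle)
            frames += camera.captureFrame()
            angle += stepDegrees
        }
        return frames
    }
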
Step 206, sending the environment image to the second terminal, where the environment image is used to instruct the second terminal to determine the location of the first terminal.
The first terminal can send the environment image to the second terminal through a wireless communication technology such as 2G/3G/4G/WiFi (Wireless Fidelity). Specifically, the first terminal may send the environment image to the second terminal directly over the network, or may upload it to a server over the network and have the server forward it to the second terminal. The second terminal can determine the specific positions of the first terminal and its user from information in the environment image such as the user's face, surrounding buildings, guideboards, and their positions. For example, suppose two persons A and B agree to meet near an intersection but are standing in two different quadrants of the intersection. Person A triggers a position information acquisition request so that the camera is controlled to rotate and capture an environment image of the position where the first terminal is located, and the image is sent to the second terminal held by person B. Person B can then determine person A's specific position from the environment around person B, the building information around person A in the environment image, and the corresponding distances.
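For illustration only, the following Kotlin sketch shows one possible upload path for the environment image, assuming it is relayed to the second terminal through a server over plain HTTP; the endpoint URL, the content type, and the use of java.net.HttpURLConnection are assumptions and not part of the patent.

    import java.net.HttpURLConnection
    import java.net.URL

    // Upload one environment image so that a server can forward it to the
    // second terminal; the endpoint and payload layout are hypothetical.
    fun sendEnvironmentImage(imageBytes: ByteArray, endpoint: String): Boolean {
        val connection = URL(endpoint).openConnection() as HttpURLConnection
        return try {
            connection.requestMethod = "POST"
            connection.doOutput = true
            connection.setRequestProperty("Content-Type", "application/octet-stream")
            connection.outputStream.use { it.write(imageBytes) }
            connection.responseCode in 200..299
        } finally {
            connection.disconnect()
        }
    }
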
According to the position information processing method provided by the embodiment of the application, the camera is controlled to rotate to collect the environment image of the position of the first terminal after the position information acquisition request is received, the environment image is sent to the second terminal, and the second terminal can determine the position of the first terminal according to the environment image. The second terminal can determine the specific position of the first terminal according to the environment image of the position of the first terminal, acquired by the first terminal camera in a rotating mode, so that the accuracy of the position information is improved.
As shown in fig. 3, in a location information processing method provided in an embodiment, the process of controlling a camera to acquire an environment image of a location where a first terminal is located according to a location information acquisition request further includes:
step 302, the camera is turned on according to the position information acquisition request.
Specifically, the first terminal may turn on at least one camera according to the position information acquisition request. The display screen of the first terminal can display the image range and the image information which can be collected by the camera.
Step 304, controlling the camera to move from a first position to a second position on the first terminal.
The second position may be at the top of the first terminal. The first terminal includes a display screen and a rear housing connected to each other, and the camera may be disposed at a first position between the display screen and the rear housing. When the first terminal turns on the camera, the driving assembly is controlled to drive the camera from the first position inside the rear housing to a second position outside the rear housing. When the camera is at the second position, it can rotate about a rotating shaft connecting the camera and the housing of the first terminal.
And step 306, shooting an environment image around the first terminal by rotating the camera.
When the camera is at the second position, the first terminal can control the camera, through the driving assembly that drives the camera module, to rotate and shoot an environment image around the first terminal.
As shown in fig. 4, in one embodiment, the first terminal 400 includes a main housing, a camera module 440 and a driving assembly connected to the main housing, a battery disposed inside the main housing, and a processor and a communication unit coupled to the battery. The main housing comprises a display screen 410, a rear housing 420, and a side wall 430 between them, which together enclose an accommodating cavity. The main housing is provided with a groove in which the camera module 440 is embedded. The driving assembly is disposed between the main housing and the camera module 440 and is used to generate pushing and rotating driving forces of the camera module 440 relative to the main housing, so as to drive the camera module 440 to move from a first position 402 in the groove to a second position 404 outside the main housing and, when the camera module 440 is at the second position, to rotate it about the rotating shaft connecting the camera module 440 and the driving assembly. In fig. 4, the first drawing shows the camera module 440 at the first position 402; the second drawing shows the camera module 440 at the second position 404 viewed from the rear housing 420, where the light incident surface of the camera faces the display screen side; the third drawing shows the camera module 440 at the second position 404 viewed from the display screen 410; the fourth drawing shows the camera module 440, viewed from the side wall 430, after rotating 90 degrees or 270 degrees, where the light incident surface of the camera faces the side of the side wall 430 to which the lens points.
As shown in fig. 5, in another embodiment, the first terminal 500 includes a main housing, a camera module 540 and a driving assembly connected to the main housing, a battery disposed inside the main housing, and a processor and a communication unit coupled to the battery. The main housing includes a display screen 510, a rear housing 520, and a side wall 530 between them, which together enclose an accommodating cavity. The main housing is provided with a groove in which the camera module 540 is embedded. The driving assembly is disposed between the main housing and the camera module 540 and is used to generate pushing and rotating driving forces of the camera module 540 relative to the main housing, so as to drive the camera module 540 to move from a first position in the groove to a second position outside the main housing and, when the camera module 540 is at the second position, to rotate it about the rotating shaft connecting the camera module 540 and the driving assembly. In fig. 5, the first drawing shows the camera module at the first position viewed from the side wall 530; the second drawing shows the camera module 540 at the second position viewed from the display screen 510, where the light incident surface of the camera faces the side of the side wall 530 to which the lens points; the third drawing shows the camera module 540, viewed from the display screen 510, after rotating 90 degrees or 270 degrees, where the light incident surface of the camera faces the display screen 510 side.
The camera is turned on according to the position information acquisition request, controlled to move from the first position to the second position on the first terminal, and rotated to shoot an environment image around the first terminal. In this way the environment around the first terminal can be captured, the position of the first terminal can be accurately determined from the environment image, and the accuracy of the position information is improved.
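The open, extend, rotate, and capture sequence of steps 302 to 306 can be sketched in Kotlin as follows. All interfaces are hypothetical stand-ins for the pop-up camera hardware described above; the patent describes the mechanism but not a programming interface, and the retraction at the end is an assumption.

    // Hypothetical stand-ins for the pop-up camera hardware of steps 302 to 306.
    interface PopUpCameraDrive {
        fun extendToSecondPosition()      // push the module out of the rear housing
        fun rotateTo(angleDegrees: Int)   // rotate about the connecting shaft
        fun retractToFirstPosition()      // pull the module back into the groove
    }
    interface PopUpCamera {
        fun open()
        fun captureFrame(): ByteArray
        fun close()
    }

    fun acquireEnvironmentImages(drive: PopUpCameraDrive, camera: PopUpCamera): List<ByteArray> {
        camera.open()                              // step 302: turn on the camera
        drive.extendToSecondPosition()             // step 304: first position to second position
        val frames = (0 until 360 step 90).map { angle ->   // step 306: rotate and shoot
            drive.rotateTo(angle)
            camera.captureFrame()
        }
        drive.retractToFirstPosition()
        camera.close()
        return frames
    }
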
As shown in fig. 6, in a location information processing method provided in another embodiment, the process of controlling a camera to acquire an environment image of a location where a first terminal is located according to a location information acquisition request may further include:
step 602, detecting a tilt angle of the first terminal.
The inclination angle is an included angle between the first terminal and a horizontal plane. The first terminal may detect the inclination angle of the first terminal through a built-in sensor. Specifically, the sensor built in the first terminal may be a gravity sensor, an acceleration sensor, a gyroscope, or the like, but is not limited thereto.
And step 604, adjusting the shooting direction of the camera according to the inclination angle.
Specifically, the shooting direction is the orientation in which the camera captures an image. For example, when the image captured by the camera is the sky or the ground, the shooting direction is perpendicular to the horizontal plane. The first terminal may adjust the shooting direction of the camera according to its own inclination angle. For example, when the inclination angle of the first terminal is detected to be 30 degrees and the target image to be shot is the sky, the first terminal may adjust the camera to form a 30-degree angle with the first terminal so that the shooting direction of the camera is vertical.
Step 606, the camera is rotated to shoot the environment image around the first terminal.
After the shooting direction is adjusted by the camera, the first terminal can control the camera to rotate through the driving assembly and shoot an environmental image around the first terminal.
In one implementation, controlling the camera according to the location information acquisition request to collect the environment image of the location where the first terminal is located further includes: adjusting the shooting direction of the camera to the horizontal direction.
In daily life, when a user holds the first terminal in the hand, the inclination angle of the first terminal is usually acute. In that case, the environment image shot by the rotating camera contains many regions, such as sky, ceiling, or ground, that are useless for determining position information, and it is difficult for the second terminal to determine the position of the first terminal from such environmental factors. In one embodiment provided by the application, the first terminal can therefore automatically adjust the shooting direction of the camera to the horizontal direction.
Fig. 7 is a schematic diagram of an embodiment in which the first terminal of fig. 5 forms an oblique angle with the horizontal plane. In this embodiment, the driving assembly is further used to generate a rotating driving force of the camera module relative to the main housing, so as to rotate the camera module about a preset node connected with the driving assembly and make the camera module parallel to the horizontal plane. In fig. 7, the first drawing is a side view of the first terminal 500 after controlling the camera to move to the second position; at this time both the first terminal 500 and its camera module 540 form a certain inclination angle with the horizontal plane. The second drawing is a side view of the first terminal 500 after detecting its inclination angle and adjusting the shooting direction of the camera to the horizontal direction accordingly; the first terminal 500 still forms an inclination angle with the horizontal plane, but the camera module 540 is parallel to the horizontal plane, so that the shooting direction of the camera is horizontal and the light incident surface of the camera faces the side of the side wall to which the lens points. The third drawing is a side view of the first terminal 500 controlling the camera module 540 to rotate to shoot an environment image around the first terminal 500; at this time the camera module 540 and its camera have rotated 90 degrees or 270 degrees, the light incident surface of the camera faces the display screen side, and an environment image in the horizontal direction on the display screen 510 side of the first terminal 500 can be shot.
By detecting the inclination angle of the first terminal, adjusting the shooting direction of the camera to the horizontal direction, and rotating the camera to shoot the environment image around the position of the first terminal, the environment image contains more image information from which the second terminal can accurately determine the position of the first terminal, and the accuracy of the position information is improved.
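One way to realize the tilt compensation described above is sketched below in Kotlin. It assumes the inclination angle is derived from a gravity (accelerometer) reading and that the driving assembly exposes a pitch control; the function and interface names are illustrative and do not come from the patent.

    import kotlin.math.atan2
    import kotlin.math.sqrt

    // Estimate the terminal's inclination angle in degrees from a gravity vector
    // (gx, gy, gz) reported by a built-in sensor: 0 when the terminal lies flat,
    // 90 when it stands upright.
    fun inclinationDegrees(gx: Double, gy: Double, gz: Double): Double {
        val horizontal = sqrt(gx * gx + gy * gy)
        return Math.toDegrees(atan2(horizontal, gz))
    }

    // Hypothetical pitch control exposed by the driving assembly.
    interface CameraPitchDrive {
        fun setPitchRelativeToBody(angleDegrees: Double)
    }

    // Rotate the camera module relative to the terminal body by an amount derived
    // from the terminal's inclination so that the module ends up parallel to the
    // horizontal plane; the exact sign and reference depend on the mechanical layout.
    fun levelCamera(drive: CameraPitchDrive, gx: Double, gy: Double, gz: Double) {
        drive.setPitchRelativeToBody(inclinationDegrees(gx, gy, gz))
    }
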
As shown in fig. 8, in one embodiment, the provided location information processing method further includes:
step 802, acquiring longitude and latitude information of the first terminal.
The latitude and longitude information is the specific latitude and longitude of the first terminal and is the main information used to generate the position information. The first terminal may automatically acquire its latitude and longitude through the Global Positioning System (GPS).
And step 804, generating the position information of the first terminal according to the longitude and latitude information and the environment image.
The first terminal can generate its position information from the acquired latitude and longitude information and the environment image. For example, if the latitude and longitude of the first terminal are 23.1066805 north and 113.3245904 east, the first terminal may generate its position information by combining this latitude and longitude with the environment image; the position information may be a location point of the first terminal, derived from the latitude and longitude, displayed in the environment image collected by the first terminal.
Step 806, sending the location information to the second terminal, where the location information is used to instruct the second terminal to determine the location of the first terminal.
The first terminal sends the position information to the second terminal through the network, and the second terminal determines the position of the first terminal from it. Specifically, the second terminal may determine the approximate location of the first terminal from the latitude and longitude in the position information and, once the second terminal enters the range covered by the environment image, determine the position of the first terminal from the location point of the first terminal displayed in the environment image according to the latitude and longitude.
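As a data-level illustration of steps 802 to 806, the sketch below bundles the GPS latitude and longitude with the environment image, together with an optional pixel coordinate marking the first terminal's point inside that image. The structure and field names are assumptions; the patent does not prescribe a format for the position information.

    // Illustrative container for the position information of step 804.
    data class PositionInfo(
        val latitude: Double,              // e.g. 23.1066805 (north latitude)
        val longitude: Double,             // e.g. 113.3245904 (east longitude)
        val environmentImage: ByteArray,
        val markerX: Int? = null,          // assumed pixel coordinates of the first
        val markerY: Int? = null           // terminal's point in the environment image
    )

    fun buildPositionInfo(latitude: Double, longitude: Double, image: ByteArray): PositionInfo =
        PositionInfo(latitude, longitude, image)

Sending the resulting object to the second terminal can reuse whatever channel carries the environment image itself, for example the HTTP sketch shown earlier; the serialization format is left unspecified here.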
As shown in fig. 9, in another embodiment, the provided location information processing method may further include:
step 902, detecting location information of a first terminal.
The position information includes the latitude and longitude information of the first terminal and the environment image. Specifically, the first terminal may detect its position information in real time after sending the environment image to the second terminal, or it may set a time interval and detect its position information again when the time since the position information was last sent exceeds that interval.
And 904, when the position information of the first terminal changes, controlling the camera to rotate again to acquire the environment image of the first terminal.
A change in the position information of the first terminal may be a change in its latitude and longitude, a change in its surrounding environment, or both. Specifically, the first terminal may set a first threshold and determine that its position information has changed when the change in latitude and longitude exceeds the first threshold. The first terminal may also set a second threshold and determine that its position information has changed when the portion of the surrounding environment that differs from the environment detected last time exceeds the second threshold. When it is determined that the position information of the first terminal has changed, the first terminal controls the camera to rotate again and collect an environment image of its surroundings.
Step 906, the environment image is sent to the second terminal.
The first terminal sends the acquired environment image to the second terminal, and the second terminal can determine the position of the first terminal again according to the new environment image.
When the position information of the first terminal changes, the camera is controlled again to rotate to collect the environment image around the first terminal and send the environment image to the second terminal, so that the second terminal can track the position information of the first terminal in real time, and the situation that the accurate position of the first terminal cannot be determined due to the position change of the first terminal is avoided.
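The two-threshold change test of step 904 can be sketched as follows. The distance measure, the image-difference measure, and the concrete threshold values are all assumptions; the patent only states that a first threshold applies to the latitude and longitude change and a second threshold to the changed portion of the surrounding environment.

    import kotlin.math.abs

    // Decide whether the first terminal's position information has changed enough
    // to trigger a fresh rotational capture (step 904).
    fun positionChanged(
        prevLat: Double, prevLon: Double,
        curLat: Double, curLon: Double,
        changedEnvironmentRatio: Double,    // fraction of the surroundings that differs
        latLonThreshold: Double = 1e-4,     // assumed first threshold, in degrees
        environmentThreshold: Double = 0.3  // assumed second threshold
    ): Boolean {
        val latLonDelta = abs(curLat - prevLat) + abs(curLon - prevLon)
        return latLonDelta > latLonThreshold || changedEnvironmentRatio > environmentThreshold
    }
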
In one embodiment, a location information processing method is provided, in which the second terminal is a vehicle-mounted device or a handheld device.
The vehicle-mounted equipment can be mounted in the vehicle through a bracket or a fixing piece; for example, it can be mounted independently above the center console of the vehicle, where it is convenient to check while driving or waiting, or it can be integrated into other devices of the vehicle. The handheld device may be, but is not limited to, a mobile phone, a tablet computer, a personal digital assistant, a wearable device, and the like. For example, when the user of the first terminal books a taxi through the first terminal, the first terminal may control the camera to rotate and capture an environment image around the position where the first terminal is located and send it to the vehicle-mounted equipment of the taxi or the handheld device of the taxi driver, and the taxi driver can quickly determine the position of the first terminal from the environment image.
In one embodiment, a method for processing location information is provided, and the method is implemented by the following specific steps:
First, a first terminal receives a location information acquisition request. The location information is information describing the location where the first terminal is located. Specifically, the location information may be, but is not limited to, azimuth information, street information, and the like. For example, it may be the city, street, or building where the first terminal is located, or the longitude and latitude of the first terminal. The position information acquisition request may be generated by the user clicking a button on the display screen of the first terminal, or by the user pressing a control on the touch screen of the first terminal.
And then, the first terminal controls the camera to rotationally acquire the environment image of the position of the first terminal according to the position information acquisition request. After the first terminal receives the position information acquisition request, the driving assembly can drive the camera to rotationally acquire an environment image around the position of the first terminal. The first terminal may include a camera module and a driving assembly driving the camera module. The camera module may include one or more cameras. The camera may include a lens, a base, a sensor, and a circuit board assembled together in this order. The camera can be a built-in camera or an external camera. The environment image refers to an image collected by the camera and related to the environment around the camera. Specifically, the environment image may include objects such as buildings, roads, people, flowers, plants, trees, and the like. The angle of rotation of the camera can be preset according to the requirements of users. Specifically, the angle by which the camera is rotated may be 180 degrees, 270 degrees, 360 degrees, or the like, but is not limited thereto.
Optionally, the first terminal turns on the camera according to the position information acquisition request, controls the camera to move from the first position to the second position on the first terminal, and rotates the camera to shoot an environment image around the first terminal. The first terminal may turn on at least one camera according to the request, and its display screen can display the range and content of the images that the camera can collect. The second position may be at the top of the first terminal. The first terminal includes a display screen and a rear housing connected to each other, and the camera module may be disposed at a first position between the display screen and the rear housing. When the first terminal turns on the camera, the driving assembly is controlled to move the camera from the first position inside the rear housing to the second position outside the rear housing, after which the driving assembly drives the camera to rotate and shoot an environment image around the first terminal.
Optionally, the first terminal detects its inclination angle, adjusts the shooting direction of the camera according to the inclination angle, and rotates the camera to shoot an environment image around the first terminal. The inclination angle is the angle between the first terminal and the horizontal plane, and may be detected through a built-in sensor such as a gravity sensor, an acceleration sensor, or a gyroscope, but is not limited thereto. The shooting direction is the orientation in which the camera captures an image; for example, when the image captured by the camera is the sky or the ground, the shooting direction is perpendicular to the horizontal plane. The first terminal may adjust the shooting direction of the camera according to its inclination angle. For example, when the inclination angle of the first terminal is detected to be 30 degrees and the target image to be shot is the sky, the first terminal may adjust the camera to form a 30-degree angle with the first terminal so that the shooting direction of the camera is vertical. After the shooting direction has been adjusted, the first terminal can drive the camera through the driving assembly to rotate and shoot an environment image around the first terminal.
Optionally, the first terminal adjusts the shooting direction of the camera to the horizontal direction. By detecting the inclination angle of the first terminal, adjusting the shooting direction of the camera to the horizontal direction, and rotating the camera to shoot the environment image around the first terminal, the environment image contains more image information from which the second terminal can accurately determine the position of the first terminal, and the accuracy of the position information is improved.
Then, the first terminal sends an environment image to the second terminal, and the environment image is used for indicating the second terminal to determine the position of the first terminal. The first terminal can send the environment image to the second terminal through Wireless communication technology such as 2G/3G/4G/WiFi (Wireless Fidelity). Specifically, the first terminal may directly send the environment image to the second terminal through the network, or may upload the environment image to the server through the network, and then send the environment image to the second terminal through the server. The second terminal can acquire the specific positions of the first terminal and the user thereof according to the information of the face, surrounding buildings, guideboards, the position and the like of the user in the environment image.
Optionally, the first terminal acquires latitude and longitude information of the first terminal, generates position information of the first terminal according to the latitude and longitude information and the environment image, and sends the position information to the second terminal, wherein the position information is used for indicating the second terminal to determine the position of the first terminal. The latitude and longitude information is specific latitude and longitude and is main information for generating the position information. The first terminal can automatically acquire longitude and latitude information corresponding to the first terminal through a GPS. The first terminal may generate the location information of the first terminal according to the latitude and longitude information of the first terminal in combination with the environmental image, and the location information of the first terminal may be a location point at which the first terminal is displayed in the environmental image collected by the first terminal according to the latitude and longitude information. The first terminal sends the position information to the second terminal through the network, and the second terminal determines the position of the first terminal according to the position information of the first terminal. The second terminal may determine an approximate position of the first terminal according to the latitude and longitude information in the position information, and when the second terminal enters an environment image range of the first terminal, determine the position of the first terminal according to a position point of the first terminal displayed in the environment image by the latitude and longitude information.
Optionally, the first terminal detects its position information; when the position information changes, the camera is controlled to rotate again to collect an environment image of the first terminal, and the environment image is sent to the second terminal. The position information includes the latitude and longitude information of the first terminal and the environment image. Specifically, the first terminal may detect its position information in real time after sending the environment image to the second terminal, or it may set a time interval and detect its position information again when the time since the position information was last sent exceeds that interval. When it is determined that the position information of the first terminal has changed, the first terminal controls the camera to rotate again to collect an environment image of its surroundings and sends it to the second terminal, and the second terminal can re-determine the position of the first terminal from the new environment image.
Optionally, the second terminal is a vehicle-mounted device or a handheld device. The vehicle-mounted equipment can be arranged in the vehicle through a support or a fixing piece, for example, the vehicle-mounted equipment can be independently arranged above a central console of the vehicle, is convenient for checking operation in the driving and waiting processes, and can also be integrated in other devices of the vehicle. The handheld device may be, without limitation, a cell phone, a tablet, a personal digital assistant, a wearable device, and the like. For example, when a user of the first terminal reserves a taxi through the first terminal, the first terminal may control the camera to rotate and acquire an environment image around the position where the first terminal is located and send the environment image to vehicle-mounted equipment of the taxi or handheld equipment of a taxi driver, and the taxi driver may quickly determine the position of the first terminal through the environment image.
It should be understood that although the steps in the flowcharts of figs. 2, 3, 6, 8 and 9 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, there is no strict ordering restriction on these steps, and they may be performed in other orders. Moreover, at least some of the steps in figs. 2, 3, 6, 8 and 9 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Fig. 10 is a block diagram showing a configuration of a position information processing apparatus according to an embodiment. As shown in fig. 10, a positional information processing apparatus includes a request receiving module 1002, an image acquiring module 1004, and a transmitting module 1006. Wherein:
the request receiving module 1002 is configured to receive a location information obtaining request.
And the image obtaining module 1004 is configured to control the camera to rotate to collect an environment image of the location of the first terminal according to the location information obtaining request.
A sending module 1006, configured to send the environment image to the second terminal, where the environment image is used to instruct the second terminal to determine the location of the first terminal.
The position information processing device in this embodiment controls the camera to rotate to acquire the environment image of the position where the first terminal is located by receiving the position information acquisition request, and sends the environment image to the second terminal, so that the second terminal accurately determines the position of the first terminal according to the environment image, and the accuracy of the position information is improved.
In one embodiment, the image obtaining module 1004 may be further configured to turn on the camera according to the position information obtaining request, control the camera to move from a first position to a second position on the first terminal, and rotate the camera to capture an image of an environment around the first terminal.
In one embodiment, the image obtaining module 1004 may be further configured to detect a tilt angle of the first terminal, adjust a shooting direction of the camera according to the tilt angle, and rotate the camera to shoot an environment image around the first terminal.
In one embodiment, the image acquisition module 1004 may also be used to adjust the shooting direction of the camera to a horizontal direction.
In one embodiment, the location information processing apparatus further includes a location generating module 1008, configured to obtain longitude and latitude information of the first terminal, generate location information of the first terminal according to the longitude and latitude information and the environment image, and send the location information to the second terminal, where the location information is used to instruct the second terminal to determine the location of the first terminal.
In one embodiment, the image obtaining module 1004 is further configured to detect the position information of the first terminal, and when the position information of the first terminal changes, control the camera to rotate again to capture the environment image of the first terminal, and send the environment image to the second terminal.
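For orientation only, the module split of fig. 10 can be expressed as a simple composition, sketched below; the function types and names are illustrative and are not taken from the patent.

    // Illustrative composition mirroring the request receiving, image acquisition,
    // location generating, and sending modules of fig. 10.
    class PositionInfoProcessor(
        private val receiveRequest: () -> Boolean,          // request receiving module 1002
        private val acquireImage: () -> ByteArray,          // image acquisition module 1004
        private val generateLocation: (ByteArray) -> Any,   // location generating module 1008
        private val send: (Any) -> Unit                     // sending module 1006
    ) {
        fun processOnce() {
            if (receiveRequest()) {                // a position information request arrived
                val image = acquireImage()         // rotate the camera and capture the environment
                val info = generateLocation(image) // combine with latitude and longitude
                send(info)                         // deliver to the second terminal
            }
        }
    }
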
The division of each module in the position information processing apparatus is only used for illustration, and in other embodiments, the position information processing apparatus may be divided into different modules as needed to complete all or part of the functions of the position information processing apparatus.
For specific limitations of the position information processing apparatus, reference may be made to the above limitations of the position information processing method, which are not described herein again. Each module in the position information processing apparatus may be wholly or partially implemented by software, hardware, or a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
The implementation of each module in the position information processing apparatus provided in the embodiment of the present application may be in the form of a computer program. The computer program may be run on a mobile terminal. Program modules constituted by such computer programs may be stored on the memory of the mobile terminal. The computer program, when executed by a processor, implements the steps of the location information processing method described in the embodiments of the present application.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the location information processing method.
A computer program product containing instructions is also provided which, when run on a computer, causes the computer to perform the position information processing method.
The embodiment of the application also provides a mobile terminal. As shown in fig. 11, for convenience of explanation, only the parts related to the embodiments of the present application are shown; for technical details that are not disclosed, please refer to the method part of the embodiments of the present application. The mobile terminal may be any terminal device such as a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, or a wearable device. The following description takes a mobile phone as an example:
fig. 11 is a block diagram of a partial structure of a mobile phone related to a mobile terminal according to an embodiment of the present application. Referring to fig. 11, the cellular phone includes: radio Frequency (RF) circuitry 1110, memory 1120, input unit 1130, display unit 1140, sensors 1150, audio circuitry 1160, wireless fidelity (WiFi) module 1170, processor 1180, and power supply 1190. Those skilled in the art will appreciate that the handset configuration shown in fig. 11 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The RF circuit 1110 may be configured to receive and transmit signals during information transmission and reception or during a call. It may receive downlink information from a base station and pass it to the processor 1180 for processing, and may also transmit uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 1110 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 1120 may be used to store software programs and modules, and the processor 1180 may execute various functional applications and data processing of the mobile phone by operating the software programs and modules stored in the memory 1120. The memory 1120 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as an application program for a sound playing function, an application program for an image playing function, and the like), and the like; the data storage area may store data (such as audio data, an address book, etc.) created according to the use of the mobile phone, and the like. Further, the memory 1120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The input unit 1130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone 1100. Specifically, the input unit 1130 may include a touch panel 1131 and other input devices 1132. Touch panel 1131, which may also be referred to as a touch screen, can collect touch operations of a user on or near the touch panel 1131 (for example, operations of the user on or near touch panel 1131 by using any suitable object or accessory such as a finger or a stylus pen), and drive corresponding connection devices according to a preset program. In one embodiment, the touch panel 1131 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1180, and can receive and execute commands sent by the processor 1180. In addition, the touch panel 1131 can be implemented by using various types, such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 1130 may include other input devices 1132 in addition to the touch panel 1131. In particular, other input devices 1132 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), and the like.
The display unit 1140 may be used to display information input by the user or information provided to the user and various menus of the cellular phone. The display unit 1140 may include a display panel 1141. In one embodiment, the Display panel 1141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. In one embodiment, touch panel 1131 can cover display panel 1141, and when touch panel 1131 detects a touch operation thereon or nearby, the touch operation is transmitted to processor 1180 to determine the type of touch event, and then processor 1180 provides a corresponding visual output on display panel 1141 according to the type of touch event. Although in fig. 11, the touch panel 1131 and the display panel 1141 are two independent components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1131 and the display panel 1141 may be integrated to implement the input and output functions of the mobile phone.
The cell phone 1100 may also include at least one sensor 1150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1141 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1141 and/or the backlight when the mobile phone is moved to the ear. The motion sensor may include an acceleration sensor, which can detect the magnitude of acceleration in each direction and, when the mobile phone is stationary, the magnitude and direction of gravity; it can be used for applications that recognize the posture of the mobile phone (such as switching between portrait and landscape), vibration-recognition functions (such as a pedometer or tap detection), and the like. The mobile phone may also be provided with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor.
Audio circuitry 1160, speaker 1161, and microphone 1162 may provide an audio interface between the user and the mobile phone. The audio circuit 1160 may transmit the electrical signal converted from received audio data to the speaker 1161, which converts it into a sound signal for output. Conversely, the microphone 1162 converts a collected sound signal into an electrical signal, which the audio circuit 1160 receives and converts into audio data; the audio data is then processed by the processor 1180 and either sent to another mobile phone through the RF circuit 1110 or output to the memory 1120 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 1170, the cell phone can help the user to send and receive e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 11 shows the WiFi module 1170, it is to be understood that it is not a necessary part of the handset 1100 and may be omitted as needed.
The processor 1180 is the control center of the mobile phone. It connects the various parts of the whole phone through various interfaces and lines, and performs the various functions of the phone and processes data by running or executing the software programs and/or modules stored in the memory 1120 and calling the data stored in the memory 1120, thereby monitoring the phone as a whole. In one embodiment, the processor 1180 may include one or more processing units. In one embodiment, the processor 1180 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 1180.
The cell phone 1100 also includes a power supply 1190 (such as a battery) for supplying power to the various components. The power supply may be logically coupled to the processor 1180 through a power management system, so that charging, discharging, and power consumption are managed through the power management system.
In one embodiment, the cell phone 1100 may also include a camera, a bluetooth module, and the like.
In the embodiments of the present application, the processor 1180 included in the mobile terminal implements the steps of the position information processing method when executing the computer program stored in the memory.
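Purely as an illustration of how those steps chain together at run time (and not as the implementation actually stored in the memory 1120), the following Kotlin sketch mirrors the flow recited in the claims below; the CameraMount, TiltSensor and Link interfaces and every other identifier in it are hypothetical names introduced here, not names from the patent.

interface CameraMount {
    fun extendOutOfHousing()                 // first position inside the housing -> second position outside it
    fun setPitchDegrees(pitch: Double)       // adjust the shooting direction of the camera
    fun rotateAndCapture(): List<ByteArray>  // rotate about the shaft and shoot the surroundings
}

interface TiltSensor {
    fun inclinationDegrees(): Double         // inclination angle of the first terminal
}

interface Link {
    fun sendEnvironmentImages(images: List<ByteArray>)
}

class PositionInfoProcessor(
    private val camera: CameraMount,
    private val tilt: TiltSensor,
    private val secondTerminal: Link
) {
    /** Handles a position information acquisition request received from the second terminal. */
    fun onPositionInfoRequest() {
        camera.extendOutOfHousing()                        // push the camera out of the main housing
        camera.setPitchDegrees(-tilt.inclinationDegrees()) // compensate the tilt so the shooting direction is horizontal
        val environmentImages = camera.rotateAndCapture()  // capture the environment around the first terminal
        secondTerminal.sendEnvironmentImages(environmentImages)
    }
}

fun main() {
    val processor = PositionInfoProcessor(
        camera = object : CameraMount {
            override fun extendOutOfHousing() = println("camera extended out of the housing")
            override fun setPitchDegrees(pitch: Double) = println("pitch adjusted by $pitch degrees")
            override fun rotateAndCapture(): List<ByteArray> {
                println("camera rotated; environment images captured")
                return listOf(ByteArray(0))
            }
        },
        tilt = object : TiltSensor {
            override fun inclinationDegrees() = 12.0
        },
        secondTerminal = object : Link {
            override fun sendEnvironmentImages(images: List<ByteArray>) =
                println("${images.size} environment image(s) sent to the second terminal")
        }
    )
    processor.onPositionInfoRequest()
}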
As used herein, a "communication terminal" (or simply "terminal") includes, but is not limited to, a device that is configured to receive/transmit communication signals via a wireline connection, such as via a Public Switched Telephone Network (PSTN), a Digital Subscriber Line (DSL), a digital cable, a direct cable connection, and/or another data connection/network, and/or via a wireless interface (e.g., for a cellular network, a Wireless Local Area Network (WLAN), a digital television network such as a DVB-H network, a satellite network, an AM-FM broadcast transmitter, and/or another communication terminal). A communication terminal arranged to communicate over a wireless interface may be referred to as a "wireless communication terminal", "wireless terminal" or "mobile terminal". Examples of mobile terminals include, but are not limited to, satellite or cellular telephones; a Personal Communications System (PCS) terminal that may combine a cellular radiotelephone with data processing, facsimile and data communications capabilities; PDAs that may include radiotelephones, pagers, internet/intranet access, Web browsers, notepads, calendars, and/or Global Positioning System (GPS) receivers; and conventional laptop and/or palmtop receivers or other electronic devices that include a radiotelephone transceiver. The mobile phone is a mobile terminal equipped with a cellular communication module.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application; their description is specific and detailed, but they should not therefore be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A position information processing method, characterized by comprising:
receiving a position information acquisition request;
starting a camera according to the position information acquisition request, and controlling the camera to move from a first position inside a main housing of a first terminal to a second position outside the main housing;
detecting an inclination angle of the first terminal, and adjusting a shooting direction of the camera to a horizontal direction according to the inclination angle;
rotating the camera through a rotating shaft connected between the camera and the main housing to shoot an environment image around the first terminal;
and sending the environment image to a second terminal, wherein the environment image is used to instruct the second terminal to determine the position of the first terminal.
2. The method of claim 1, further comprising:
acquiring longitude and latitude information of the first terminal;
generating position information of the first terminal according to the longitude and latitude information and the environment image;
and sending the position information to the second terminal, wherein the position information is used to instruct the second terminal to determine the position of the first terminal.
3. The method of claim 1, further comprising:
detecting position information of the first terminal;
when the position information of the first terminal changes, controlling the camera to rotate again to acquire an environment image of the first terminal;
and sending the environment image to the second terminal.
4. The method according to claim 3, wherein detecting the position information of the first terminal comprises any one of the following manners:
detecting the position information of the first terminal in real time;
and detecting the position information of the first terminal at regular intervals.
5. The method according to claim 3, wherein the condition for determining that the position information of the first terminal has changed comprises at least one of: a change in the longitude and latitude information of the first terminal, and a change in the surrounding environment of the first terminal.
6. The method of claim 1, wherein the first terminal comprises a main housing, a camera module coupled to the main housing, and a drive assembly;
the main shell is provided with a groove, and the camera module is embedded in the groove;
the drive assembly is located the main casing body with between the camera module, drive assembly is used for producing the propelling drive power of camera module for the main casing body to drive the camera module and remove the outer second position of main casing from the primary importance that is in the recess, drive assembly still is used for producing the rotary driving power of camera module for the main casing body, uses the pivot that camera module and drive assembly are connected to rotate as the axle center when the camera module is located outside the main casing body, makes camera module and horizontal plane parallel, and uses the pivot that camera module and drive assembly are connected to rotate as the axle center when the camera module is parallel with the horizontal plane.
7. The method of claim 1, wherein the second terminal is a vehicle-mounted device or a handheld device.
8. A positional information processing apparatus characterized by comprising:
the request receiving module is used for receiving a position information acquisition request;
the image acquisition module is used for starting a camera according to the position information acquisition request, controlling the camera to move from a first position inside a main housing of a first terminal to a second position outside the main housing, detecting an inclination angle of the first terminal, adjusting a shooting direction of the camera to a horizontal direction according to the inclination angle, and rotating the camera through a rotating shaft connected between the camera and the main housing to shoot an environment image around the first terminal;
and the sending module is used for sending the environment image to a second terminal, wherein the environment image is used to instruct the second terminal to determine the position of the first terminal.
9. A mobile terminal comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the position information processing method according to any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
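As a postscript to the claims above (and not part of the claims themselves), the optional steps of claims 2 to 5 can be pictured in code. The following Kotlin fragment is a rough, assumption-laden sketch: PositionMonitor, the change threshold of roughly ten metres, the 30-second detection period, and all other identifiers are invented for illustration and appear nowhere in the patent.

import kotlin.concurrent.fixedRateTimer
import kotlin.math.abs

data class PositionInfo(val longitude: Double, val latitude: Double, val environmentImages: List<ByteArray>)

class PositionMonitor(
    private val readFix: () -> Pair<Double, Double>,        // current longitude/latitude of the first terminal
    private val captureEnvironment: () -> List<ByteArray>,  // rotate the camera and shoot the surroundings
    private val sendToSecondTerminal: (PositionInfo) -> Unit,
    private val thresholdDegrees: Double = 1e-4             // roughly ten metres; an assumed change threshold
) {
    private var lastFix: Pair<Double, Double>? = null

    /** Claim 2: generate position information from longitude/latitude plus the environment image. */
    fun report() {
        val fix = readFix()
        lastFix = fix
        sendToSecondTerminal(PositionInfo(fix.first, fix.second, captureEnvironment()))
    }

    /** Claims 3-5: detect the position at regular intervals and re-capture when it has changed. */
    fun startTimedDetection(periodMs: Long = 30_000L) {
        fixedRateTimer(name = "position-detection", period = periodMs) {
            val fix = readFix()
            val previous = lastFix
            if (previous == null ||
                abs(fix.first - previous.first) > thresholdDegrees ||
                abs(fix.second - previous.second) > thresholdDegrees
            ) {
                report()
            }
        }
    }
}

fun main() {
    val monitor = PositionMonitor(
        readFix = { 116.3975 to 39.9087 },      // example fix only
        captureEnvironment = { listOf(ByteArray(0)) },
        sendToSecondTerminal = { info -> println("sent ${info.environmentImages.size} image(s) at ${info.longitude}, ${info.latitude}") }
    )
    monitor.report()  // one-shot report; startTimedDetection() would keep monitoring in the background
}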
CN201810386192.2A 2018-04-26 2018-04-26 Position information processing method and device, mobile terminal and storage medium Active CN110418049B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810386192.2A CN110418049B (en) 2018-04-26 2018-04-26 Position information processing method and device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110418049A CN110418049A (en) 2019-11-05
CN110418049B true CN110418049B (en) 2021-08-17

Family

ID=68345951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810386192.2A Active CN110418049B (en) 2018-04-26 2018-04-26 Position information processing method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110418049B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111554084B (en) * 2020-05-19 2022-01-21 新石器慧通(北京)科技有限公司 Method for searching unmanned vehicle
CN113092674A (en) * 2021-03-30 2021-07-09 维沃移动通信有限公司 Air quality monitoring method and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104580909A (en) * 2015-01-09 2015-04-29 上海小蚁科技有限公司 Image acquisition method and device
CN104822042A (en) * 2015-03-03 2015-08-05 广东欧珀移动通信有限公司 Camera-based pedestrian safety detection method and device
CN105282413A (en) * 2014-07-24 2016-01-27 信泰光学(深圳)有限公司 Image pick-up device, horizontal display control method
CN105578007A (en) * 2015-12-21 2016-05-11 广东欧珀移动通信有限公司 Method and device used for mobile terminal, and mobile terminal
CN106534132A (en) * 2016-11-17 2017-03-22 京东方科技集团股份有限公司 Taxi order-based video processing methods, apparatuses, server and system
CN106646566A (en) * 2017-01-03 2017-05-10 京东方科技集团股份有限公司 Passenger positioning method, device and system
CN106864372A (en) * 2017-03-31 2017-06-20 寅家电子科技(上海)有限公司 Outdoor scene internet is called a taxi accessory system and method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202634544U (en) * 2011-12-20 2012-12-26 信利光电(汕尾)有限公司 Panoramic photography module for mobile phone
US8860818B1 (en) * 2013-07-31 2014-10-14 Apple Inc. Method for dynamically calibrating rotation offset in a camera system
CN204408434U (en) * 2015-02-25 2015-06-17 胡伟 Mobile phone
CN107343064A (en) * 2017-06-27 2017-11-10 努比亚技术有限公司 A kind of mobile terminal of two-freedom rotating camera




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant