WO2021164387A1 - Early warning method and apparatus for target object, and electronic device - Google Patents

Early warning method and apparatus for target object, and electronic device

Info

Publication number
WO2021164387A1
Authority
WO
WIPO (PCT)
Prior art keywords
deflection angle
target object
horizontal deflection
center point
operating data
Prior art date
Application number
PCT/CN2020/135113
Other languages
English (en)
French (fr)
Inventor
张浩
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2021164387A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G01C21/3638 Guidance using 3D or perspective road maps including 3D objects and buildings
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • This application relates to the technical field of smart terminals, and in particular to early warning methods, devices and electronic equipment for target objects.
  • Figure 1a shows AR real-scene navigation, which can provide users with better navigation services.
  • Figure 1b shows AR real-scene navigation supported by a smart car box combined with the in-vehicle head unit, which provides users with a better driving and navigation experience. Both greatly facilitate people's lives.
  • In the prior art, technologies such as the Advanced Driving Assistance System (ADAS) and the Mobile Data Center (MDC), combined with radar sensors, warn the user of the collision threat ahead, which increases the comfort and safety of driving.
  • The main principle is as follows: a radar sensor mounted on the vehicle senses the distance between the vehicle ahead and the own vehicle, and, based on the motion of the own vehicle such as its heading and speed, it is predicted whether the own vehicle will collide with the vehicle in front within a preset period of time. If a collision may occur, the vehicle directly ahead is displayed as a threatening vehicle in the AR real-scene navigation screen.
  • However, this early warning method can only warn whether the vehicle directly ahead poses a collision threat; it cannot provide users with a wider range of danger warnings. The warning range is narrow and the user experience is poor.
  • the present application provides an early warning method, device and electronic equipment for a target object, which can provide users with a wider range of dangerous early warnings and improve user experience.
  • an embodiment of the present application provides an early warning method for a target object, including:
  • the expected horizontal deflection angle of the target object is the predicted value of the absolute horizontal deflection angle of the image of the target object in the AR screen;
  • the AR screen is the AR screen displayed by the AR device of the first object;
  • the absolute horizontal deflection angle of each object image in the AR screen is the horizontal angle between the direction in which the shooting point of the AR screen points to the center point of the object image and the direction in which the shooting point points to the center point of the AR screen;
  • the object image corresponding to the target object in the AR screen is obtained, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets the first difference requirement;
  • the object image corresponding to the target object is displayed as an early warning in the AR screen.
  • This method obtains at least one target object that is threatening to the first object, obtains the object image corresponding to the target object in the AR screen through deflection angle comparison, and displays that object image as an early warning, so as to provide users with a wider range of danger warnings and enhance the user experience.
  • obtaining the absolute horizontal deflection angle of each object image in the AR screen includes:
  • the absolute horizontal deflection angle of the object image is calculated by the following formula:
  • y is the angle value of the absolute horizontal deflection angle of the object image
  • L is the total number of pixels in the horizontal direction of the AR image
  • m is the horizontal viewing angle range of the camera of the AR device
  • x is the number of pixels, in the horizontal direction, occupied by the line segment between the center point of the object image and the center point of the AR image.
  • obtaining at least one target object that is threatening to the first object and obtaining the expected horizontal deflection angle of the target object includes:
  • the target object and the expected horizontal deflection angle of the target object are determined by the V2X device according to the operating data of the target object and the operating data of the first object.
  • the V2X device is set on the first object.
  • obtaining at least one target object that is threatening to the first object and obtaining the expected horizontal deflection angle of the target object includes:
  • obtain the operating data of the surrounding objects from the V2X device, and obtain the operating data of the first object from the GNSS device; the V2X device and the GNSS device are set on the first object;
  • At least one target object is obtained from the surrounding objects, and the expected horizontal deflection angle of the target object is calculated.
  • calculating the expected horizontal deflection angle of the target object includes:
  • For each target object, according to the operating data of the target object and the operating data of the first object, calculate the angle between the direction in which the center point of the AR device points to the center point of the target object and the installation direction of the camera of the AR device.
  • The included angle is used as the expected horizontal deflection angle of the target object.
  • Calculating, according to the operating data of the target object and the operating data of the first object, the angle between the direction in which the center point of the AR device points to the center point of the target object and the installation direction of the camera of the AR device includes:
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction;
  • (X_O, Y_O) is the position coordinate of the center point O of the AR device;
  • ∠NOB is the heading angle of the first object;
  • O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  • Calculating, according to the operating data of the target object and the operating data of the first object, the angle between the direction in which the center point of the AR device points to the center point of the GNSS device in the target object and the installation direction of the camera of the AR device includes:
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction.
  • acquiring at least one target object from the surrounding objects according to the operating data of the surrounding objects and the operating data of the first object includes:
  • According to the operating data of the surrounding objects and the operating data of the first object, calculate the length of time after which each surrounding object would collide with the first object if the objects continue to operate according to their operating data.
  • Before acquiring at least one target object from the surrounding objects, the method further includes:
  • According to the operating data of the surrounding objects and the operating data of the first object, select the surrounding objects associated with the first object from the surrounding objects; accordingly,
  • Obtain at least one target object from surrounding objects including:
  • At least one target object is acquired from surrounding objects associated with the first object.
  • If at least two object images corresponding to the target object are obtained in the AR screen, before the object image corresponding to the target object is displayed as an early warning in the AR screen, the method further includes:
  • the expected horizontal deflection angle of the surrounding object is the predicted value of the absolute horizontal deflection angle of the image of the surrounding object in the AR image;
  • the object image with the same sorting order as the sorting order of the target object is selected as the object image corresponding to the target object.
  • obtaining the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object includes:
  • the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object are obtained from the V2X device.
  • the distance and the expected horizontal deflection angle are determined by the V2X device according to the operating data of the target object and the operating data of the first object.
  • obtaining the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object includes:
  • obtain the operating data of the surrounding objects from the V2X device, and obtain the operating data of the first object from the GNSS device; the V2X device and the GNSS device are set on the first object;
  • the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object are calculated.
  • calculating the expected horizontal deflection angle of surrounding objects includes:
  • For each surrounding object, according to the operating data of the surrounding object and the operating data of the first object, calculate the angle between the direction from the center point of the AR device to the center point of the surrounding object and the installation direction of the camera of the AR device, and use the included angle as the expected horizontal deflection angle of the surrounding object.
  • Calculating, according to the operating data of the surrounding object and the operating data of the first object, the angle between the direction in which the center point of the AR device points to the center point of the GNSS device in the surrounding object and the installation direction of the camera of the AR device includes:
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction;
  • (X_O, Y_O) is the position coordinate of the center point O of the AR device;
  • ∠NOB is the heading angle of the first object;
  • O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction.
  • Before calculating the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object, the method further includes:
  • According to the running data of the surrounding objects and the running data of the first object, selecting the surrounding objects associated with the first object from the surrounding objects;
  • correspondingly, calculating the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object includes:
  • an embodiment of the present application provides an early warning device for a target object, including:
  • the expected deflection angle acquisition unit is used to obtain at least one target object that is threatening to the first object, and obtain the expected horizontal deflection angle of the target object;
  • the expected horizontal deflection angle of the target object is the predicted value of the absolute horizontal deflection angle of the image of the target object in the AR screen;
  • the AR screen is the AR screen displayed by the AR device of the first object;
  • the absolute deflection angle acquisition unit is used to acquire the absolute horizontal deflection angle of each object image in the AR screen;
  • the absolute horizontal deflection angle of the object image is the horizontal angle between the direction in which the shooting point of the AR screen points to the center point of the object image and the direction in which the shooting point points to the center point of the AR screen;
  • the image obtaining unit is used to obtain the object image corresponding to the target object in the AR screen according to the expected horizontal deflection angle of the target object obtained by the expected deflection angle acquisition unit and the absolute horizontal deflection angle of each object image obtained by the absolute deflection angle acquisition unit, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets the first difference requirement;
  • the display unit is used for early warning display of the object image corresponding to the target object obtained by the image obtaining unit in the AR screen.
  • the absolute deflection angle acquisition unit is specifically used for:
  • the absolute horizontal deflection angle of the object image is calculated by the following formula:
  • y is the angle value of the absolute horizontal deflection angle of the object image
  • L is the total number of pixels in the horizontal direction of the AR image
  • m is the horizontal viewing angle range of the camera of the AR device
  • x is the number of pixels, in the horizontal direction, occupied by the line segment between the center point of the object image and the center point of the AR image.
  • the expected deflection angle acquisition unit is specifically used for:
  • the target object and the expected horizontal deflection angle of the target object are determined by the V2X device according to the operating data of the target object and the operating data of the first object.
  • the V2X device is set on the first object.
  • the expected deflection angle acquisition unit includes:
  • the data acquisition subunit is used to acquire the operating data of the surrounding objects from the V2X equipment, and to acquire the operating data of the first object from the GNSS equipment; the V2X equipment and the GNSS equipment are set on the first object;
  • the calculation subunit is used to obtain at least one target object from the surrounding objects according to the running data of the surrounding objects and the running data of the first object, and calculate the expected horizontal deflection angle of the target object.
  • calculation subunit is specifically used for:
  • For each target object, according to the operating data of the target object and the operating data of the first object, calculate the angle between the direction in which the center point of the AR device points to the center point of the target object and the installation direction of the camera of the AR device.
  • the included angle is used as the expected horizontal deflection angle of the target object.
  • calculation subunit is specifically used for:
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction;
  • (X_O, Y_O) is the position coordinate of the center point O of the AR device;
  • ∠NOB is the heading angle of the first object;
  • O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  • calculation subunit is specifically used for:
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction.
  • calculation subunit is specifically used for:
  • According to the operating data of the surrounding objects and the operating data of the first object, calculate the length of time after which each surrounding object would collide with the first object if the objects continue to operate according to their operating data.
  • calculation subunit is specifically used for:
  • a surrounding object associated with the first object is selected from the surrounding objects; at least one target object is obtained from the surrounding objects associated with the first object.
  • the expected deflection angle acquisition unit is also used to: obtain the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object; the expected horizontal deflection angle of the surrounding object is the predicted value of the absolute horizontal deflection angle of the image of the surrounding object in the AR screen;
  • the image obtaining unit is further used to: select the surrounding objects whose expected horizontal deflection angle differs from the expected horizontal deflection angle of the target object by an amount that meets the second difference requirement; according to the distance between the selected surrounding objects and the first object and the distance between the target object and the first object, sort the selected surrounding objects together with the target object by distance from small to large to obtain the ranking of the target object; obtain the Y-axis coordinate values, in the AR screen, of the object images corresponding to the target object and sort those object images by coordinate value from small to large; and select the object image whose ranking is the same as the ranking of the target object as the object image corresponding to the target object. A sketch of this disambiguation is given after this item.
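  • The following Python sketch illustrates the disambiguation described in the preceding item; the tuple layouts and the 5-degree value standing in for the "second difference requirement" are illustrative assumptions, not details of the application.

```python
def pick_image_for_target(target, surroundings, candidate_images, second_diff_deg=5.0):
    """Disambiguate when several object images match one target object.
    target: (distance_m, expected_angle_deg) of the target object.
    surroundings: [(distance_m, expected_angle_deg), ...] of surrounding objects.
    candidate_images: [(image_id, y_coordinate), ...] object images whose absolute
    angle already matched the target's expected angle.
    second_diff_deg stands in for the "second difference requirement"."""
    t_dist, t_angle = target

    # Surrounding objects whose expected deflection angle is close to the target's.
    close = [s for s in surroundings if abs(s[1] - t_angle) <= second_diff_deg]

    # Rank the target among these objects by distance to the first object, ascending.
    distances = sorted([t_dist] + [d for d, _ in close])
    rank = distances.index(t_dist)

    # Sort the candidate object images by their Y-axis coordinate, ascending, and
    # pick the image whose rank equals the target's distance rank.
    by_y = sorted(candidate_images, key=lambda img: img[1])
    return by_y[rank] if rank < len(by_y) else None
```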
  • the expected deflection angle acquisition unit is specifically used for:
  • the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object are obtained from the V2X device.
  • the distance and the expected horizontal deflection angle are determined by the V2X device according to the operating data of the target object and the operating data of the first object.
  • the expected deflection angle acquisition unit is specifically used for:
  • obtain the operating data of the surrounding objects from the V2X device, and obtain the operating data of the first object from the GNSS device; the V2X device and the GNSS device are set on the first object;
  • the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object are calculated.
  • the expected deflection angle acquisition unit is specifically used for:
  • For each surrounding object, according to the operating data of the surrounding object and the operating data of the first object, calculate the angle between the direction from the center point of the AR device to the center point of the surrounding object and the installation direction of the camera of the AR device, and use the included angle as the expected horizontal deflection angle of the surrounding object.
  • the expected deflection angle acquisition unit is specifically used to calculate the included angle according to the following formula:
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction;
  • (X_O, Y_O) is the position coordinate of the center point O of the AR device;
  • ∠NOB is the heading angle of the first object;
  • O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  • the expected deflection angle acquisition unit is specifically used to calculate the included angle according to the following formula:
  • ∠O2OA is the included angle;
  • ∠NOA is the horizontal installation angle of the camera relative to the true north direction.
  • the expected deflection angle acquisition unit is specifically configured to: select the surrounding objects associated with the first object from the surrounding objects according to the operating data of the surrounding objects and the operating data of the first object; and calculate the distance between the surrounding objects associated with the first object and the first object, and the expected horizontal deflection angle of the surrounding objects associated with the first object.
  • an electronic device including:
  • the expected horizontal deflection angle of the target object is the predicted value of the absolute horizontal deflection angle of the image of the target object in the AR screen;
  • the AR screen is the AR screen displayed by the AR device of the first object;
  • the absolute horizontal deflection angle of each object image in the AR screen is the horizontal angle between the direction in which the shooting point of the AR screen points to the center point of the object image and the direction in which the shooting point points to the center point of the AR screen;
  • the object image corresponding to the target object in the AR screen is obtained, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets the first difference requirement;
  • the object image corresponding to the target object is displayed as an early warning in the AR screen.
  • an embodiment of the present application provides a computer program, which is used to execute the method of the first aspect when the computer program is executed by a computer.
  • the program in the fourth aspect may be stored in whole or in part on a storage medium packaged with the processor, or may be stored in whole or in part in a memory not packaged with the processor.
  • Figure 1a is a schematic diagram of AR real-scene navigation in the prior art
  • Figure 1b is a schematic diagram of prior-art AR real-scene navigation supported by a smart car box combined with the in-vehicle head unit;
  • Figure 2a is a schematic diagram of a radar sensing model
  • Figure 2b is a schematic diagram of the problems caused by the prior art
  • FIG. 3 is a flowchart of an embodiment of an early warning method for a target object of this application
  • FIG. 4 is a flowchart of another embodiment of an early warning method for a target object of this application.
  • Figure 4a is a schematic diagram of the relationship between the target object of the application and the field of view angle range of the camera;
  • Fig. 4b and Fig. 4c are examples of the display mode of early warning of the application.
  • FIG. 4d is an example diagram of the positional relationship between surrounding objects and the first object in this application.
  • Fig. 4e is a schematic diagram of the deflection angle between the surrounding objects of the application and the first object
  • Figure 5 is a top view of the first object and the target object of the application.
  • FIG. 6a is a schematic ray diagram of imaging by a camera of this application;
  • FIG. 6b is a schematic diagram of the relationship between the object image and the absolute horizontal deflection angle recognized in the AR image of the application;
  • FIG. 7 is a flowchart of another embodiment of an early warning method for a target object of this application.
  • FIG. 8a is a flowchart of another embodiment of an early warning method for a target object of this application.
  • Fig. 8b is an example diagram of the coordinate values of the object image in the Y-axis direction of the application.
  • FIG. 9 is a flowchart of another embodiment of an early warning method for a target object of this application.
  • FIG. 10 is a diagram of a possible system structure to which the method described in this application is applicable.
  • FIG. 11 is a structural diagram of an embodiment of an early warning device for a target object of this application.
  • FIG. 12 is a schematic structural diagram of an embodiment of an electronic device of this application.
  • ADAS, MDC and other technologies combined with radar sensors can warn the user of the front collision threat in the AR real-world navigation screen.
  • However, existing radar sensors have a typical directional characteristic, namely their performance differs greatly in different directions.
  • As shown in the radar perception model of Figure 2a, the radar sensor that detects vehicles ahead and the radar sensor that detects vehicles to the side generally differ greatly in distance perception performance. If a surrounding vehicle is not directly ahead of the own vehicle but in another direction, the radar sensor's ability to perceive that vehicle is greatly reduced. Therefore, at present it can only be predicted whether the vehicle directly ahead is a threatening vehicle.
  • As shown in Figure 2b, if multiple surrounding vehicles pose a collision threat to the own vehicle, the existing technology cannot accurately detect the most threatening vehicle and mark it in the AR real-scene navigation screen for early warning.
  • In addition, a radar sensor can only perceive the objective attributes of surrounding vehicles, such as their position, but cannot perceive their subjective attributes, such as turn-signal status, braking status, and fault status. One way to solve this problem is to obtain more objective and subjective attributes of the surrounding vehicles.
  • V2X (Vehicle to Everything), based on LTE-V (Long Term Evolution-Vehicle) or DSRC (Dedicated Short Range Communication), is a general term for the information exchange technologies that connect a vehicle with other vehicles (V2V, Vehicle to Vehicle) and with infrastructure (V2I, Vehicle to Infrastructure).
  • V2X enables two-way information transmission between the vehicle and any entity that may affect the vehicle, for example communication between vehicles and pedestrians, vehicles and vehicles, vehicles and base stations, and base stations and base stations. It can thereby obtain a series of information such as vehicle operating information, real-time road conditions, road information and pedestrian information, improve driving safety, reduce congestion, improve traffic efficiency, and provide in-vehicle entertainment information. Therefore, compared with radar sensors, V2X communication can acquire more objective and subjective attributes of surrounding vehicles and even pedestrians.
  • this application proposes an early warning method, device and electronic equipment for a target object, which combines V2X technology with AR technology to provide users with a wider range of danger warnings in the AR screen and improve user experience.
  • It should be noted that the method of the present application can be applied not only to AR devices that support AR real-scene navigation in vehicles, but also to electronic devices that support AR real-scene navigation, for example as shown in Figure 1a. It can also be applied to AR devices that perform an AR real-scene display and need to warn, in the AR screen, of target objects threatening the first object. For example, a robot equipped with an AR device is the first object; the robot can use the AR device to display an AR real-scene view of the surrounding environment, and the AR device needs to display in the AR screen an early warning of collisions that surrounding objects may cause to the robot.
  • Fig. 3 is a flowchart of an embodiment of an early warning method for a target object of the application. As shown in Fig. 3, the above method may include:
  • Step 301 Obtain at least one target object that is threatening to the first object, and obtain the expected horizontal deflection angle of the target object; the expected horizontal deflection angle of the target object is the predicted value of the absolute horizontal deflection angle of the image of the target object in the AR screen; the AR screen is the AR screen displayed by the AR device of the first object;
  • the first object and the target object can generally be identified by different object IDs.
  • the specific implementation of the object ID is not limited in the embodiments of this application, as long as different objects can be uniquely identified.
  • Step 302 Obtain the absolute horizontal deflection angle of each object image in the AR screen
  • Step 303 Obtain the object image corresponding to the target object in the AR screen according to the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of each object image, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets the first difference requirement;
  • Step 304 Perform an early warning display on the object image corresponding to the target object in the AR screen.
  • The method shown in Figure 3 acquires at least one target object that is threatening to the first object, obtains the object image corresponding to the target object in the AR screen through deflection angle comparison, and displays that object image as an early warning, thereby providing users with a wider range of danger warnings and improving the user experience. A minimal sketch of this flow follows.
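  • The sketch below strings steps 301 to 304 together in Python; the data shapes, the callback, and the 5-degree value standing in for the first difference requirement are illustrative assumptions rather than details of the application.

```python
def warn_in_ar_screen(targets, object_images, show_warning, first_diff_deg=5.0):
    """Illustrative flow of steps 301-304.
    targets: [(target_id, expected_angle_deg), ...]        (step 301)
    object_images: [(image_id, absolute_angle_deg), ...]   (step 302)
    show_warning: callback performing the early-warning display (step 304)."""
    for target_id, expected in targets:
        # Step 303: the object image whose absolute horizontal deflection angle is
        # closest to the expected angle, provided the difference is small enough.
        best = min(object_images, key=lambda img: abs(img[1] - expected), default=None)
        if best is not None and abs(best[1] - expected) <= first_diff_deg:
            show_warning(target_id, best[0])
```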
  • Fig. 4 is a flowchart of another embodiment of an early warning method for a target object of this application.
  • an AR device is provided in the first object to provide the user with an AR screen, and the AR screen is an image of the peripheral area of the first object captured by a camera set in the first object;
  • the AR screen can be a real-scene navigation screen or an AR screen in other non-navigation scenes;
  • the first object is also provided with a V2X device for V2X communication with surrounding objects of the first object; the surrounding objects are objects with V2X communication capabilities.
  • The first object can be a vehicle, a robot, a pedestrian, etc.; the first object can be in motion or at rest; the surrounding objects can be vehicles, pedestrians, robots, or bicycles with V2X communication capabilities, and each surrounding object can be in motion or at rest.
  • the method may include:
  • Step 401 The V2X device of the first object performs V2X communication with surrounding objects, and obtains the operating data of the surrounding objects respectively.
  • the operating data of the object may include, but is not limited to: the operating speed, and/or operating direction, and/or position of the object.
  • the surrounding objects can be identified by the object ID.
  • Electronic devices capable of V2X communication, such as V2X devices, may be provided in the surrounding objects.
  • the V2X device of the first object and the electronic device of the surrounding object can communicate through LTE-V or DSRC when performing V2X communication, which is not limited by this application.
  • V2X devices can broadcast the operating data of their own objects through Basic Safety Message (BSM) messages.
  • BSM messages can include, but are not limited to: the object identification, the operating speed of the object, and/or the operating direction, and/or the location, and/or the acceleration, and/or the predicted path, and/or the historical path, and/or vehicle events, etc. Therefore, the V2X device of the first object in this step can also obtain the operating data of the surrounding objects through BSM messages.
  • the object identifier in the BSM message is generally the identifier of the object sending the BSM message.
  • the position of the object can be represented by latitude and longitude.
  • the running speed can be the driving speed of the vehicle
  • the running direction can be the heading angle of the vehicle.
  • the heading angle of the vehicle is the angle between the running direction of the vehicle and the true north direction.
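  • The operating data carried by such messages can be modeled, for illustration only, with a structure like the following; the field names and units are assumptions of this sketch and do not reproduce the actual BSM encoding.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class OperatingData:
    """Illustrative container for BSM-style operating data; field names and
    units (m/s, degrees from true north, latitude/longitude) are assumptions."""
    object_id: str
    speed_mps: float                                  # operating speed
    heading_deg: float                                # running direction (heading angle)
    lat: float                                        # position: latitude
    lon: float                                        # position: longitude
    acceleration_mps2: Optional[float] = None
    predicted_path: List[Tuple[float, float]] = field(default_factory=list)
    historical_path: List[Tuple[float, float]] = field(default_factory=list)
    vehicle_events: List[str] = field(default_factory=list)
```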
  • Step 402 The V2X device of the first object acquires the operating data of the first object.
  • the V2X device of the first object can obtain the operating data of the first object from the Global Navigation Satellite System (GNSS) device of the first object.
  • the GNSS device of the first object may be set in the AR device, or set in the first object, or set in the V2X device, which is not limited in this application.
  • the GNSS device of the first object may be set at the center point of the first object.
  • GNSS is a collective term for single satellite navigation and positioning systems such as the BeiDou system, the Global Positioning System (GPS), the GLONASS system and the Galileo satellite navigation system; it can also refer to their augmentation systems, and to any combination of the above-mentioned satellite navigation and positioning systems and their augmentation systems. In other words, GNSS is a space-based radio navigation and positioning system that uses artificial satellites as navigation stations.
  • The order of execution between step 401 and step 402 is not limited.
  • Step 403 The V2X device of the first object selects a peripheral object associated with the first object from the peripheral objects according to the operating data of the first object and the operating data of the surrounding objects.
  • For ease of description, the selected surrounding objects associated with the first object are referred to as associated objects below.
  • This step is optional.
  • Step 404 The V2X device of the first object separately calculates the threat degree of each associated object to the first object according to the operating data of the first object and the operating data of the associated object, and selects P associated objects in descending order of the threat degree As the target object; P is a natural number.
  • the value of P is not limited in this application, but generally speaking, only a small number of target objects need to be warned, such as 1 target object or 2 target objects, otherwise, the meaning of warning will be lost.
  • P can be 1, which means that only the most threatening surrounding objects will be followed up for early warning.
  • Step 405 The V2X device of the first object calculates the expected horizontal deflection angle of each target object.
  • the expected horizontal deflection angle of the target object is: the predicted value of the absolute horizontal deflection angle of the image of the target object in the AR picture.
  • the AR image refers to the AR image displayed by the AR device of the first object.
  • the absolute horizontal deflection angle of the object image in the AR screen is the horizontal angle between the direction in which the shooting point of the AR device points to the center point of the object image and the direction in which the shooting point points to the center point of the AR screen.
  • Step 406 The V2X device of the first object sends the expected horizontal deflection angle of the target object to the AR device of the first object.
  • the V2X device can send: the identification of the target object and the expected horizontal deflection angle.
  • Step 407 The AR device of the first object receives the expected horizontal deflection angle of the target object.
  • Step 408 The AR device of the first object sequentially determines whether the expected horizontal deflection angle of each target object is within the horizontal viewing angle range of the AR device, filters out the target objects whose expected horizontal deflection angle is not within the horizontal viewing angle range, and then executes step 409.
  • this step is optional. In a possible implementation, this step can also be performed by the V2X device of the first object.
  • In that case, the V2X device of the first object sends, to the AR device of the first object, only the target objects whose expected horizontal deflection angle is within the horizontal viewing angle range of the AR device, together with their expected horizontal deflection angles.
  • the target objects that are not within the horizontal field of view of the AR device can be filtered out, and unnecessary data processing consumption in subsequent steps can be reduced.
  • Because the camera of the AR device has a certain field of view, and what this application has to do is find, in the AR screen, the object image of the target object threatening the first object, if the target object is not within the horizontal field of view angle range of the camera of the AR device, then the object image of the target object will not appear in the AR screen and the subsequent steps do not need to be performed for it.
  • As shown in Figure 4a, it is clear that both object 1 and object 2 are within the field of view of the camera, but object 3 is not. Therefore, even if object 3 is a target object threatening the first object, it is filtered out by the processing in this step, and there is no need to process object 3 in the subsequent steps.
  • the expected horizontal deflection angle is a predicted value of the absolute horizontal deflection angle of the image of the target object in the AR screen
  • the expected horizontal deflection angle should be less than or equal to m/2, where m is the horizontal viewing angle range of the camera of the AR device.
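  • The check of step 408 therefore reduces to a bound on the magnitude of the expected horizontal deflection angle, as in the short sketch below (angles in degrees; the example values are illustrative).

```python
def within_horizontal_fov(expected_angle_deg, fov_m_deg):
    """True if a target's expected horizontal deflection angle can correspond to an
    image in the AR screen (step 408); fov_m_deg is the horizontal viewing angle m."""
    return abs(expected_angle_deg) <= fov_m_deg / 2.0

# Example with m = 130 degrees (so m/2 = 65 degrees, as in Figure 6b):
# within_horizontal_fov(40.0, 130.0) -> True, within_horizontal_fov(70.0, 130.0) -> False
```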
  • Step 409 The AR device of the first object recognizes the object image in the AR image, and calculates the absolute horizontal deflection angle of each object image in the AR image.
  • Step 410 The AR device of the first object compares the expected horizontal deflection angle of each target object with the absolute horizontal deflection angle to obtain the object image corresponding to each target object in the AR screen, and The difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the corresponding object image meets the first difference requirement.
  • Through the comparison, the object image corresponding to the target object in the AR screen can be found; that is, the object that is threatening to the first object, in other words the object that the user needs to be warned about, is found in the AR screen.
  • the accuracy required for the specific difference between the expected horizontal deflection angle and the absolute horizontal deflection angle can be independently set in practical applications, and this application is not limited.
  • During the comparison, the expected horizontal deflection angle of each target object can be compared with the absolute horizontal deflection angle of each object image in turn to determine whether the difference between the two satisfies the first difference requirement.
  • Step 411 The AR device of the first object performs an early warning display on the object image corresponding to each target object in the AR screen.
  • The warning display can be implemented by means of a graphical user interface (GUI) mechanism such as an on-screen display (OSD).
  • The methods of warning display can include, but are not limited to: setting a special display color for the object image, framing the object image in a special way such as with a box, flashing the object image, and displaying special text such as "warning" or "collision occurs in X seconds" on the object image, as long as the display can draw the user's attention to the object image and achieve a danger warning effect.
  • For example, the object image can be displayed for early warning in the manner shown in Figures 4b and 4c.
  • Figures 4b and 4c are only examples, and are not intended to limit the possible implementations of the early warning display in this application.
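  • The application leaves the rendering of such an overlay open (OSD or another GUI mechanism). As one possible realization only, the framing and text described above could be drawn with OpenCV, which is an assumption of this sketch and not something the application specifies.

```python
import cv2  # assumption: OpenCV is used here purely for illustration

def draw_warning(frame, box, text="warning", color_bgr=(0, 0, 255)):
    """Frame the matched object image and put warning text above it.
    frame: the AR frame as a NumPy image; box: (x, y, w, h) of the object image in pixels."""
    x, y, w, h = box
    cv2.rectangle(frame, (x, y), (x + w, y + h), color_bgr, thickness=3)
    cv2.putText(frame, text, (x, max(y - 10, 0)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, color_bgr, thickness=2)
    return frame
```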
  • Next, the implementation of step 403 will be described.
  • The purpose of the follow-up processing is to warn about the surrounding objects that have images in the AR screen and threaten the first object. Because the camera of the AR device has a limited field of view, many surrounding objects are not within the field of view of the camera; that is to say, those surrounding objects will not appear in the AR screen and their images will not appear in the AR screen. Therefore, it is not necessary to process those surrounding objects in subsequent steps.
  • the surrounding objects can be screened first, and the surrounding objects that may be potentially risky with the first object, that is, the surrounding objects associated with the first object, can be selected, and then the selected surrounding objects can be subsequently processed to reduce The data processing capacity of the method described in this application.
  • a target classification algorithm can be used to filter out the surrounding objects associated with the first object, and filter out the surrounding objects not associated with the first object.
  • the objects associated with the first object in the surrounding objects that is, the associated objects, can be filtered out according to the running direction and the latitude and longitude of the first object and the surrounding objects.
  • Specifically, the positional relationship between a surrounding object and the first object is classified according to a nine-grid position model, and 8 positional relationships of the surrounding object relative to the first object are obtained, namely the 8 directions: directly ahead, directly left, directly right, front left, front right, rear left, rear right, and directly behind.
  • In FIG. 4d the first object is a vehicle as an example, but the first object is not limited to a vehicle, and the model can be applied to any first object.
  • ⁇ 0 is the current running direction of the first object, which can be obtained from the GNSS device of the first object.
  • The longitude and latitude of point A are the longitude and latitude of the surrounding object, assumed to be (x1, y1); the longitude and latitude (x1, y1) of the surrounding object belong to the operating data of the surrounding object and can be obtained from the BSM message sent by the surrounding object;
  • the latitude and longitude of point B are the latitude and longitude of the first object, assumed to be (x0, y0), which can be obtained from the GNSS device in the first object.
  • The specific selection method can be as follows:
  • If the surrounding object and the first object are moving towards each other, the positional relationship is directly left, directly right, rear left, rear right or directly behind, and the actual direction deflection angle is greater than plus or minus 80 degrees, then the surrounding object is judged to be irrelevant to the first object and is filtered out.
  • If the surrounding object is traveling in the same direction as the first object, the positional relationship is directly ahead, front left or front right, and (v1*sinθ - v0) > 0, then the surrounding object is an object that has nothing to do with the first object and is filtered out.
  • If the surrounding object is traveling in the same direction as the first object, the positional relationship is directly behind, rear left or rear right, and (v0 - v1*sinθ) > 0, then the surrounding object is an object that has nothing to do with the first object and is filtered out.
  • v0 is the current running speed of the first object;
  • v1 is the current running speed of the surrounding object; θ is the deflection angle between the surrounding object and the first object shown in Figure 4e.
  • the surrounding objects that are not related to the first object are filtered out, the surrounding objects related to the first object are selected, and the related objects are obtained.
  • the surrounding objects with a speed of 0 can be filtered out first, and then the surrounding objects can be further filtered by the above-mentioned target classification algorithm.
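  • A rough Python sketch of this filtering is given below. The 45-degree sector boundaries used for the nine-grid classification are an assumption of the sketch; whether the objects are oncoming or traveling in the same direction, and the deflection angle θ of Figure 4e, are supplied by the caller rather than derived here.

```python
import math

SECTORS = ("ahead", "front_right", "right", "rear_right",
           "behind", "rear_left", "left", "front_left")

def relative_sector(bearing_to_object_deg, heading0_deg):
    """Nine-grid classification of a surrounding object into one of the 8 directions
    around the first object; the 45-degree sector width is an illustrative assumption."""
    rel = (bearing_to_object_deg - heading0_deg + 360.0) % 360.0
    return SECTORS[int(((rel + 22.5) % 360.0) // 45.0)]

def keep_as_associated(sector, oncoming, same_direction, theta_deg, v0, v1, cutoff_deg=80.0):
    """Apply the three filtering rules of step 403; oncoming / same_direction are booleans
    supplied by the caller, theta_deg is the deflection angle of Figure 4e, and
    v0 / v1 are the speeds of the first object and the surrounding object."""
    s = math.sin(math.radians(theta_deg))
    if oncoming and theta_deg > cutoff_deg and sector in (
            "left", "right", "rear_left", "rear_right", "behind"):
        return False   # moving towards each other but off to the side or behind
    if same_direction and sector in ("ahead", "front_left", "front_right") and v1 * s - v0 > 0:
        return False   # ahead in the same direction and pulling away
    if same_direction and sector in ("behind", "rear_left", "rear_right") and v0 - v1 * s > 0:
        return False   # behind in the same direction and falling back
    return True
```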
  • Next, the implementation of step 404 will be described.
  • In a possible implementation, a predictive path collision algorithm (Path Prediction Method) can be used to calculate the threat degree of an associated object to the first object. The algorithm can also calculate the threat degree of any surrounding object to the first object. In the following, it is assumed that the second object is any associated object; it should be noted that, if step 403 is not performed, the second object can be any surrounding object.
  • the main principles of the predictive path collision algorithm are:
  • x0t represents the predicted longitude of the first object after the duration t;
  • y0t represents the predicted latitude of the first object after the duration t;
  • x0 represents the current longitude of the first object;
  • y0 represents the current latitude of the first object;
  • v0 represents the current speed of the first object;
  • t represents the duration;
  • R0 represents the predicted curvature of the first object;
  • θ0 represents the heading angle of the first object.
  • xit represents the predicted longitude of the second object after the duration t;
  • yit represents the predicted latitude of the second object after the duration t;
  • xi represents the current longitude of the second object;
  • yi represents the current latitude of the second object;
  • vi represents the current speed of the second object;
  • t represents the duration;
  • Ri represents the predicted curvature of the second object;
  • θi represents the heading angle of the second object.
  • A distance threshold can be set according to the size of the objects, and the time length T at which the distance between the first object and the second object becomes smaller than the preset distance threshold can be obtained. If the predicted distance between the second object and the first object is smaller than the preset distance threshold, it indicates that the second object will collide with the first object after the time period T has passed.
  • It should be noted that at least one of the first object and the second object should be a moving object; if the first object and the second object are both stationary objects, the above predictive path collision algorithm does not apply.
  • Moving objects can include, but are not limited to, vehicles, pedestrians, robots, or bicycles capable of V2X communication;
  • stationary objects can include, but are not limited to, vehicles, pedestrians, robots, or bicycles capable of V2X communication, etc.
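  • The position-prediction formulas of the application (which use the predicted curvature R) are not reproduced in this text. The sketch below illustrates the same idea with a simplified straight-line prediction; the threshold, horizon and step values are assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0

def predict_position(lat_deg, lon_deg, speed_mps, heading_deg, t_s):
    """Simplified straight-line position prediction after t_s seconds; the
    application's formulas additionally use the predicted curvature R."""
    d = speed_mps * t_s
    dlat = d * math.cos(math.radians(heading_deg)) / EARTH_RADIUS_M
    dlon = d * math.sin(math.radians(heading_deg)) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

def time_to_collision(first, second, threshold_m=3.0, horizon_s=10.0, step_s=0.1):
    """Return the first time T at which the predicted distance between the two objects
    drops below the distance threshold, or None if that never happens in the horizon.
    first / second: dicts with lat, lon, speed_mps, heading_deg."""
    t = 0.0
    while t <= horizon_s:
        lat0, lon0 = predict_position(first["lat"], first["lon"],
                                      first["speed_mps"], first["heading_deg"], t)
        lat1, lon1 = predict_position(second["lat"], second["lon"],
                                      second["speed_mps"], second["heading_deg"], t)
        # Small-distance approximation of the separation in metres.
        dy = math.radians(lat1 - lat0) * EARTH_RADIUS_M
        dx = math.radians(lon1 - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
        if math.hypot(dx, dy) < threshold_m:
            return t
        t += step_s
    return None
```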
  • Next, the implementation of step 405 will be described.
  • In a possible implementation, for each target object, the angle between the direction in which the center point of the AR device points to the center point of the target object and the installation direction of the camera of the AR device can be calculated based on the operating data of the target object and the operating data of the first object, and the included angle is taken as the expected horizontal deflection angle of the target object.
  • A similar method can also be used for any surrounding object: based on the operating data of the surrounding object and the operating data of the first object, calculate the angle between the direction in which the center point of the AR device points to the center point of the surrounding object and the installation direction of the camera of the AR device, and use that angle as the expected horizontal deflection angle of the surrounding object.
  • Figure 5 shows a schematic top view of the first object and the target object.
  • Point O is the center point of the AR device of the first object
  • OA is the installation direction of the AR device's camera
  • OL and OR are the left and right boundary lines of the camera's field of view, respectively
  • O1 is the center point of the GNSS device of the first object, which can also be regarded as the center point of the first object;
  • O2 is the center point of the GNSS device of the target object, which can also be regarded as the center point of the target object;
  • OO2 is the direction in which the center point of the AR device points to the center point of the GNSS device of the target object;
  • O1B points in the running direction of the first object;
  • point O is on the straight line O1B;
  • ON and O1N1 are lines in the true north direction;
  • the line segment O1O is a known distance, namely the distance between the center point of the GNSS device of the first object and the center point of the AR device. Then, the expected horizontal deflection angle of the target object can be calculated as follows.
  • Based on the position coordinates of the center point O1 of the GNSS device of the first object, the distance O1O between the center point O1 and the center point O of the AR device, and the heading angle ∠NOB of the first object, the position coordinates (X_O, Y_O) of the center point O of the AR device of the first object are calculated.
  • The following formula 3 can be used to calculate the coordinates (X_O, Y_O) of point O:
  • The position coordinates of the center point O1 of the GNSS device of the first object can be obtained by reading the GNSS device of the first object; the heading angle ∠NOB of the first object can be obtained from the GNSS device of the first object; the distance O1O between the center point of the GNSS device of the first object and the center point of the AR device is a known distance.
  • ∠NOA is the horizontal installation angle of the camera of the AR device of the first object relative to the true north direction, which can be obtained through the electronic compass of the AR device of the first object.
  • Then, the expected horizontal deflection angle ∠O2OA of the target object in the AR image is calculated.
  • the aforementioned method can be used to calculate the expected horizontal deflection angle of the target object in the AR image of the first object.
  • If the size of the first object is relatively large, the distance O1O between the center point of the GNSS device of the first object and the center point of the AR device is relatively large, so ∠O2O1N1 differs considerably from ∠O2ON. If the size of the first object is relatively small, or the GNSS device of the first object is relatively close to the AR device, or the GNSS device is even set inside the AR device, then the distance O1O is relatively small and the difference between ∠O2O1N1 and ∠O2ON is small. Within the allowable error range, the above calculation method can then be simplified to formula 5:
  • the above method of calculating the expected horizontal deflection angle of the target object can also be further extended to: calculating the expected horizontal deflection angle of any surrounding object.
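  • A planar sketch of this geometry is given below; it works in a local flat frame with x pointing east and y pointing north, which is an approximation of the latitude/longitude formulas 3 to 5 of the application rather than a reproduction of them.

```python
import math

def bearing_deg(from_xy, to_xy):
    """Clockwise bearing from true north of the vector from_xy -> to_xy, in a local
    planar frame with x pointing east and y pointing north."""
    dx, dy = to_xy[0] - from_xy[0], to_xy[1] - from_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def expected_horizontal_deflection(o1_xy, heading_nob_deg, dist_o1o, camera_noa_deg, o2_xy):
    """Step-405 geometry: O1 is the GNSS centre of the first object, angle NOB its
    heading, O1O the known offset to the AR device centre O (O lies on the line O1B),
    angle NOA the camera installation angle from true north, O2 the target centre."""
    # Formula-3 analogue: place O at distance O1O from O1 along the heading direction.
    ox = o1_xy[0] + dist_o1o * math.sin(math.radians(heading_nob_deg))
    oy = o1_xy[1] + dist_o1o * math.cos(math.radians(heading_nob_deg))
    # Angle O2OA: bearing of O -> O2 minus the camera installation bearing, in [-180, 180).
    angle = bearing_deg((ox, oy), o2_xy) - camera_noa_deg
    return (angle + 180.0) % 360.0 - 180.0

def expected_horizontal_deflection_simplified(o1_xy, camera_noa_deg, o2_xy):
    """Formula-5 analogue: when O1O is negligibly small, measure the bearing from O1."""
    angle = bearing_deg(o1_xy, o2_xy) - camera_noa_deg
    return (angle + 180.0) % 360.0 - 180.0
```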
  • Next, the implementation of step 409 will be described.
  • the relevant image recognition method can be used to recognize the object image in the AR screen.
  • the object image can be marked with graphics.
  • the object image can be marked with a rectangle.
  • Figure 6a is a schematic diagram of the light connection of the camera imaging.
  • K is the shooting point of the camera
  • ABCD is the video image plane shot and imaged by the camera at point K; it can also be understood as the AR picture finally displayed on the screen of the AR device.
  • point P is the center point of the AR picture
  • the straight line PL is the horizontal line of the AR picture. Point P corresponds to the exact center of the camera shooting direction, so all images are presented on the AR picture ABCD.
  • an object image in the AR picture ABCD is identified through image recognition technology.
  • Assuming that the object image is marked by a rectangular area abcd, P1 is the center point of the rectangular area abcd; a straight line P1M perpendicular to the straight line PL is drawn through the point P1, meeting PL at M. It can be concluded that the absolute horizontal deflection angle of the rectangular area abcd in the AR image is ∠PKM, which is the absolute horizontal deflection angle of the object image in the AR image.
  • PM is the horizontal coordinate x of the object image represented by the rectangular area abcd. Since the display resolution of the AR image is known, image recognition scans the video buffer data to obtain the start and end pixel index numbers of the rectangular area abcd in the horizontal direction of the AR image, from which the pixel index number of the center point P1 in the horizontal direction of the AR image can be calculated. P is the center point of the AR image, and the pixel index number of P in the horizontal direction of the AR image is already known. Then, the number of pixels that the line segment PM occupies in the horizontal direction can be calculated by the image recognition technology.
  • the absolute horizontal deflection angle ⁇ PKM of the object image in the AR image can be calculated by the following formula 6:
  • y is the angle value of ⁇ PKM
  • L is the total number of pixels in the horizontal direction of the AR image
  • m is the horizontal viewing angle range of the camera
  • x is the number of pixels that the line segment PM occupies in the horizontal direction.
  • Using the above method, the absolute horizontal deflection angle of each object image in the AR image can be obtained.
  • the absolute horizontal deflection angle of the point on the AR screen gradually increases from the center to the left and right sides, and the maximum value is m/2.
  • In Figure 6b, m/2 is 65 degrees as an example.
  • The object images recognized in the AR screen are framed by rectangular boxes, and the absolute horizontal deflection angle of each object image is as shown in Figure 6b: the closer an object image is to the center point, the smaller its absolute horizontal deflection angle; conversely, the farther away, the larger the angle.
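  • Formula 6 itself appears as an equation image in the original and is not reproduced above. The sketch below uses the standard pinhole relation between pixel offset and angle, which also reaches m/2 at the edge of the picture; treating it as formula 6 is an assumption of this sketch.

```python
import math

def absolute_horizontal_deflection(x_px, total_px_l, fov_m_deg):
    """Angle PKM for an object image whose centre lies x_px pixels (horizontally)
    from the centre point P of the AR picture; total_px_l is L and fov_m_deg is m.
    Pinhole relation assumed: y = atan((2x/L) * tan(m/2))."""
    half_fov = math.radians(fov_m_deg / 2.0)
    return math.degrees(math.atan((2.0 * x_px / total_px_l) * math.tan(half_fov)))

# Example with L = 1920 pixels and m = 130 degrees (m/2 = 65 degrees as in Figure 6b):
# absolute_horizontal_deflection(960, 1920, 130.0) -> 65.0
# For small angles this is close to the linear mapping y ~= m * x / L.
```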
  • In the embodiment shown in Figure 4, the V2X device of the first object performs the calculation of the expected horizontal deflection angle.
  • the AR device may also perform the above calculation. Based on this, the present application provides the embodiment shown in FIG. 7.
  • the early warning method of the target object shown in FIG. 7 may include:
  • Step 701 The V2X device of the first object performs V2X communication with surrounding objects, and obtains operating data of the surrounding objects respectively.
  • Step 702 The V2X device of the first object sends the operating data of the surrounding objects to the AR device of the first object.
  • Step 703 The AR device of the first object receives operating data of the surrounding objects.
  • Step 704 The AR device of the first object acquires the operating data of the first object.
  • the AR device of the first object can obtain the operating data of the first object from the GNSS device of the first object.
  • The order of execution between step 704 and steps 701 to 703 is not limited.
  • Step 705 The AR device of the first object selects a peripheral object associated with the first object from the peripheral objects according to the operating data of the first object and the operating data of the surrounding objects.
• Step 706 The AR device of the first object separately calculates the threat degree of each associated object to the first object according to the operating data of the first object and the operating data of the associated objects, and selects P associated objects in descending order of threat degree as target objects; P is a natural number.
  • Step 707 The AR device of the first object calculates the expected horizontal deflection angle of each target object.
  • Step 708 to step 711 are the same as step 408 to step 411, and will not be repeated here.
  • each step in the embodiment shown in FIG. 7 can refer to the corresponding description in the embodiment shown in FIG.
  • the target object is identified from the AR image by the horizontal deflection angle.
• When S ≥ 2 object images have deflection-angle differences with the target object that all meet the first difference requirement, the target object cannot be uniquely matched by the horizontal deflection angle alone, and further processing is needed.
  • Fig. 8a is a flow chart of another embodiment of an early warning method for a target object of this application. As shown in Fig. 8a, the method may include:
  • Step 801 to step 803 are the same as step 401 to step 403, and will not be described in detail.
  • Step 804 The V2X device of the first object calculates the distance between each associated object and the first object and the expected horizontal deflection angle of each associated object according to the operating data of the first object and the operating data of the associated object.
• the distance and the expected horizontal deflection angle corresponding to each associated object can be stored in the form of (D_n, θ_n), where D_n is the distance between the associated object and the first object, and θ_n is the expected horizontal deflection angle of the associated object.
• Step 805 The V2X device of the first object separately calculates the threat degree of each associated object to the first object according to the operating data of the first object and the operating data of the associated objects, and selects P associated objects in descending order of threat degree as target objects; P is a natural number.
• the execution order of step 804 and step 805 is not limited.
  • Step 806 The V2X device of the first object sends the target object, the distance between each associated object and the first object, and the expected horizontal deflection angle of each associated object to the AR device of the first object.
  • Step 807 The AR device of the first object receives the above-mentioned data sent by the V2X device of the first object.
• Step 808 The AR device of the first object sequentially determines whether the expected horizontal deflection angle of each target object is within the horizontal viewing angle range of the AR device, filters out the target objects whose expected horizontal deflection angle is not within the horizontal viewing angle range, and then executes step 809.
  • Step 809 The AR device of the first object recognizes the object image in the AR image, and calculates the absolute horizontal deflection angle of each object image in the AR image.
• Step 810 The AR device of the first object compares the expected horizontal deflection angle of each target object with the absolute horizontal deflection angles to obtain the object image corresponding to each target object in the AR screen, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the corresponding object image meets the first preset difference requirement.
• If a target object corresponds to at least two object images in the AR screen, steps 811 to 814 are performed.
• Step 811 For a target object whose corresponding object images in the AR screen are at least two, the AR device of the first object selects, from the associated objects, the associated objects whose difference between their expected horizontal deflection angle and the expected horizontal deflection angle of the target object meets the second difference requirement.
  • the associated object whose difference between the expected horizontal deflection angle and the expected horizontal deflection angle of the target object meets the second difference requirement is simply referred to as the first associated object.
• Step 812 According to the distance between each first associated object and the first object, and the distance between the target object and the first object, the AR device of the first object sorts the first associated objects and the target object by distance from small to large, to obtain the sort position of the target object.
• Step 813 Obtain the coordinate values of the object images corresponding to the target object in the Y-axis direction in the AR screen, and sort these object images by coordinate value from small to large.
  • the coordinate value of the object image on the Y axis is essentially the distance between the center point of the object image and the bottom edge of the AR screen.
  • the lower left corner of the AR screen can be used as the origin of the two-dimensional rectangular coordinate system, or the center point of the bottom edge of the AR screen can be used as the origin of the two-dimensional rectangular coordinate system, and the bottom side can be used as the horizontal axis.
• a straight line perpendicular to the horizontal axis and passing through the origin is used as the vertical axis, thereby establishing a two-dimensional rectangular coordinate system, and then the vertical coordinate of the center point of the object image is calculated.
  • the vertical coordinate of the object image can also be obtained directly by calculating the pixels occupied by the vertical line segment from the center point of the object image to the bottom edge of the AR image, as shown in the dashed line in Figure 8b.
• the execution order of steps 811 to 812 and step 813 is not limited.
• Step 814 If the sort position of the target object is greater than the number of object images corresponding to the target object, filter out the target object; otherwise, select the object image whose sort order is the same as the sort position of the target object as the object image corresponding to the target object.
• if the sort position of the target object is greater than the number of object images corresponding to the target object, it means that the target object does not have a corresponding object image in the AR screen, and the target object is filtered out.
  • Performing steps 811 to 814 for the target objects corresponding to at least two object images in step 810 can make each target object correspond to only one object image in the AR screen.
  • Step 815 is the same as step 411, and will not be repeated here.
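• To make the flow of steps 810 to 814 easier to follow, the following Python sketch illustrates the matching and disambiguation logic under stated assumptions. The data layout (dictionaries holding distance, expected angle, absolute angle and Y coordinate), the function name, and the numeric difference thresholds are hypothetical and do not reproduce the exact implementation of this application.

```python
def match_target_to_image(target, associated, images, first_diff=3.0, second_diff=3.0):
    """Pick the object image corresponding to one target object, or return None to filter it out.

    target:     dict with 'distance' (to the first object) and 'expected_angle'
    associated: list of dicts with 'distance' and 'expected_angle' for the associated objects
    images:     list of dicts with 'absolute_angle' and 'y' (distance in pixels from the
                image center point to the bottom edge of the AR picture)
    first_diff / second_diff: assumed values for the first and second difference requirements
    """
    # Step 810: keep the object images whose absolute angle is close enough to the expected angle.
    candidates = [img for img in images
                  if abs(img['absolute_angle'] - target['expected_angle']) <= first_diff]
    if not candidates:
        return None
    if len(candidates) == 1:
        return candidates[0]

    # Step 811: associated objects whose expected angle is close to that of the target object.
    nearby = [a for a in associated
              if abs(a['expected_angle'] - target['expected_angle']) <= second_diff]

    # Step 812: sort those objects together with the target by distance from small to large,
    # and record the target's position in that order (1-based).
    ordered = sorted(nearby + [target], key=lambda o: o['distance'])
    rank = ordered.index(target) + 1

    # Step 813: sort the candidate images by their Y coordinate from small to large.
    candidates.sort(key=lambda img: img['y'])

    # Step 814: a rank beyond the number of candidate images means the target has no image
    # in the picture; otherwise take the image at the same position in the sorted list.
    if rank > len(candidates):
        return None
    return candidates[rank - 1]
```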
• In the foregoing embodiment, the V2X device of the first object performs the calculation of the expected horizontal deflection angle.
  • the AR device may also perform the above calculation.
  • the present application provides the embodiment shown in FIG. 9, and the early warning method for the target object shown in FIG. 9 may include:
  • Step 901 The V2X device of the first object performs V2X communication with surrounding objects, and obtains the operating data of the surrounding objects respectively.
  • Step 902 The V2X device of the first object sends the operating data of the surrounding objects to the AR device of the first object.
  • Step 903 The AR device of the first object receives operating data of the surrounding objects.
  • Step 904 The AR device of the first object acquires the operating data of the first object.
  • the AR device of the first object can obtain the operating data of the first object from the GNSS device of the first object.
• the execution order of step 904 and steps 901 to 903 is not limited.
  • Step 905 The AR device of the first object selects a peripheral object associated with the first object from the peripheral objects according to the operating data of the first object and the operating data of the surrounding objects.
  • Step 906 The AR device of the first object separately calculates the distance between each associated object and the first object and the expected horizontal deflection angle of each associated object according to the operating data of the first object and the operating data of the associated object.
• Step 907 The AR device of the first object separately calculates the threat degree of each associated object to the first object according to the operating data of the first object and the operating data of the associated objects, and selects P associated objects in descending order of threat degree as target objects; P is a natural number.
• the execution order of step 906 and step 907 is not limited.
  • Step 908 to step 915 are the same as step 808 to step 815, and will not be repeated here.
  • each step of the embodiment shown in FIG. 9 can refer to the corresponding description in the embodiment shown in FIG. 4, FIG. 7, and FIG.
• the target object early warning method of the present application enables at least one target object that threatens the first object, acquired based on V2X communication, to be effectively recognized in the AR picture taken by the camera and displayed with a warning.
  • the AR screen combined with GUI means makes the target object more intuitively displayed in the AR screen, which greatly improves the interaction effect between the AR screen and the user;
• the warning method of the target object in this application does not need to rely on radar, navigation or other sensors; it can be implemented using only V2X communication equipment, a camera and AI computing equipment, which greatly simplifies the integration link and cost from V2X to AR and improves calculation efficiency;
• the early warning method of the target object of this application can be used not only for vehicles, but also for other objects on the road capable of V2X communication, for example to warn of threats to pedestrians.
• the first object, target object, surrounding objects, etc. described in the embodiments of this application require identification information to indicate the above-mentioned objects in the electronic device implementing the technical solutions of the embodiments of this application; for example, when performing the above-mentioned step of acquiring at least one target object threatening the first object, what is obtained is not the target object itself, but the identification information of the target object.
  • FIG. 10 is a diagram of a possible vehicle system architecture to which the early warning method of the target object of this application can be applied.
  • the system mainly includes: AR equipment, communication information processing system, body bus, LTE-V antenna, GPS data processing module and GPS antenna; among them,
  • the AR device is used to complete the target recognition of the camera video image content and synthesize enhanced information, and display the AR picture on the screen.
  • the AR device can communicate with the communication information processing system, and receive the distance between the surrounding objects and the vehicle and the expected horizontal deflection angle of the surrounding objects determined by the communication information processing system.
  • the AR device can be a car machine or a mobile phone, etc.
• the body bus is used to connect other electronic control units (ECU, Electronic Control Unit) of the vehicle, such as the transmission, wheels, brake sensors, etc., and various driving status data of the vehicle, such as speed, steering wheel angle, etc., can be obtained through the body bus.
  • the GPS data processing module is used to obtain GPS data through a GPS antenna, analyze the received GPS data, and obtain the longitude and latitude position information and heading information of the vehicle.
  • GPS antenna and GPS data processing module constitute a GPS device.
  • AR equipment includes:
  • the video data decoding unit is used to obtain video data from the camera of the AR device, decode the video data and output it to the screen drive controller and the video logic unit processing module;
  • the screen drive controller is used to complete the coding and output of screen data signals and synchronization signals, supply power to the screen, and drive the screen to display normally.
  • the GUI image controller is used to superimpose vector signals and on-screen information display (OSD, On-Screen Display) on AR images.
• the video logic unit processing module is used to perform artificial intelligence (AI, Artificial Intelligence) algorithm recognition on the image data content in the AR screen using an image recognition algorithm, recognize the object images in the AR screen, and control the GUI image controller to superimpose marking image information onto the object image of the target object in the AR screen.
  • the communication information processing system mainly includes:
  • the vehicle operation data analysis module is used to complete the reception and analysis of the vehicle data.
• the LTE-V data packet application data algorithm processing module is used to combine the GPS data of the vehicle, the body data of the vehicle, and the V2X message data of surrounding objects such as other vehicles received through the LTE-V data packet network transport layer protocol stack processing module, determine the positional relationship between the surrounding objects and the vehicle, use the target classification algorithm and the predicted path algorithm to determine the target object, and calculate the distance and expected horizontal deflection angle of the surrounding objects relative to the vehicle.
• the LTE-V data packet network transport layer protocol stack processing module is used to complete the identification and extraction of the network transport layer protocol stack header of the LTE-V data packet, and send the application layer data in the data packet, such as the BSM message, to the LTE-V data packet application data algorithm processing module.
  • LTE-V radio frequency integrated circuit is used to complete the collection of LTE-V radio frequency signals.
  • the LTE-V data access layer processing module is used to complete the processing of the 3GPP protocol stack of the LTE-V access layer, so that the air interface data can be correctly identified.
  • the Ethernet-driven communication interface is used to send the relevant information calculated by the LTE-V data packet application data algorithm processing module to the AR device.
• This interface can also be another communication interface, including but not limited to a universal asynchronous receiver/transmitter (UART, Universal Asynchronous Receiver/Transmitter) interface, a serial peripheral interface (SPI, Serial Peripheral Interface), an inter-integrated circuit bus (I2C, Inter-Integrated Circuit), WIFI (Wireless-Fidelity), a universal serial bus (USB, Universal Serial Bus), peripheral component interconnect express (PCIE, Peripheral Component Interconnect Express), a secure digital input and output card (SDIO, Secure Digital Input and Output), etc.
  • the physical components involved in the embodiments of this application may include: RFIC chips supporting LTE-V communication data, GPS positioning chips, data transmission bus controllers, computing processors, memory storage, flash memory, image processors, video viewfinders, Electronic compass, etc., can also include: WIFI chip, Ethernet controller, etc.
  • the image processor may be a DA/AD converter; the video viewfinder may be a camera, and the data transmission bus controller may be based on Ethernet or Controller Area Network (CAN, Controller Area Network).
  • the communication information processing system exists in a vehicle-mounted TBOX device, the AR device is a vehicle-mounted vehicle entertainment system, and the vehicle-mounted TBOX communicates with the AR device via Ethernet or USB or WIFI;
• the communication information processing system has the LTE-V communication function and is responsible for V2X communication with surrounding objects on the road, such as vehicles; based on this, the distance and expected horizontal deflection angle of surrounding objects relative to the vehicle are calculated, the target object is determined, and the above information is sent to the AR device;
• the AR device recognizes object images in the picture taken by the camera, calculates the absolute horizontal deflection angle of each object image, matches the expected horizontal deflection angle with the absolute horizontal deflection angle, finds the object image of the target object in the AR screen, marks the object image and prompts relevant warning information, such as "collision in X seconds".
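• As an illustration of this division of work between the communication information processing system and the AR device, the sketch below shows a hypothetical per-target message and a helper that builds the warning caption overlaid next to the matched object image. The field names, units and caption text are assumptions and are not defined by this application.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TargetInfo:
    """Per-target data computed on the communication information processing system (e.g. TBOX) side."""
    object_id: str                               # identification information of the target object
    distance_m: float                            # distance between the target object and the vehicle
    expected_angle_deg: float                    # expected horizontal deflection angle
    time_to_collision_s: Optional[float] = None  # optional, if the system estimates it


def warning_caption(info: TargetInfo) -> str:
    """Build the warning text shown next to the matched object image on the AR screen."""
    if info.time_to_collision_s is not None:
        return f"collision in {info.time_to_collision_s:.0f} seconds"
    return f"target at {info.distance_m:.0f} m"


# Example: the AR device receives this record over Ethernet/USB/WIFI, matches the object image
# by deflection angle, then draws a rectangle plus warning_caption(info) through the GUI layer.
info = TargetInfo(object_id="veh-042", distance_m=35.0, expected_angle_deg=-12.5, time_to_collision_s=3.0)
print(warning_caption(info))
```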
  • FIG. 11 is a structural diagram of an embodiment of an early warning device for a target object of this application. As shown in FIG. 11, the device 110 may include:
• the expected deflection angle obtaining unit 111 is configured to obtain at least one target object that is threatening to the first object, and obtain the expected horizontal deflection angle of the target object; the expected horizontal deflection angle of the target object is the predicted value of the absolute horizontal deflection angle of the image of the target object in the AR picture; the AR picture is the AR picture displayed by the AR device of the first object;
  • the absolute deflection angle acquiring unit 112 is configured to acquire the absolute horizontal deflection angle of each object image in the AR screen;
• the absolute horizontal deflection angle of the object image is the horizontal angle between the direction in which the shooting point of the AR screen points to the center point of the object image and the direction in which the shooting point points to the center point of the AR screen;
• the image obtaining unit 113 is configured to obtain, according to the expected horizontal deflection angle of the target object obtained by the expected deflection angle obtaining unit and the absolute horizontal deflection angle of each object image obtained by the absolute deflection angle obtaining unit, the object image corresponding to the target object in the AR screen, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets the first difference requirement;
  • the display unit 114 is configured to perform an early warning display on the object image corresponding to the target object obtained by the image obtaining unit in the AR screen.
  • the absolute deflection angle obtaining unit 112 may be specifically used for:
  • the absolute horizontal deflection angle of the object image is calculated by the following formula:
  • y is the angle value of the absolute horizontal deflection angle of the object image
  • L is the total number of pixels in the horizontal direction of the AR image
  • m is the horizontal viewing angle range of the camera of the AR device
• x is the number of pixels that the line segment between the center point of the object image and the center point of the AR image occupies in the horizontal direction.
  • the expected deflection angle obtaining unit 111 may be specifically used for:
• the target object and the expected horizontal deflection angle of the target object are determined by the V2X device according to the operating data of the target object and the operating data of the first object, and the V2X device is set on the first object.
  • the expected deflection angle obtaining unit 111 may include:
  • the calculation subunit is configured to obtain at least one target object from the surrounding objects according to the operating data of the surrounding objects and the operating data of the first object, and calculate the expected horizontal deflection angle of the target object.
  • calculation subunit may be specifically used for:
• For each target object, according to the operating data of the target object and the operating data of the first object, calculate the included angle between the direction in which the center point of the AR device points to the center point of the target object and the installation direction of the camera of the AR device, and use the included angle as the expected horizontal deflection angle of the target object.
  • calculation subunit may be specifically used for:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
• (X_O, Y_O) is the position coordinate of the center point O of the AR device
• ∠NOB is the heading angle of the first object
• O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
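• The formula itself appears in the drawings and is not reproduced in this text. As an illustrative sketch only, the following Python code demonstrates the kind of computation these variables imply: the AR device center O is obtained by offsetting the GNSS center O1 along the heading by the distance O1O (the offset direction is an assumption), the bearing from O to the target center O2 is computed from planar coordinates, and the camera installation angle ∠NOA relative to true north is subtracted to obtain an estimate of ∠O2OA. The function name, parameters, and planar-coordinate approximation are assumptions, not the formula of this application.

```python
import math


def expected_horizontal_deflection(o1_xy, heading_deg, o1_to_o_m, camera_north_deg, target_xy):
    """Illustrative estimate of the expected horizontal deflection angle, in degrees.

    o1_xy:            planar (east, north) coordinates of the GNSS device center O1, in meters
    heading_deg:      ∠NOB, heading angle of the first object, clockwise from true north
    o1_to_o_m:        O1O, distance between the GNSS center O1 and the AR device center O
    camera_north_deg: ∠NOA, horizontal installation angle of the camera relative to true north
    target_xy:        planar (east, north) coordinates of the target object's center O2
    """
    # Center point O of the AR device: move from O1 along the heading by the distance O1O.
    hdg = math.radians(heading_deg)
    o_x = o1_xy[0] + o1_to_o_m * math.sin(hdg)
    o_y = o1_xy[1] + o1_to_o_m * math.cos(hdg)

    # Bearing from O to the target center O2, measured clockwise from true north.
    bearing = math.degrees(math.atan2(target_xy[0] - o_x, target_xy[1] - o_y))

    # Signed difference between that bearing and the camera installation direction,
    # normalized to the range [-180, 180).
    return (bearing - camera_north_deg + 180.0) % 360.0 - 180.0
```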
  • calculation subunit may be specifically used for:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
  • calculation subunit may be specifically used for:
  • calculation subunit may be specifically used for:
• According to the operating data of the peripheral objects and the operating data of the first object, select peripheral objects associated with the first object from the peripheral objects; acquire at least one target object from the peripheral objects associated with the first object.
• the expected deflection angle obtaining unit 111 may also be used to obtain the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object; the expected horizontal deflection angle of the surrounding object is the predicted value of the absolute horizontal deflection angle of the image of the surrounding object in the AR picture;
• the image obtaining unit 113 can also be used to: select surrounding objects whose difference between their expected horizontal deflection angle and the expected horizontal deflection angle of the target object meets the second difference requirement; according to the distance between the selected surrounding objects and the first object and the distance between the target object and the first object, sort the selected surrounding objects and the target object by distance from small to large to obtain the sort position of the target object; obtain the coordinate values of the object images corresponding to the target object in the Y-axis direction in the AR screen, and sort the object images corresponding to the target object by coordinate value from small to large; select the object image whose sort order is the same as the sort position of the target object as the object image corresponding to the target object.
  • the expected deflection angle obtaining unit 111 may be specifically used for:
• Obtain, from the V2X device, the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object.
• the distance and the expected horizontal deflection angle are determined by the V2X device according to the operating data of the surrounding object and the operating data of the first object.
  • the expected deflection angle obtaining unit 111 may be specifically used for:
  • the distance between the peripheral object and the first object and the expected horizontal deflection angle of the peripheral object are calculated.
  • the expected deflection angle obtaining unit 111 may be specifically used for:
• For each of the peripheral objects, according to the operating data of the peripheral object and the operating data of the first object, calculate the included angle between the direction in which the center point of the AR device points to the center point of the peripheral object and the installation direction of the camera of the AR device, and regard the included angle as the expected horizontal deflection angle of the peripheral object.
  • the expected deflection angle obtaining unit 111 may be specifically configured to calculate the included angle according to the following formula:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
• (X_O, Y_O) is the position coordinate of the center point O of the AR device
• ∠NOB is the heading angle of the first object
• O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  • the expected deflection angle obtaining unit 111 may be specifically configured to calculate the included angle according to the following formula:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
• the expected deflection angle acquisition unit 111 may be specifically configured to: select a peripheral object associated with the first object from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object; calculate the distance between the peripheral object associated with the first object and the first object, and the expected horizontal deflection angle of the peripheral object associated with the first object.
• the device shown in FIG. 11 obtains at least one target object that is threatening to the first object, obtains the object image corresponding to the target object in the AR screen through deflection angle comparison, and performs an early warning display of the object image, so as to provide users with a wider range of hazard warnings and improve user experience.
  • the apparatus 110 provided in the embodiment shown in FIG. 11 can be used to implement the technical solutions of the method embodiments shown in FIGS. 3 to 9 of this application. For its implementation principles and technical effects, further reference may be made to related descriptions in the method embodiments.
• the division of the various units of the device shown in FIG. 11 is only a division of logical functions; in actual implementation, they may be fully or partially integrated into one physical entity, or may be physically separated.
  • these units can all be implemented in the form of software called by processing elements; they can also be implemented in the form of hardware; part of the units can also be implemented in the form of software called by the processing elements, and some of the units can be implemented in the form of hardware.
  • the expected deflection angle acquisition unit may be a separately established processing element, or it may be integrated in a certain chip of the electronic device.
  • the implementation of other units is similar.
  • all or part of these units can be integrated together or implemented independently.
  • each step of the above method or each of the above units can be completed by an integrated logic circuit of hardware in the processor element or instructions in the form of software.
• the above units may be one or more integrated circuits configured to implement the above methods, such as: one or more application specific integrated circuits (Application Specific Integrated Circuit; hereinafter referred to as ASIC), or one or more digital signal processors (Digital Signal Processor; hereinafter referred to as DSP), or one or more field programmable gate arrays (Field Programmable Gate Array; hereinafter referred to as FPGA), etc.
  • these units can be integrated together and implemented in the form of a System-On-a-Chip (hereinafter referred to as SOC).
• FIG. 12 is a schematic structural diagram of an embodiment of an electronic device of this application. As shown in FIG. 12, the above-mentioned electronic device may include: a touch screen; one or more processors; a memory; multiple application programs; and one or more computer programs.
• the above-mentioned touch screen may include the touch screen of an on-board computer (Mobile Data Center); the above-mentioned electronic device may be an electronic device (mobile phone), a smart screen, a drone, an intelligent connected vehicle (Intelligent Connected Vehicle; hereinafter referred to as ICV), a smart/intelligent car, or in-vehicle equipment and other devices.
  • the above-mentioned one or more computer programs are stored in the above-mentioned memory, and the above-mentioned one or more computer programs include instructions.
• When the above-mentioned instructions are executed by the above-mentioned device, the above-mentioned device is caused to perform the following steps:
• the expected horizontal deflection angle of the target object is the predicted value of the absolute horizontal deflection angle of the image of the target object in the AR screen; the AR picture is the AR picture displayed by the AR device of the first object;
  • the absolute horizontal deflection angle of each object image in the AR screen is the direction in which the shooting point of the AR screen points to the center point of the object image and the direction in which the shooting point points to the center point of the AR screen The included angle in the horizontal direction;
  • the object image corresponding to the target object in the AR screen is obtained.
• the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets the first difference requirement;
  • the object image corresponding to the target object is displayed for warning in the AR screen.
  • the step of obtaining the absolute horizontal deflection angle of each object image in the AR screen includes:
  • the absolute horizontal deflection angle of the object image is calculated by the following formula:
  • y is the angle value of the absolute horizontal deflection angle of the object image
  • L is the total number of pixels in the horizontal direction of the AR image
  • m is the horizontal viewing angle range of the camera of the AR device
• x is the number of pixels that the line segment between the center point of the object image and the center point of the AR image occupies in the horizontal direction.
  • the step of obtaining at least one target object that is threatening to the first object and obtaining the expected horizontal deflection angle of the target object includes:
• the target object and the expected horizontal deflection angle of the target object are determined by the V2X device according to the operating data of the target object and the operating data of the first object, and the V2X device is set on the first object.
  • the step of obtaining at least one target object that is threatening to the first object and obtaining the expected horizontal deflection angle of the target object includes:
  • At least one target object is acquired from the peripheral objects, and an expected horizontal deflection angle of the target object is calculated.
  • the step of causing the calculation of the expected horizontal deflection angle of the target object includes:
• For each target object, according to the operating data of the target object and the operating data of the first object, calculate the included angle between the direction in which the center point of the AR device points to the center point of the target object and the installation direction of the camera of the AR device, and use the included angle as the expected horizontal deflection angle of the target object.
• the step of calculating, based on the operating data of the target object and the operating data of the first object, the included angle between the direction in which the center point of the AR device points to the center point of the target object and the installation direction of the camera of the AR device includes:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
• (X_O, Y_O) is the position coordinate of the center point O of the AR device
• ∠NOB is the heading angle of the first object
• O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
• the step of calculating, based on the operating data of the target object and the operating data of the first object, the included angle between the direction in which the center point of the AR device points to the center point of the GNSS device in the target object and the installation direction of the camera of the AR device includes:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
• the step of obtaining at least one target object from the surrounding objects according to the operating data of the surrounding objects and the operating data of the first object includes:
  • a peripheral object associated with the first object is selected from the peripheral objects; correspondingly,
  • the acquiring at least one target object from the surrounding objects includes:
  • At least one target object is acquired from surrounding objects associated with the first object.
• Before the step of performing the early warning display on the object image corresponding to the target object in the AR screen, the following steps are also performed:
• the expected horizontal deflection angle of the surrounding object is the predicted value of the absolute horizontal deflection angle of the image of the surrounding object in the AR screen
• the selected surrounding objects and the target object are sorted by distance from small to large to obtain the sort position of the target object;
  • the object image with the same sorting order as the sorting order of the target object is selected as the object image corresponding to the target object.
  • the step of obtaining the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object includes:
• Obtain, from the V2X device, the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object.
• the distance and the expected horizontal deflection angle are determined by the V2X device according to the operating data of the surrounding object and the operating data of the first object.
  • the step of obtaining the distance between the surrounding object and the first object and the expected horizontal deflection angle of the surrounding object includes:
  • the distance between the peripheral object and the first object and the expected horizontal deflection angle of the peripheral object are calculated.
  • the step of causing the calculation of the expected horizontal deflection angle of the surrounding objects includes:
• For each of the peripheral objects, according to the operating data of the peripheral object and the operating data of the first object, calculate the included angle between the direction in which the center point of the AR device points to the center point of the peripheral object and the installation direction of the camera of the AR device, and regard the included angle as the expected horizontal deflection angle of the peripheral object.
• the step of calculating, based on the operating data of the surrounding object and the operating data of the first object, the included angle between the direction in which the center point of the AR device points to the center point of the GNSS device in the surrounding object and the installation direction of the camera of the AR device includes:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
• (X_O, Y_O) is the position coordinate of the center point O of the AR device
• ∠NOB is the heading angle of the first object
• O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
• the step of calculating the included angle between the direction in which the center point of the AR device points to the center point of the surrounding object and the installation direction of the camera of the AR device includes:
• ∠O2OA is the included angle
• ∠NOA is the horizontal installation angle of the camera relative to the true north direction
  • the step of calculating the distance between the peripheral object and the first object and the expected horizontal deflection angle of the peripheral object includes:
  • the electronic device shown in FIG. 12 may be a terminal device or a circuit device built in the aforementioned terminal device.
  • the device can be used to execute the functions/steps in the methods provided in the embodiments shown in FIG. 3 to FIG. 9 of the present application.
  • the electronic device 1200 may include a processor 1210, an external memory interface 1220, an internal memory 1221, a universal serial bus (USB) interface 1230, a charging management module 1240, a power management module 1241, a battery 1242, an antenna 1, and an antenna 2.
• a mobile communication module 1250, a wireless communication module 1260, an audio module 1270, a speaker 1270A, a receiver 1270B, a microphone 1270C, an earphone jack 1270D, a sensor module 1280, buttons 1290, a motor 1291, an indicator 1292, a camera 1293, a display screen 1294, and a subscriber identification module (subscriber identification module, SIM) card interface 1295, etc.
  • the sensor module 1280 can include pressure sensor 1280A, gyroscope sensor 1280B, air pressure sensor 1280C, magnetic sensor 1280D, acceleration sensor 1280E, distance sensor 1280F, proximity light sensor 1280G, fingerprint sensor 1280H, temperature sensor 1280J, touch sensor 1280K, ambient light Sensor 1280L, bone conduction sensor 1280M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 1200.
  • the electronic device 1200 may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1210 may include one or more processing units.
• the processor 1210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 1210 to store instructions and data.
  • the memory in the processor 1210 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 1210. If the processor 1210 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 1210 is reduced, and the efficiency of the system is improved.
  • the processor 1210 may include one or more interfaces.
• the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 1210 may include multiple sets of I2C buses.
  • the processor 1210 can couple the touch sensor 1280K, charger, flash, camera 1293, etc., respectively through different I2C bus interfaces.
  • the processor 1210 may couple the touch sensor 1280K through an I2C interface, so that the processor 1210 and the touch sensor 1280K communicate through the I2C bus interface to realize the touch function of the electronic device 1200.
  • the I2S interface can be used for audio communication.
  • the processor 1210 may include multiple sets of I2S buses.
  • the processor 1210 may be coupled with the audio module 1270 through an I2S bus to implement communication between the processor 1210 and the audio module 1270.
  • the audio module 1270 may transmit audio signals to the wireless communication module 1260 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 1270 and the wireless communication module 1260 may be coupled through a PCM bus interface.
  • the audio module 1270 may also transmit audio signals to the wireless communication module 1260 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 1210 and the wireless communication module 1260.
  • the processor 1210 communicates with the Bluetooth module in the wireless communication module 1260 through the UART interface to realize the Bluetooth function.
  • the audio module 1270 may transmit audio signals to the wireless communication module 1260 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 1210 with the display screen 1294, camera 1293 and other peripheral devices.
  • the MIPI interface includes a camera serial interface (camera serial interface, CSI), a display serial interface (display serial interface, DSI), and so on.
  • the processor 1210 and the camera 1293 communicate through a CSI interface to implement the shooting function of the electronic device 1200.
  • the processor 1210 and the display screen 1294 communicate through the DSI interface to realize the display function of the electronic device 1200.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 1210 with the camera 1293, the display screen 1294, the wireless communication module 1260, the audio module 1270, the sensor module 1280, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 1230 is an interface that complies with the USB standard and specifications, and specifically can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 1230 can be used to connect a charger to charge the electronic device 1200, and can also be used to transfer data between the electronic device 1200 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic description, and does not constitute a structural limitation of the electronic device 1200.
  • the electronic device 1200 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 1240 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 1240 may receive the charging input of the wired charger through the USB interface 1230.
  • the charging management module 1240 may receive a wireless charging input through the wireless charging coil of the electronic device 1200. While the charging management module 1240 charges the battery 1242, it can also supply power to the electronic device through the power management module 1241.
  • the power management module 1241 is used to connect the battery 1242, the charging management module 1240 and the processor 1210.
  • the power management module 1241 receives input from the battery 1242 and/or the charging management module 1240, and supplies power to the processor 1210, the internal memory 1221, the display screen 1294, the camera 1293, and the wireless communication module 1260.
  • the power management module 1241 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 1241 may also be provided in the processor 1210.
  • the power management module 1241 and the charging management module 1240 may also be provided in the same device.
  • the wireless communication function of the electronic device 1200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 1250, the wireless communication module 1260, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 1200 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 1250 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 1200.
  • the mobile communication module 1250 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on.
  • the mobile communication module 1250 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering and amplifying the received electromagnetic waves, and then transmitting them to the modem processor for demodulation.
  • the mobile communication module 1250 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 1250 may be provided in the processor 1210.
  • at least part of the functional modules of the mobile communication module 1250 and at least part of the modules of the processor 1210 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 1270A, a receiver 1270B, etc.), or displays an image or video through the display screen 1294.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 1210 and be provided in the same device as the mobile communication module 1250 or other functional modules.
  • the wireless communication module 1260 can provide applications on the electronic device 1200 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), bluetooth (BT), and global navigation satellites. System (global navigation satellite system, GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 1260 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1260 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1210.
  • the wireless communication module 1260 may also receive the signal to be sent from the processor 1210, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the electronic device 1200 is coupled with the mobile communication module 1250, and the antenna 2 is coupled with the wireless communication module 1260, so that the electronic device 1200 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
• the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or satellite-based augmentation systems (SBAS).
  • the electronic device 1200 implements a display function through a GPU, a display screen 1294, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 1294 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 1210 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 1294 is used to display images, videos, etc.
  • the display screen 1294 includes a display panel.
• the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 1200 may include one or N display screens 1294, and N is a positive integer greater than one.
  • the electronic device 1200 can realize a shooting function through an ISP, a camera 1293, a video codec, a GPU, a display screen 1294, and an application processor.
  • the ISP is used to process the data fed back from the camera 1293. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 1293.
  • the camera 1293 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 1200 may include 1 or N cameras 1293, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 1200 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 1200 may support one or more video codecs. In this way, the electronic device 1200 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural-network (NN) computing processor.
• Through the NPU, applications such as intelligent cognition of the electronic device 1200 can be realized, for example image recognition, face recognition, speech recognition, text understanding, and so on.
  • the external memory interface 1220 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 1200.
  • the external memory card communicates with the processor 1210 through the external memory interface 1220 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 1221 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 1221 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, an application program (such as a sound playback function, an image playback function, etc.) required by at least one function, and the like.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 1200.
  • the internal memory 1221 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the processor 1210 executes various functional applications and data processing of the electronic device 1200 by running instructions stored in the internal memory 1221 and/or instructions stored in a memory provided in the processor.
  • the electronic device 1200 can implement audio functions through an audio module 1270, a speaker 1270A, a receiver 1270B, a microphone 1270C, a headphone interface 1270D, and an application processor. For example, music playback, recording, etc.
  • the audio module 1270 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 1270 can also be used to encode and decode audio signals.
  • the audio module 1270 may be provided in the processor 1210, or part of the functional modules of the audio module 1270 may be provided in the processor 1210.
  • the speaker 1270A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 1200 can listen to music through the speaker 1270A, or listen to a hands-free call.
  • the receiver 1270B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • When the electronic device 1200 answers a call or a voice message, the voice can be heard by bringing the receiver 1270B close to the human ear.
  • the microphone 1270C, also called a "mic" or "sound transmitter", is used to convert sound signals into electrical signals.
  • When making a call or sending a voice message, the user can speak with the mouth close to the microphone 1270C to input the sound signal into the microphone 1270C.
  • the electronic device 1200 may be provided with at least one microphone 1270C.
  • the electronic device 1200 may be provided with two microphones 1270C, which can implement noise reduction functions in addition to collecting sound signals.
  • the electronic device 1200 can also be equipped with three, four or more microphones 1270C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the earphone interface 1270D is used to connect wired earphones.
  • the earphone interface 1270D can be the USB interface 1230, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 1280A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 1280A may be disposed on the display screen 1294.
  • There are many types of pressure sensors 1280A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 1280A, the capacitance between the electrodes changes.
  • the electronic device 1200 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 1294, the electronic device 1200 detects the intensity of the touch operation according to the pressure sensor 1280A.
  • the electronic device 1200 may also calculate the touched position according to the detection signal of the pressure sensor 1280A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold is applied to the short message application icon, the command to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
  • the gyro sensor 1280B may be used to determine the movement posture of the electronic device 1200.
  • In some embodiments, the angular velocity of the electronic device 1200 around three axes (i.e., the x, y, and z axes) can be determined through the gyro sensor 1280B.
  • the gyro sensor 1280B can be used for shooting anti-shake.
  • the gyroscope sensor 1280B detects the shake angle of the electronic device 1200, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 1200 through reverse movement to achieve anti-shake.
  • the gyro sensor 1280B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 1280C is used to measure air pressure.
  • the electronic device 1200 calculates the altitude based on the air pressure value measured by the air pressure sensor 1280C to assist positioning and navigation.
  • the magnetic sensor 1280D includes a Hall sensor.
  • the electronic device 1200 can use the magnetic sensor 1280D to detect the opening and closing of the flip holster.
  • When the electronic device 1200 is a flip phone, the electronic device 1200 can detect the opening and closing of the flip cover according to the magnetic sensor 1280D.
  • Then, according to the detected opening or closing state of the holster or of the flip cover, features such as automatic unlocking upon opening the flip cover can be set.
  • the acceleration sensor 1280E can detect the magnitude of the acceleration of the electronic device 1200 in various directions (generally three axes). When the electronic device 1200 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and apply to applications such as horizontal and vertical screen switching, pedometers, and so on.
  • the distance sensor 1280F is used to measure distance. The electronic device 1200 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 1200 can use the distance sensor 1280F to measure the distance to achieve fast focusing.
  • the proximity light sensor 1280G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 1200 emits infrared light to the outside through the light emitting diode.
  • the electronic device 1200 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 1200. When insufficient reflected light is detected, the electronic device 1200 can determine that there is no object near the electronic device 1200.
  • the electronic device 1200 can use the proximity light sensor 1280G to detect that the user holds the electronic device 1200 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 1280G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 1280L is used to perceive the brightness of the ambient light.
  • the electronic device 1200 can adaptively adjust the brightness of the display screen 1294 according to the perceived brightness of the ambient light.
  • the ambient light sensor 1280L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 1280L can also cooperate with the proximity light sensor 1280G to detect whether the electronic device 1200 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 1280H is used to collect fingerprints.
  • the electronic device 1200 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 1280J is used to detect temperature.
  • the electronic device 1200 uses the temperature detected by the temperature sensor 1280J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 1280J exceeds a threshold, the electronic device 1200 reduces the performance of a processor located near the temperature sensor 1280J, so as to reduce power consumption and implement thermal protection.
  • In some other embodiments, when the temperature is lower than another threshold, the electronic device 1200 heats the battery 1242 to avoid an abnormal shutdown of the electronic device 1200 caused by low temperature.
  • In still other embodiments, when the temperature is lower than yet another threshold, the electronic device 1200 boosts the output voltage of the battery 1242 to avoid an abnormal shutdown caused by low temperature.
  • the touch sensor 1280K is also called a "touch device".
  • the touch sensor 1280K can be arranged on the display screen 1294, and the touch screen is composed of the touch sensor 1280K and the display screen 1294, which is also called a “touch screen”.
  • the touch sensor 1280K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 1294.
  • the touch sensor 1280K may also be disposed on the surface of the electronic device 1200, which is different from the position of the display screen 1294.
  • the bone conduction sensor 1280M can acquire vibration signals.
  • the bone conduction sensor 1280M can obtain the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 1280M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 1280M can also be arranged in the earphone, combined with the bone conduction earphone.
  • the audio module 1270 can parse out the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 1280M to realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 1280M, and realize the heart rate detection function.
  • the buttons 1290 include a power-on button, a volume button, and so on.
  • the button 1290 may be a mechanical button. It can also be a touch button.
  • the electronic device 1200 may receive key input, and generate key signal input related to user settings and function control of the electronic device 1200.
  • the motor 1291 can generate vibration prompts.
  • the motor 1291 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • For touch operations acting on different areas of the display screen 1294, the motor 1291 can also produce different vibration feedback effects.
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 1292 can be an indicator light, which can be used to indicate the charging status, power change, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 1295 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 1295 or pulled out from the SIM card interface 1295 to achieve contact and separation with the electronic device 1200.
  • the electronic device 1200 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 1295 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 1295 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 1295 can also be compatible with different types of SIM cards.
  • the SIM card interface 1295 can also be compatible with external memory cards.
  • the electronic device 1200 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 1200 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 1200 and cannot be separated from the electronic device 1200.
  • the electronic device 1200 shown in FIG. 12 can implement various processes of the methods provided in the embodiments shown in FIGS. 3 to 9 of this application.
  • the operations and/or functions of each module in the electronic device 1200 are respectively for implementing the corresponding processes in the foregoing method embodiments.
  • processor 1210 in the electronic device 1200 shown in FIG. 12 may be a system-on-chip SOC, and the processor 1210 may include a central processing unit (CPU), and may further include other types of processors. For example: Graphics Processing Unit (GPU), etc.
  • CPU central processing unit
  • GPU Graphics Processing Unit
  • each part of the processor or each processing unit inside the processor 1210 can cooperate to implement the preceding method flows, and the corresponding software program of each part of the processor or each processing unit can be stored in the internal memory 1221.
  • the device includes a storage medium and a central processing unit.
  • the storage medium may be a non-volatile storage medium.
  • a computer executable program is stored in the storage medium.
  • the central processing unit is connected to the non-volatile storage medium and executes the computer executable program to implement the methods provided by the embodiments shown in FIG. 3 to FIG. 9 of this application.
  • the processors involved may include, for example, a CPU, a DSP, or a microcontroller, and may also include a GPU, an embedded neural-network processing unit (Neural-network Processing Units; hereinafter referred to as NPU) and an image signal processor (Image Signal Processing; hereinafter referred to as ISP); they may also include necessary hardware accelerators or logic processing hardware circuits, such as an ASIC, or one or more integrated circuits used to control the execution of the technical solutions of this application.
  • the processor may have a function of operating one or more software programs, and the software programs may be stored in a storage medium.
  • the embodiments of the present application also provide a computer-readable storage medium that stores a computer program; when the computer program runs on a computer, the computer executes the methods provided by the embodiments shown in FIGS. 3 to 9 of the present application.
  • the embodiments of the present application also provide a computer program product.
  • the computer program product includes a computer program that, when running on a computer, causes the computer to execute the method provided in the embodiments shown in FIGS. 3 to 9 of the present application.
  • At least one refers to one or more
  • multiple refers to two or more.
  • And/or describes the association relationship of the associated objects, indicating that there can be three relationships, for example, A and/or B, which can mean the existence of A alone, A and B at the same time, and B alone. Among them, A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship.
  • “The following at least one item” and similar expressions refer to any combination of these items, including any combination of single items or plural items.
  • At least one of a, b, and c can represent: a, b, c, a and b, a and c, b and c, or a and b and c, where each of a, b, and c can be single or multiple.
  • If any function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of the present application essentially, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions used to make a computer device (which may be a personal computer, a server, or a network device, etc.) execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: U disk, mobile hard disk, read-only memory (Read-Only Memory; hereinafter referred to as ROM), random access memory (Random Access Memory; hereinafter referred to as RAM), magnetic disks or optical disks, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

一种目标物体的预警方法、装置和电子设备,方法包括:获取至少一个对第一物体具有威胁的目标物体,并获取目标物体的预期水平偏角(301);获取AR画面中各个物体图像的绝对水平偏角(302);根据目标物体的预期水平偏角、以及各个物体图像的绝对水平偏角,获得目标物体在AR画面中对应的物体图像(303);在AR画面中对目标物体对应的物体图像进行预警显示(304);提供更大范围内的危险预警,提升用户体验。

Description

目标物体的预警方法、装置和电子设备 技术领域
本申请涉及智能终端技术领域,特别涉及目标物体的预警方法、装置和电子设备。
背景技术
随着增强现实(AR,Augmented Reality)技术的进步,AR应用在导航场景特别是车载导航的场景中已成为现实。例如,图1a所示为AR实景导航,能够为用户提供更好的导航服务,图1b所示为智能车盒联合车机所支持的AR实景导航,为用户提供更好的驾驶导航体验,两者均大大方便了人们的生活。
在车载AR实景导航的场景下,目前还存在例如高级驾驶辅助系统(ADAS,Advanced Driving Assistance System)、移动数据中心(MDC,Mobile Data Center)等技术,结合雷达传感器,在AR实景导航画面中向用户预警前方的碰撞威胁,增加汽车驾驶的舒适性和安全性。主要原理是:通过车辆上设置的雷达传感器感知前车与本车的距离,结合本车的运动情况如航向、速度等预测在预设的时长内本车是否会与前车发生碰撞,如果预测结果为可能发生碰撞,将AR实景导航画面中正前方的车辆作为威胁车辆显示出来。
但是,这种预警方法只能对正前方的车辆是否具有碰撞威胁做出预警,无法为用户提供更大范围内的危险预警,预警范围狭窄,用户体验差。
发明内容
本申请提供了一种目标物体的预警方法、装置和电子设备,能够为用户提供更大范围内的危险预警,提升用户体验。
第一方面,本申请实施例提供了一种目标物体的预警方法,包括:
获取至少一个对第一物体具有威胁的目标物体,并获取目标物体的预期水平偏角;目标物体的预期水平偏角是对目标物体在AR画面中图像的绝对水平偏角的预测值;AR画面是第一物体的AR设备展示的AR画面;
获取AR画面中各个物体图像的绝对水平偏角;物体图像的绝对水平偏角是AR画面的拍摄点指向物体图像中心点的方向与拍摄点指向AR画面中心点的方向在水平方向上的夹角;
根据目标物体的预期水平偏角、以及各个物体图像的绝对水平偏角,获得目标物体在AR画面中对应的物体图像,目标物体的预期水平偏角与目标物体对应的物体图像的绝对水平偏角之间的差值满足第一差值要求;
在AR画面中对目标物体对应的物体图像进行预警显示。
该方法获取至少一个对第一物体具有威胁的目标物体,通过偏角比对,得到目标物体在AR画面中对应的物体图像,对物体图像进行预警显示,从而能够为用户提供更大范围内的危险预警,提升用户体验。
在一种可能的实现方式中,获取AR画面中各个物体图像的绝对水平偏角,包括:
从AR画面中识别出物体图像;
对于每个物体图像,通过以下公式计算该物体图像的绝对水平偏角:
y=arctan(2x*tan(m/2)/L)
其中,y为物体图像的绝对水平偏角的角度值,L为AR画面水平方向的总像素数,m为AR设备的摄像头的水平视野角度范围,x为物体图像的中心点与AR画面的中心点之间的线段在水平方向上占用的像素个数。
在一种可能的实现方式中,获取至少一个对第一物体具有威胁的目标物体,并获取目标物体的预期水平偏角,包括:
从V2X设备获取目标物体以及目标物体的预期水平偏角,目标物体以及目标物体的预期水平偏角由V2X设备根据目标物体的运行数据以及第一物体的运行数据确定,V2X设备设置于第一物体上。
在一种可能的实现方式中,获取至少一个对第一物体具有威胁的目标物体,并获取目标物体的预期水平偏角,包括:
从V2X设备获取周边物体的运行数据,并且,从GNSS设备获取第一物体的运行数据;V2X设备以及GNSS设备设置于第一物体上;
根据周边物体的运行数据、以及第一物体的运行数据,从周边物体中获取至少一个目标物体,计算目标物体的预期水平偏角。
在一种可能的实现方式中,计算目标物体的预期水平偏角,包括:
对于每个目标物体,根据该目标物体的运行数据、以及第一物体的运行数据,计算AR设备中心点指向该目标物体的中心点的方向与AR设备摄像头的安装方向之间的夹角,将该夹角作为该目标物体的预期水平偏角。
在一种可能的实现方式中,根据该目标物体的运行数据、以及第一物体的运行数据,计算AR设备中心点指向该目标物体的中心点的方向与AR设备摄像头的安装方向之间的夹角,包括:
根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000001
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000002
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000003
为目标物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为AR设备的中心点O的位置坐标,∠NOB为第一物体的航向角,O 1O为第一物体的GNSS设备中心点O 1与AR设备的中心点O之间的距离。
在一种可能的实现方式中,根据该目标物体的运行数据、以及第一物体的运行数据,计算AR设备的中心点指向该目标物体中GNSS设备中心点的方向与AR设备摄像头的安装方向之间的夹角,包括:
根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000004
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000005
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000006
为目标物体的GNSS设备中心点O 2的位置坐标。
在一种可能的实现方式中,根据周边物体的运行数据、以及第一物体的运行数据,从周边物体中获取至少一个目标物体,包括:
根据周边物体的运行数据、以及第一物体的运行数据,计算每个周边物体与第一物体按照运行数据运行时发生碰撞的时长;
按照时长的从小到大的顺序从周边物体中获取至少一个目标物体。
在一种可能的实现方式中,从周边物体中获取至少一个目标物体之前,还包括:
根据周边物体的运行数据、以及第一物体的运行数据,从周边物体中选择与第一物体关联的周边物体;相应的,
从周边物体中获取至少一个目标物体,包括:
从与第一物体关联的周边物体中获取至少一个目标物体。
在一种可能的实现方式中,对于一个目标物体,如果获得的该目标物体在AR画面中对应的物体图像为至少两个,在AR画面中对目标物体对应的物体图像进行预警显示之前,还包括:
获取周边物体与第一物体之间的距离、以及周边物体的预期水平偏角;周边物体的预期水平偏角是对周边物体在AR画面中图像的绝对水平偏角的预测值;
相应的,获得目标物体在AR画面中对应的物体图像与进行预警显示 之间,还包括:
选择预期水平偏角与该目标物体的预期水平偏角之间的差值满足第二差值要求的周边物体;
根据选择的周边物体与第一物体之间的距离、以及该目标物体与第一物体之间的距离,对选择的周边物体和该目标物体按照距离从小到大进行排序,得到该目标物体的排序位次;
获取该目标物体对应的物体图像在AR画面中Y轴方向上的坐标值,对该目标物体对应的物体图像按照坐标值从小到大进行排序;
选择排序位次与该目标物体的排序位次相同的物体图像,作为该目标物体对应的物体图像。
在一种可能的实现方式中,获取周边物体与第一物体之间的距离、以及周边物体的预期水平偏角,包括:
从V2X设备获取周边物体与第一物体之间的距离、以及周边物体的预期水平偏角，距离以及预期水平偏角由V2X设备根据目标物体的运行数据以及第一物体的运行数据确定。
在一种可能的实现方式中,获取周边物体与第一物体之间的距离、以及周边物体的预期水平偏角,包括:
从V2X设备获取周边物体的运行数据,并且,从GNSS设备获取第一物体的运行数据;V2X设备以及GNSS设备设置于第一物体上;
根据周边物体的运行数据、以及第一物体的运行数据,计算周边物体与第一物体之间的距离、以及周边物体的预期水平偏角。
在一种可能的实现方式中,计算周边物体的预期水平偏角,包括:
对于每个周边物体,根据该周边物体的运行数据、以及第一物体的运行数据,计算AR设备中心点指向该周边物体中心点的方向与AR设备摄像头的安装方向之间的夹角,将该夹角作为该周边物体的预期水平偏角。
在一种可能的实现方式中,根据该周边物体的运行数据、以及第一物体的运行数据,计算AR设备的中心点指向该周边物体中GNSS设备中心点的方向与AR设备摄像头的安装方向之间的夹角,包括:
根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000007
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000008
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000009
为周边物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为AR设备的中心点O的位置坐标,∠NOB为第一物体的航向角,O 1O为第一物体的GNSS设备中心点O 1与AR设备的中心点O之间的距离。
在一种可能的实现方式中,根据该周边物体的运行数据、以及第一物体的运行数据,计算AR设备中心点指向该周边物体中心点的方向与AR设备摄像头的安装方向之间的夹角,包括:
根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000010
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000011
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000012
为周边物体的GNSS设备中心点O 2的位置坐标。
在一种可能的实现方式中,计算周边物体与第一物体之间的距离、以及周边物体的预期水平偏角之前,还包括:
根据周边物体的运行数据、以及第一物体的运行数据,从周边物体中选择与第一物体关联的周边物体;
相应的,计算周边物体与第一物体之间的距离、以及周边物体的预期水平偏角,包括:
计算与第一物体关联的周边物体与第一物体之间的距离、以及与第一物体关联的周边物体的预期水平偏角。
第二方面,本申请实施例提供一种目标物体的预警装置,包括:
预期偏角获取单元,用于获取至少一个对第一物体具有威胁的目标物体,并获取目标物体的预期水平偏角;目标物体的预期水平偏角是对目标物体在AR画面中图像的绝对水平偏角的预测值;AR画面是第一物体的AR设备展示的AR画面;
绝对偏角获取单元,用于获取AR画面中各个物体图像的绝对水平偏角;物体图像的绝对水平偏角是AR画面的拍摄点指向物体图像中心点的方向与拍摄点指向AR画面中心点的方向在水平方向上的夹角;
图像获得单元,用于根据预期偏角获取单元获取的目标物体的预期水平偏角、以及绝对偏角获取单元获取的各个物体图像的绝对水平偏角,获得目标物体在AR画面中对应的物体图像,目标物体的预期水平偏角与目标物体对应的物体图像的绝对水平偏角之间的差值满足第一差值要求;
显示单元,用于在AR画面中对图像获得单元获得的目标物体对应的物体图像进行预警显示。
在一种可能的实现方式中,绝对偏角获取单元具体用于:
从AR画面中识别出物体图像;
对于每个物体图像,通过以下公式计算该物体图像的绝对水平偏角:
y=arctan(2x*tan(m/2)/L)
其中,y为物体图像的绝对水平偏角的角度值,L为AR画面水平方向的总像素数,m为AR设备的摄像头的水平视野角度范围,x为物体图像的中心点与AR画面的中心点之间的线段在水平方向上占用的像素个数。
在一种可能的实现方式中,预期偏角获取单元具体用于:
从V2X设备获取目标物体以及目标物体的预期水平偏角,目标物体以及目标物体的预期水平偏角由V2X设备根据目标物体的运行数据以及第一物体的运行数据确定,V2X设备设置于第一物体上。
在一种可能的实现方式中,预期偏角获取单元包括:
数据获取子单元,用于从V2X设备获取周边物体的运行数据,并且,从GNSS设备获取第一物体的运行数据;V2X设备以及GNSS设备设置于第一物体上;
计算子单元,用于根据周边物体的运行数据、以及第一物体的运行数据,从周边物体中获取至少一个目标物体,计算目标物体的预期水平偏角。
在一种可能的实现方式中,计算子单元具体用于:
对于每个目标物体,根据该目标物体的运行数据、以及第一物体的运行数据,计算AR设备中心点指向该目标物体的中心点的方向与AR设备摄像头的安装方向之间的夹角,将该夹角作为该目标物体的预期水平偏角。
在一种可能的实现方式中,计算子单元具体用于:
根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000013
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000014
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000015
为目标物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为AR设备的中心点O的位置坐标,∠NOB为第一物体的航向角,O 1O为第一物体的GNSS设备中心点O 1与AR设备的中心点O之间的距离。
在一种可能的实现方式中,计算子单元具体用于:
根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000016
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000017
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000018
为目标物体的GNSS设备中心点O 2的位置坐标。
在一种可能的实现方式中,计算子单元具体用于:
根据周边物体的运行数据、以及第一物体的运行数据,计算每个周边物体与第一物体按照运行数据运行时发生碰撞的时长;
按照时长的从小到大的顺序从周边物体中获取至少一个目标物体。
在一种可能的实现方式中,计算子单元具体用于:
根据周边物体的运行数据、以及第一物体的运行数据,从周边物体中选择与第一物体关联的周边物体;从与第一物体关联的周边物体中获取至少一个目标物体。
在一种可能的实现方式中,预期偏角获取单元还用于:获取周边物体与第一物体之间的距离、以及周边物体的预期水平偏角;周边物体的预期水平偏角是对周边物体在AR画面中图像的绝对水平偏角的预测值;
相应的,图像获得单元还用于:选择预期水平偏角与该目标物体的预期水平偏角之间的差值满足第二差值要求的周边物体;根据选择的周边物体与第一物体之间的距离、以及该目标物体与第一物体之间的距离,对选择的周边物体和该目标物体按照距离从小到大进行排序,得到该目标物体的排序位次;获取该目标物体对应的物体图像在AR画面中Y轴方向上的坐标值,对该目标物体对应的物体图像按照坐标值从小到大进行排序;选择排序位次与该目标物体的排序位次相同的物体图像,作为该目标物体对应的物体图像。
在一种可能的实现方式中,预期偏角获取单元具体用于:
从V2X设备获取周边物体与第一物体之间的距离、以及周边物体的预期水平偏角，距离以及预期水平偏角由V2X设备根据目标物体的运行数据以及第一物体的运行数据确定。
在一种可能的实现方式中,预期偏角获取单元具体用于:
从V2X设备获取周边物体的运行数据,并且,从GNSS设备获取第一物体的运行数据;V2X设备以及GNSS设备设置于第一物体上;
根据周边物体的运行数据、以及第一物体的运行数据,计算周边物体与第一物体之间的距离、以及周边物体的预期水平偏角。
在一种可能的实现方式中,预期偏角获取单元具体用于:
对于每个周边物体,根据该周边物体的运行数据、以及第一物体的运行数据,计算AR设备中心点指向该周边物体中心点的方向与AR设备摄像头的安装方向之间的夹角,将该夹角作为该周边物体的预期水平偏角。
在一种可能的实现方式中,预期偏角获取单元具体用于:根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000019
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000020
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000021
为周边物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为AR设备的中心点O的位置坐标,∠NOB为第一物体的航向角,O 1O为第一物体的GNSS设备中心点O 1与AR设备的中心点O之间的距离。
在一种可能的实现方式中,预期偏角获取单元具体用于:根据以下公式计算夹角:
Figure PCTCN2020135113-appb-000022
其中,∠O 2OA为夹角,∠NOA为摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000023
为第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000024
为周边物体的GNSS设备中心点O 2的位置坐标。
在一种可能的实现方式中,预期偏角获取单元具体用于:根据周边物体的运行数据、以及第一物体的运行数据,从周边物体中选择与第一物体关联的周边物体;计算与第一物体关联的周边物体与第一物体之间的距离、以及与第一物体关联的周边物体的预期水平偏角。
第三方面,本申请实施例提供一种电子设备,包括:
显示屏;一个或多个处理器;存储器;多个应用程序;以及一个或多个计算机程序,其中一个或多个计算机程序被存储在存储器中,一个或多个计算机程序包括指令,当指令被设备执行时,使得设备执行以下步骤:
获取至少一个对第一物体具有威胁的目标物体,并获取目标物体的预期水平偏角;目标物体的预期水平偏角是对目标物体在AR画面中图像的绝对水平偏角的预测值;AR画面是第一物体的AR设备展示的AR画面;
获取AR画面中各个物体图像的绝对水平偏角;物体图像的绝对水平偏角是AR画面的拍摄点指向物体图像中心点的方向与拍摄点指向AR画面中心点的方向在水平方向上的夹角;
根据目标物体的预期水平偏角、以及各个物体图像的绝对水平偏角,获得目标物体在AR画面中对应的物体图像,目标物体的预期水平偏角与目标物体对应的物体图像的绝对水平偏角之间的差值满足第一差值要求;
在AR画面中对目标物体对应的物体图像进行预警显示。
第四方面,本申请实施例提供一种计算机程序,当计算机程序被计算机执行时,用于执行第一方面的方法。
在一种可能的设计中,第四方面中的程序可以全部或者部分存储在与处理器封装在一起的存储介质上,也可以部分或者全部存储在不与处理器封装在一起的存储器上。
附图说明
为了更清楚地说明本发明实施例的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其它的附图。
图1a为现有技术AR实景导航示意图;
图1b为现有技术智能车盒联合车机所支持的AR实景导航示意图;
图2a为雷达感知模型示意图;
图2b为现有技术所产生的问题示意图;
图3为本申请目标物体的预警方法一个实施例的流程图;
图4为本申请目标物体的预警方法另一个实施例的流程图;
图4a为本申请目标物体与摄像头视野角度范围的关系示意图;
图4b和图4c为本申请预警显示方式示例图;
图4d为本申请周边物体与第一物体的位置关系示例图;
图4e为本申请周边物体与第一物体之间方向偏角示意图;
图5为本申请第一物体和目标物体的俯视图;
图6a为本申请摄像头成像的光线连接示意图;
图6b为本申请AR画面中识别出的物体图像以及绝对水平偏角关系示意图;
图7为本申请目标物体的预警方法又一个实施例的流程图;
图8a为本申请目标物体的预警方法又一个实施例的流程图;
图8b为本申请物体图像在Y轴方向上的坐标值的示例图;
图9为本申请目标物体的预警方法又一个实施例的流程图;
图10为本申请所述方法适用的一种可能的系统结构图;
图11为本申请目标物体的预警装置一个实施例的结构图;
图12为本申请电子设备一个实施例的结构示意图。
具体实施方式
本申请的实施方式部分使用的术语仅用于对本申请的具体实施例进行解释,而非旨在限定本申请。
申请人分析发现:现有的实现方案中,ADAS、MDC等技术结合雷达传感器,在AR实景导航画面中向用户预警前方的碰撞威胁,但是,由于现有的雷达传感器具备典型的方向属性,即面向不同的方向性能差异较大,参见图2a所示的雷达感知模型,一般感知前方车辆的雷达传感器和感知侧方向车辆的雷达传感器的感知距离性能差异较大,如果周边车辆不是处于本车的正前方,而是其他方位,则雷达传感器对周边车辆的感知能力将会大大下降,因此,目前仅能预测正前方的车辆是否为威胁车辆。如图2b所示,如果周边车辆中有多辆车与本车有碰撞威胁,现有技术无法精确感知最有威胁车辆并在AR实景导航画面中标注出来进行预警。
现有通过雷达传感器感知威胁车辆的方法中,雷达传感器只能感知周边车辆的客观属性例如位置等,对于周边车辆的主观属性如转向灯状态、刹车状态、故障状态等等均无法感知,解决这一问题的方法之一在于:获取周边车辆更多的客观和主观属性。
车与万物的互联(V2X,Vehicle to everything),也称为车辆与万物的基于用于车辆通信的长期演进通信(LTE-V,Long term evolution-vehicle)或专用短消息通信(DSRC,Dedicate short range communication)的通信互联,是未来智能交通运输系统的关键技术。V2X是车辆与车辆的基于LTE-V或DSRC的通信互联(V2V,Vehicle to Vehicle)和车对基础设施(V2I) 信息交换技术等的统称。V2X使得车辆能够与可能影响车辆的任何实体之间进行双向信息传输,例如使得车辆与行人、车辆与车辆、车辆与基站、基站与基站之间能够通信,从而获得车辆的运行信息、实时路况、道路信息、行人信息等一系列信息,提高驾驶安全性、减少拥堵、提高交通效率、提供车载娱乐信息等。因此,V2X通信相对于雷达传感器能够获取更多的周边车辆甚至行人等物体的客观和主观属性。
基于以上分析,本申请提出了一种目标物体的预警方法、装置和电子设备,将V2X技术与AR技术相结合,在AR画面中为用户提供更大范围内的危险预警,提升用户体验。
需要说明的是,本申请的方法不仅可以适用于车辆中支持AR实景导航的AR设备中,还可以适用于例如图1a所示的支持AR实景导航的电子设备,或者,还可以适用于:需要进行AR实景显示、且需要在AR画面中对威胁第一物体的目标物体进行预警的AR设备中,例如:某一设置有AR设备的机器人为所述第一物体,机器人能够使用AR设备进行周围环境的AR实景显示,且AR设备需要对周围的物体可能会对机器人造成的碰撞在AR画面中进行预警显示。
图3为本申请目标物体的预警方法一个实施例的流程图,如图3所示,上述方法可以包括:
步骤301:获取至少一个对第一物体具有威胁的目标物体,并获取所述目标物体的预期水平偏角;所述目标物体的预期水平偏角是对所述目标物体在AR画面中图像的绝对水平偏角的预测值;所述AR画面是所述第一物体的AR设备展示的AR画面;
在执行本申请实施例所述方法的设备中,一般可以通过不同的物体ID来标识第一物体以及目标物体,所述物体ID的具体实现本申请实施例不限定,只要能够唯一标识不同物体即可。
步骤302:获取所述AR画面中各个物体图像的绝对水平偏角;
步骤303:根据所述目标物体的预期水平偏角、以及各个物体图像的绝对水平偏角,获得所述目标物体在所述AR画面中对应的物体图像,所述目标物体的预期水平偏角与所述目标物体对应的物体图像的绝对水平偏角之间的差值满足第一差值要求;
步骤304:在所述AR画面中对所述目标物体对应的物体图像进行预警显示。
图3所示的方法,获取至少一个对第一物体具有威胁的目标物体,通过偏角比对,得到目标物体在AR画面中对应的物体图像,对物体图像进行预警显示,从而能够为用户提供更大范围内的危险预警,提升用户体验。
图4为本申请目标物体的预警方法另一个实施例的流程图。
本申请实施例所适用的场景中,第一物体中设置有AR设备,用于为用户提供AR画面,所述AR画面为设置于第一物体中的摄像头所拍摄的第一物体某一周边区域的AR画面,所述AR画面可以为实景导航画面或者其他非导航场景下的AR画面;第一物体中还设置有V2X设备,用于与 第一物体的周边物体进行V2X通信,所述周边物体是具有V2X通信能力的物体。第一物体可以为:车辆、机器人、行人等物体;第一物体可以处于运动状态,也可以处于静止状态;周边物体可以为具有V2X通信能力的车辆、行人、机器人、或者自行车等物体,每个周边物体可以处于运动状态,也可以处于静止状态。
如图4所示,该方法可以包括:
步骤401:第一物体的V2X设备与周边物体进行V2X通信,分别获取所述周边物体的运行数据。
其中,物体的运行数据可以包括但不限于:物体的运行速度、和/或运行方向、和/或位置。
其中,周边物体可以通过物体ID来标识。
其中,周边物体中可以设置有能够进行V2X通信的电子设备例如V2X设备等。
其中,第一物体的V2X设备与周边物体的电子设备进行V2X通信时可以通过LTE-V或者DSRC进行通信,本申请并不限制。
一般的,V2X设备可以通过基础安全消息(BSM,Basic Safety Message)消息进行自身所属物体的运行数据的广播,BSM消息中可以包括但不限于:物体标识、所属物体的运行速度、和/或运行方向、和/或位置、和/或加速度、和/或预测路径、和/或历史路径、和/或车辆事件等。因此,本步骤中第一物体的V2X设备也可以通过BSM消息获取周边物体的运行数据。BSM消息中的物体标识一般为发送BSM消息的物体的标识。
其中,物体的位置可以通过经纬度来表示。如果物体为车辆,运行速度可以为车辆的行驶速度,运行方向可以为车辆的航向角,车辆的航向角是车辆的运行方向与正北方向之间的夹角。
步骤402:第一物体的V2X设备获取第一物体的运行数据。
一般的,第一物体的V2X设备可以从第一物体的全球导航卫星系统(GNSS,Global Navigation Satellite System)设备中获取第一物体的运行数据。第一物体的GNSS设备可以设置于AR设备中、或者设置于第一物体中、或者设置于V2X设备中,本申请不限制。在一种可能的实现方式中,第一物体如果为车辆、机器人等,第一物体的GNSS设备可以设置于第一物体的中心点。
GNSS是对北斗系统、全球定位系统(GPS,Global Positioning System)、格洛纳斯(GLONASS)系统、伽利略卫星导航系统(Galileo satellite navigation system)等单个卫星导航定位系统的统一称谓,也可指代他们的增强型系统,又指代所有上述卫星导航定位系统及其增强型系统的相加混合体。也即是说:GNSS是以人造卫星作为导航台的星级无线电导航系统。
步骤401和步骤402之间的执行顺序不限制。
步骤403:第一物体的V2X设备根据第一物体的运行数据、以及周边物体的运行数据,从周边物体中选择与第一物体关联的周边物体。
为了便于描述,以下将选择出的与第一物体关联的周边物体称为:关联物体。
本步骤为可选步骤。
步骤404:第一物体的V2X设备根据第一物体的运行数据、以及关联物体的运行数据分别计算各个关联物体对第一物体的威胁程度,按照威胁程度从高到低的顺序选择P个关联物体作为目标物体;P是自然数。
其中,P的数值本申请并不限制,但是,一般来说,只需要对较少数量的目标物体进行预警即可,例如1个目标物体或者2个目标物体,否则,将失去预警的意义。
在一种可能的实现方式中,P可以为1,也即后续只对最具威胁的周边物体进行后续的预警。
步骤405:第一物体的V2X设备计算每个所述目标物体的预期水平偏角。
其中,目标物体的预期水平偏角是:对目标物体在AR画面中图像的绝对水平偏角的预测值。AR画面是指:第一物体的AR设备展示的AR画面。
物体图像在AR画面中的绝对水平偏角是:AR设备拍摄点指向物体图像中心点的方向与拍摄点指向AR画面中心点的方向在水平方向上的夹角。
步骤406:第一物体的V2X设备将所述目标物体的预期水平偏角发送给第一物体的AR设备。
在一种可能的实现方式中,V2X设备可以发送:目标物体的标识以及预期水平偏角。
步骤407:第一物体的AR设备接收所述目标物体的预期水平偏角。
步骤408:第一物体的AR设备依次判断每个目标物体的预期水平偏角是否在AR设备的水平视野角度范围内,过滤掉预期水平偏角不在水平视野角度范围内的目标物体,之后执行步骤409。
本步骤为可选步骤。在一种可能的实现方式中,本步骤也可以由第一物体的V2X设备执行,相应的,第一物体的V2X设备将预期水平偏角在AR设备的水平视野角度范围内的目标物体及其预期水平偏角发送给第一物体的AR设备。
通过本步骤的执行,可以过滤掉不在AR设备的水平视野角度范围内的目标物体,减少后续步骤中不必要的数据处理消耗。因为AR设备的摄像头是具有一定的视野角度范围的,而本申请所要做的是:从AR画面中找出对第一物体具有威胁的目标物体的物体图像,因此,如果目标物体并不在AR设备的摄像头的水平视野角度范围内,那么AR画面中不会出现该目标物体的物体图像,后续步骤无需执行。参见图4a所示,显然物体1和物体2均在摄像头的视野角度范围内,但是物体3并不在摄像头的视野角度范围内,因此,即便物体3是对第一物体具有威胁的目标物体,经过本步骤处理后,物体3将被过滤掉,无需对物体3进行后续步骤的处理。
由于预期水平偏角是对目标物体的图像在AR画面中的绝对水平偏角的预测值,因此,预期水平偏角应小于等于m/2,m为AR设备的摄像头的水平视野角度范围。
步骤409:第一物体的AR设备识别出AR画面中的物体图像,计算每个物体图像在AR画面中的绝对水平偏角。
步骤410:第一物体的AR设备将每个所述目标物体的预期水平偏角与所述绝对水平偏角进行比对,获得每个所述目标物体在AR画面中对应的物体图像,所述目标物体的预期水平偏角与其对应的物体图像的绝对水平偏角之间的差值满足第一差值要求。
通过预期水平偏角与绝对水平偏角的比对,就可以找到目标物体在AR画面中对应的物体图像,也即是在AR画面中找到了对第一物体具有威胁的物体,也即需要被提醒的物体。
其中,关于预期水平偏角与绝对水平偏角的具体差值要求的精度可以在实际应用中自主设定,本申请并不限制。
在一种可能的实现方式中，针对每个目标物体的预期水平偏角，可以将该预期水平偏角依次与每个物体图像的绝对水平偏角进行比对，判断两者的差值是否满足第一差值要求，如果满足，则将该物体图像作为该目标物体对应的物体图像。
步骤411:第一物体的AR设备在所述AR画面中对每个所述目标物体对应的物体图像进行预警显示。
其中，所述预警显示可以通过屏幕显示（OSD，On Screen Display）等图形用户界面（GUI，Graphical User Interface）手段实现。预警显示的方法可以包括但不限于：为物体图像设置特殊的显示颜色、使用方框等特殊方式框出物体图像、对物体图像进行闪烁显示、在物体图像上显示特殊字符如“警告”“warning”“X秒后发生碰撞”等，只要能够引起用户注意到该物体图像，起到危险预警效果即可。
参见图4b和图4c所示，可以对物体图像进行如图所示的预警显示。图4b和图4c仅为示例，并不用以限制本申请预警显示的可能实现方式。
以下,对步骤403的实现进行说明。
在实际应用中,与第一物体进行V2X通信的周边物体可能很多,这些周边物体中只有一部分甚至一小部分物体可能会对第一物体产生威胁,很多周边物体并不会对第一物体产生威胁;而且,后续处理中是对在AR画面中具有图像且对第一物体产生威胁的周边物体进行预警,而AR设备的摄像头是具有视野角度范围的,很多周边物体并不在摄像头的视野角度范围内,也即是说很多周边物体并不会出现在AR画面中,AR画面中不会出现所述很多周边物体的图像。因此,对所述很多周边物体进行后续步骤的处理并没有必要。因此,可以先对周边物体进行筛选,从中挑选出可能与第一物体存在潜在风险的周边物体,也即与第一物体关联的周边物体,之后再对选择出来的周边物体进行后续处理,从而降低了本申请所述方法的数据处理量。
在一种可能的实现方式中,可以通过目标分类算法来过滤出与第一物体关联的周边物体,过滤掉与第一物体无关联的周边物体。具体的,可以根据第一物体和周边物体的运行方向以及经纬度过滤出周边物体中与第一物体关联的物体,也即关联物体。具体实现过程说明如下:
参见图4d所示,将周边物体与第一物体的位置关系,按照9宫格位置模型进行分类,得到周边物体相对第一物体的8种位置关系,具体为正前方、右前方、左前方、正左方、正右方、后左方、后右方、正后方,共8个方向。在图4d中以第一物体为车辆为例,但是图4d中的第一物体并不限于车辆,可以适用于任意第一物体。
参见图4e所示,以第一物体和周边物体为车辆为例,但是图4e的应用并不限于车辆,而是可以适用任一周边物体和第一物体,假设A点为周边物体的中心点,B点为第一物体的中心点,AC垂直于穿过B点的正东方向的直线,交点为C点,AB与BC之间的夹角为θ,θ=arctan(AC/BC)。
根据周边物体的经纬度、第一物体的经纬度和第一物体的运行方向计算周边物体相对第一物体的运行方向的实际方向偏角θ 1=θ+θ 0。其中，θ 0为第一物体当前运行方向，可以从第一物体的GNSS设备获取。
A点的经纬度即为周边物体的经纬度,假设为(x 1,y 1),周边物体的经纬度(x 1,y 1)属于周边物体的运行数据,可以从该周边物体发送的BSM消息中获取;B点的经纬度即为第一物体的经纬度,假设为(x 0,y 0),可以从第一物体中的GNSS设备获取。
在A点、B点经纬度可知的情况下，AC=y 1-y 0，BC=x 1-x 0。因此，θ=arctan(AC/BC)与θ 1=θ+θ 0均为可以计算得到的数据。
基于以上计算结果:
根据角度θ,可以确定周边物体相对于第一物体的位置关系,具体参见下表1所示:
表1
Figure PCTCN2020135113-appb-000026
选择方法具体可以为:
如果周边物体与第一物体相向运行，位置关系为正左方或正右方或左后方或右后方或正后方，且实际方向偏角大于正负80度，则判定周边物体为与第一物体无关的物体，滤除。
如果周边物体与第一物体同向行驶,位置关系为正前方或左前方或右前方,且(v 1*sinθ-v 0)>0,则周边物体为与第一物体无关的物体,滤除。
如果周边物体与第一物体同向行驶,位置关系为正后方或左后方或右后方,且(v 0-v 1*sinθ)>0,则周边物体为与第一物体无关的物体,滤除。
其中,v 0为第一物体当前的运行速度,v 1为周边物体当前的运行速度。
通过上述处理,滤除了与第一物体无关联的周边物体,选择出与第一物体关联的周边物体,得到关联物体。
在另一种可能的实现方式中,如果第一物体为静止的物体,可以首先过滤掉速度为0的周边物体,之后,再通过上述目标分类算法进行周边物体的进一步过滤。
以下,对步骤404的实现进行说明。
在一种可能的实现方式中,可以使用预测路径碰撞算法(Path Predication Method)计算关联物体对第一物体的威胁程度,进一步的,该算法还可以计算任一周边物体对第一物体的威胁程度。以下,假设第二物体为任一关联物体。需要说明的是,如果没有步骤403,第二物体可以是任一周边物体。预测路径碰撞算法的主要原理在于:
根据第一物体的经纬度、速度、航向角、以及预测曲率R 0,得到时长t后第一物体的预测经纬度与时长t的表达式,表达式如以下公式1所示;
Figure PCTCN2020135113-appb-000027
其中,x 0t表示第一物体在时长t后的预测经度,y 0t表示第一物体在时长t后的预测纬度,x 0表示第一物体当前的经度,y 0表示第一物体当前的纬度,v 0表示第一物体的当前速度,t表示时长,R 0表示第一物体的预测曲率,θ 0表示第一物体的航向角。第一物体的航向角一般是第一物体运行方向相对正北方向的夹角。预测曲率=速度/横摆角速度。
根据第二物体的运行数据以及预测曲率R i,得到时长t后第二物体的预测经纬度与时长t的表达式,表达式如以下公式2所示:
Figure PCTCN2020135113-appb-000028
其中,x it表示第二物体在时长t后的预测经度,y it表示第二物体在时长t后的预测纬度,x i表示第二物体当前的经度,y i表示第二物体当前的纬度,v i表示第二物体的当前速度,t表示时长,R i表示第二物体的预测曲率,θ i表示第二物体的航向角。
基于上述公式1,2,根据第一物体的预测经纬度(x 0t,y ot)与第二物体的预测经纬度(x it,y it)可以预测第一物体和第二物体之间的距离是否会小于预设距离阈值,该距离阈值可以根据物体尺寸来设置,并且可以获得第一物体和第二物体之间的距离小于预设距离阈值时对应的时长T,如果经过时长T第二物体与第一物体之间的距离会小于预设距离阈值表明经过时长T第二物体会与第一物体发生碰撞。
通过以上计算,可以获得每一个关联物体依照现在的运行数据是否会与第一物体发生碰撞,如果发生碰撞,关联物体与第一物体发生碰撞的时长T。
对预测会与第一物体发生碰撞的关联物体对应的时长T进行排序,按照时长T从小到大的顺序,关联物体对第一物体的威胁程度相应的从大到小;因此,按照时长T从小到大的顺序获取P个关联物体就是本步骤中想要获得的物体。
以上预测路径碰撞算法相关描述中,第一物体和第二物体中至少有一个物体应该为正在移动的物体,否则如果第一物体和第二物体均为静止的物体,以上的预测路径碰撞算法没有计算的意义,正在移动的物体可以包括但不限于能够进行V2X通信的车辆、行人、机器人、或者自行车等,静止的物体可以包括但不限于能够进行V2X通信的车辆、行人、机器人、或者自行车等。
以下,对步骤405的实现进行说明。
在一种可能的实现方式中,对于每个所述目标物体,可以根据该目标物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该目标物体的中心点的方向与所述AR设备摄像头的安装方向之间的夹角,将该夹角作为该目标物体的预期水平偏角。
进一步地,在计算周边物体的预期水平夹角时,也可以使用上述类似的方法,此时,对于每个所述周边物体,可以根据该周边物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该周边物体中心点的方向与所述AR设备摄像头的安装方向之间的夹角,将该夹角作为该周边物体的预期水平偏角。
以下通过图5进行举例说明。图5所示为第一物体和目标物体的俯视 示意图。点O为第一物体的AR设备的中心点,OA为AR设备的摄像头安装方向,OL与OR分别为摄像头视野的左边界线和右边界线,O 1为第一物体的GNSS设备中心点,也可以认为是第一物体的中心点,O 2为目标物体的GNSS设备中心点,也可以认为是目标物体的中心点,OO 2是AR设备的中心点指向目标物体的GNSS设备中心点的方向,O 1B指向第一物体的运行方向,O点在直线O 1B上,ON和O 1N 1为正北方向连线,线段O 1O为已知距离,即第一物体的GNSS设备中心点与AR设备的中心点之间的距离,那么,目标物体的预期水平偏角可以为∠O 2OA。∠O 2OA的一种可能的计算方法如下:
根据第一物体的GNSS设备中心点O 1的位置坐标(X O1,Y O1)、
第一物体的GNSS设备中心点O 1与AR设备的中心点O之间的距离O 1O、以及第一物体的航向角∠NOB计算得到第一物体的AR设备的中心点O的位置坐标(X O,Y O)。具体的,可以使用以下公式3计算O点的坐标(X O,Y O):
Figure PCTCN2020135113-appb-000030
其中，第一物体的GNSS设备中心点O 1的位置坐标(X O1,Y O1)
可以通过读取第一物体的GNSS设备获得,第一物体的航向角∠NOB可以通过第一物体的GNSS设备获得,第一物体的GNSS设备中心点与AR设备的中心点之间的距离O 1O为已知距离。
根据目标物体的GNSS设备中心点O 2的位置坐标(X O2,Y O2)、第一物体的AR设备的中心点O的位置坐标(X O,Y O)、以及第一物体的AR设备相对于正北方向的水平安装角度∠NOA，计算目标物体的预期水平偏角∠O 2OA。具体的，可以使用以下的公式4计算∠O 2OA：
Figure PCTCN2020135113-appb-000033
其中,∠NOA为第一物体的AR设备的摄像头相对于正北方向的水平安装角度,可以通过第一物体的AR设备的电子罗盘来获得。
这样就计算出了目标物体在AR画面中的预期水平偏角∠O 2OA。
对于步骤403中的每个目标物体,均可以使用上述方法计算得出目标物体在第一物体的AR画面中的预期水平偏角。
以上的方法基于第一物体的尺寸相对较大（例如第一物体为车辆），第一物体的GNSS设备中心点与AR设备的中心点之间的距离O 1O相对较大，从而∠O 2O 1N 1与∠O 2ON相差较大的情况。如果第一物体的尺寸相对较小，或者第一物体的GNSS设备与AR设备相距较近，甚至GNSS设备就设置在AR设备中，从而第一物体的GNSS设备中心点与AR设备的中心点之间的距离O 1O相对较小，∠O 2O 1N 1与∠O 2ON相差较小，在误差允许范围内，可以将以上的计算方法简化为公式5：
Figure PCTCN2020135113-appb-000034
也即：根据第一物体的GNSS设备中心点O 1的位置坐标(X O1,Y O1)、目标物体的GNSS设备中心点O 2的位置坐标(X O2,Y O2)、以及第一物体的AR设备相对于正北方向的水平安装角度∠NOA，计算目标物体的预期水平偏角∠O 2OA。
以上计算目标物体的预期水平偏角的方法也可以进一步扩展为:计算任一周边物体的预期水平偏角。
以下,对步骤409的实现进行说明。
本步骤中,可以使用相关的图像识别方法识别出AR画面中的物体图像。从AR画面中识别出物体图像时,可以用图形标示出物体图像,在一种可能的实现方式中,可以使用矩形标示出物体图像。
如图6a所示为摄像头成像的光线连接示意图,假设K为摄像头的拍摄点,ABCD是位于K点的摄像头所拍摄成像的视频图像平面,也可以理解为最终在AR设备的屏幕上显示的AR画面,P点是AR画面的中心点,直线PL为AR画面的水平线,P点正好对应摄像头拍摄方向的正中心,那么所有图像都呈现在AR画面ABCD上。
假设通过图像识别技术识别出AR画面ABCD中有一物体图像,该物体图像通过矩形区域标出,矩形区域为abcd,P1是矩形区域abcd的中心点,通过P1点做一条垂直于直线PL的直线P 1M,可以得出,矩形区域abcd在AR画面中的绝对水平偏角为∠PKM,也即物体图像在AR画面中的绝对水平偏角。
PM为矩形区域abcd所代表的物体图像的水平坐标x,由于AR图像显示的分辨率是已知的,图像识别技术通过扫描视频缓存区数据,可以得出矩形区域abcd在AR画面中水平方向的像素起始索引号和结束索引号,进一步可计算出中心点P1在AR画面的水平方向的像素索引号,而P为AR画面的中心点,P在AR画面的水平方向的像素索引号是已知的,则, 线段PM在水平方向一共占用多少个像素点可由图像识别技术计算出来。
据此,通过以下的公式6可以计算得到物体图像在AR画面中的绝对水平偏角∠PKM:
y=arctan(2x*tan(m/2)/L)                     (6)
其中,y为∠PKM的角度值,L为AR画面水平方向的总像素数,m为摄像头的水平视野角度范围,x为PM在水平方向的像素个数。
基于以上的方法,可以得出AR画面中每个物体图像在AR画面中的绝对水平偏角。
如图6b所示,AR画面上的点的绝对水平偏角从中心向左右两侧逐渐增大,最大值为m/2,图6b中以m/2为65度为例,AR画面中识别出的物体图像通过矩形框框出,每个物体图像的绝对水平偏角如图6b所示,距离中心点越近,绝对水平偏角越小,反之,绝对水平偏角越大。
图4所示的实施例中,由第一物体的V2X设备进行预测水平偏角的计算,在实际应用中,也可以由AR设备进行上述计算。基于此,本申请提供图7所示的实施例,图7所示目标物体的预警方法可以包括:
步骤701:第一物体的V2X设备与周边物体进行V2X通信,分别获取所述周边物体的运行数据。
步骤702:第一物体的V2X设备将所述周边物体的运行数据发送给第一物体的AR设备。
步骤703:第一物体的AR设备接收所述周边物体的运行数据。
步骤704:第一物体的AR设备获取第一物体的运行数据。
一般的,第一物体的AR设备可以从第一物体的GNSS设备获取第一物体的运行数据。
步骤704与步骤701~步骤703之间的执行顺序不限制。
步骤705:第一物体的AR设备根据第一物体的运行数据、以及周边物体的运行数据,从周边物体中选择与第一物体关联的周边物体。
步骤706:第一物体的AR设备根据第一物体的运行数据、以及关联物体的运行数据分别计算各个关联物体对第一物体的威胁程度,按照威胁程度从高到低的顺序选择P个关联物体作为目标物体;P是自然数。
步骤707:第一物体的AR设备计算每个所述目标物体的预期水平偏角。
步骤708~步骤711与步骤408~步骤411相同,这里不赘述。
图7所示实施例各步骤的实现可以参见图4所示实施例中的对应描述,区别仅在于部分步骤的执行主体由V2X设备变为AR设备。
在图4所示的实施例中,通过水平偏角从AR画面中识别出目标物体,但是,在实际应用中步骤410中可能存在一个目标物体的预测水平偏角与 S个物体图像的绝对水平偏角之间的差值均满足第一差值要求的情况,S≥2,以下,对这种情况下本申请的实现进行说明。
图8a为本申请目标物体的预警方法又一个实施例的流程图,如图8a所示,该方法可以包括:
步骤801~步骤803与步骤401~步骤403相同,不赘述。
步骤804:第一物体的V2X设备根据第一物体的运行数据、以及关联物体的运行数据分别计算每个关联物体与第一物体之间的距离、以及每个关联物体的预期水平偏角。
在一种可能的实现方式中,每个关联物体对应的距离以及预期水平偏角可以通过(D n,β n,)的方式保存,D n为关联物体与第一物体之间的距离,β n为关联物体的预期水平偏角。
步骤805:第一物体的V2X设备根据第一物体的运行数据、以及关联物体的运行数据分别计算各个关联物体对第一物体的威胁程度,按照威胁程度从高到低的顺序选择P个关联物体作为目标物体;P是自然数。
步骤804与步骤805之间的执行顺序不限制。
步骤806:第一物体的V2X设备将目标物体、每个关联物体与第一物体之间的距离、每个关联物体的预期水平偏角发送给第一物体的AR设备。
步骤807:第一物体的AR设备接收第一物体的V2X设备发送的上述数据。
步骤808:第一物体的AR设备依次判断每个目标物体的预期水平偏角是否在AR设备的水平视野角度范围内,过滤掉预期水平偏角不在水平视野角度范围内的目标物体,之后执行步骤809。
步骤809:第一物体的AR设备识别出AR画面中的物体图像,计算每个物体图像在AR画面中的绝对水平偏角。
步骤810:第一物体的AR设备将每个所述目标物体的预期水平偏角与所述绝对水平偏角进行比对,获得每个所述目标物体在AR画面中对应的物体图像,所述目标物体的预期水平偏角与其对应的物体图像的绝对水平偏角之间的差值满足第一预设差值要求。
本步骤中,如果存在目标物体对应的物体图像为至少2个,则执行步骤811~步骤814。
步骤811:对于在AR画面中对应的物体图像为至少2个的目标物体,第一物体的AR设备从关联物体中选择预期水平偏角与该目标物体的预期水平偏角之间的差值满足第二差值要求的关联物体。
以下,将预期水平偏角与该目标物体的预期水平偏角之间的差值满足第二差值要求的关联物体简称为第一关联物体。
步骤812:第一物体的AR设备根据第一关联物体与第一物体之间的距离、以及该目标物体与第一物体之间的距离,对所述第一关联物体和该目标物体按照距离从小到大进行排序,得到该目标物体的排序位次。
步骤813:获取该目标物体对应的物体图像在所述AR画面中Y轴方向上的坐标值,对所述物体图像按照坐标值从小到大进行排序。
物体图像在Y轴上的坐标值实质上为物体图像的中心点距离AR画面底边的距离。在一种可能的实现方式中，可以将AR画面的左下角作为二维直角坐标系的原点，或者，将AR画面底边的中心点作为二维直角坐标系的原点，底边作为横轴，与横轴垂直且穿过原点的直线作为纵轴，从而建立二维直角坐标系，进而计算物体图像的中心点的纵坐标。或者，也可以直接使用类似图6a相关描述中说明的，通过计算物体图像的中心点到AR画面底边的垂线段占用的像素点来得到物体图像的纵坐标，如图8b中虚线所示。在AR画面中，物体图像的纵坐标越小，距离第一物体越近，纵坐标越大，距离第一物体越远。
步骤811~步骤812与步骤813之间的执行顺序不限制。
步骤814:如果目标物体的排序位次大于目标物体对应的物体图像的个数,滤除该目标物体,否则,选择排序位次与该目标物体的排序位次相同的物体图像,作为该目标物体对应的物体图像。
在实际应用中,也可能出现目标物体的排序位次大于目标物体对应的物体图像个数的情况,此时,说明目标物体在AR画面中不存在对应的物体图像,则目标物体被滤除。
对于步骤810中对应至少2个物体图像的目标物体执行步骤811~步骤814,可以使得每个目标物体仅对应AR画面中的一个物体图像。
步骤815与步骤411相同,这里不赘述。
图8a所示实施例中各个步骤的具体实现可以参考图4和图7所示实施例中的对应描述,区别仅在于部分步骤的执行主体不同,这里不赘述。
图8a所示的实施例中,由第一物体的V2X设备进行预测水平偏角的计算,在实际应用中,也可以由AR设备进行上述计算。基于此,本申请提供图9所示的实施例,图9所示目标物体的预警方法可以包括:
步骤901:第一物体的V2X设备与周边物体进行V2X通信,分别获取所述周边物体的运行数据。
步骤902:第一物体的V2X设备将所述周边物体的运行数据发送给第一物体的AR设备。
步骤903:第一物体的AR设备接收所述周边物体的运行数据。
步骤904:第一物体的AR设备获取第一物体的运行数据。
一般的,第一物体的AR设备可以从第一物体的GNSS设备获取第一物体的运行数据。
步骤904与步骤901~步骤903之间的执行顺序不限制。
步骤905:第一物体的AR设备根据第一物体的运行数据、以及周边物体的运行数据,从周边物体中选择与第一物体关联的周边物体。
步骤906:第一物体的AR设备根据第一物体的运行数据、以及关联物体的运行数据分别计算每个关联物体与第一物体之间的距离、以及每个关联物体的预期水平偏角。
步骤907:第一物体的AR设备根据第一物体的运行数据、以及关联物体的运行数据分别计算各个关联物体对第一物体的威胁程度,按照威胁程度从高到低的顺序选择P个关联物体作为目标物体;P是自然数。
步骤906与步骤907的执行顺序不限制。
步骤908~步骤915与步骤808~步骤815相同,这里不赘述。
图9所示实施例各个步骤的实现可以参考图4、图7、图8a所示实施例中的对应描述,区别仅在于部分步骤的执行主体不同,这里不赘述。
基于以上实施例,本申请目标物体的预警方法使得基于V2X通信获取到的至少一个对第一物体具有威胁的目标物体,能够有效的在摄像头所拍摄的AR画面中识别到,并进行预警显示,通过AR画面结合GUI手段使得目标物体更直观的显示在AR画面中,使得AR画面与用户之间的交互效果大大提升;本申请目标物体的预警方法无需依赖雷达、导航或者其他传感器,仅通过V2X通信设备、摄像头以及AI计算设备即可实现,大大简化了V2X呈现到AR现实的设备综合链路和成本,并提升了计算效率;本申请目标物体的预警方法不仅可以针对车辆,对于路面上的其他能够进行V2X通信的物体亦可使用,如行人威胁等。
需要说明的是,本申请实施例描述中的第一物体、目标物体、周边物体等,在实施本申请实施例技术方案的电子设备中均需要通过标识信息来表示上述物体,例如在执行上述获取至少一个对第一物体具有威胁的目标物体的步骤时,获取到的并非目标物体本身,而是目标物体的标识信息。
可以理解的是，上述实施例中的部分或全部步骤或操作仅是示例，本申请实施例还可以执行其它操作或者各种操作的变形。此外，各个步骤可以按照上述实施例呈现的不同的顺序来执行，并且有可能并非要执行上述实施例中的全部操作。
图10为本申请目标物体的预警方法可以适用的一种可能的车辆系统架构图。如图10所示,该系统中主要包括:AR设备、通信信息处理系统、车身总线、LTE-V天线、GPS数据处理模块以及GPS天线;其中,
AR设备,用于完成对摄像头视频图像内容的目标识别以及合成增强信息,将AR画面显示到屏幕上。AR设备可以与通信信息处理系统进行通信,接收由通信信息处理系统确定的周边物体与本车之间的距离、周边物体的预期水平偏角。AR设备可以为车机或手机等。
车身总线，用于连接车辆其他电子控制单元（ECU，Electronic Control Unit），如发动机、车轮、制动传感器等等，能够通过车身总线获取车辆的各类行驶状态数据如速度、方向盘转角等。
GPS数据处理模块,用于通过GPS天线获取GPS数据,解析接收到的GPS数据,获取本车的经纬度位置信息和航向信息。
GPS天线和GPS数据处理模块构成GPS设备。
AR设备中包括:
视频数据解码单元,用于从AR设备的摄像头获取视频数据,并对视频数据解码后输出至屏幕驱动控制器以及视频逻辑单元处理模块;
屏幕驱动控制器,用于完成屏幕数据信号、同步信号的编码和输出,为屏幕供电,以及驱动屏幕正常显示。
GUI图像控制器,用于对AR画面进行矢量信号叠加、屏幕信息显示(OSD,On-Screen Display)叠加。
视频逻辑单元处理模块,用于使用图像识别算法对AR画面中的图像数据内容进行人工智能(AI,Artificial Intelligence)算法识别,识别出AR画面中的物体图像,控制GUI图像控制器叠加标记图像信息到AR画面中目标物体的物体图像上。
通信信息处理系统主要包括:
车辆运行数据解析模块,用于完成对本车数据的接收和解析。
LTE-V数据包应用数据算法处理模块,用于结合本车的GPS数据、本车的车辆数据、以及通过LTE-V数据包网络传输层协议栈处理模块所接收的周边物体例如车辆的V2X消息数据,对周边物体与本车的位置关系进行定义,并使用目标分类算法和预测路径算法计算出目标物体,并且,计算出周边物体相对本车的距离和预期水平偏角。
LTE-V数据包网络传输层协议栈处理模块,用于完成LTE-V数据包的网络传输层协议栈包头的识别和摘取,将数据包中的应用层数据如BSM消息发送给LTE-V数据包应用数据算法处理模块。
LTE-V射频集成电路(RFIC),用于完成LTE-V射频信号的采集。
LTE-V数据接入层处理模块,用于完成LTE-V接入层的3GPP协议栈的处理,使得空口数据得以正确识别。
以太网驱动通信接口，用于将LTE-V数据包应用数据算法处理模块计算出的相关信息发送给AR设备，这个接口也可以是其他通信接口，包括但不限于通用异步收发传输器（UART，Universal Asynchronous Receiver/Transmitter）、串行外设接口（SPI，Serial Peripheral Interface）、集成电路总线（I2C，Inter-Integrated Circuit）、WIFI（Wireless-Fidelity）、通用串行总线（USB，Universal Serial Bus）、外设组件互联扩展标准（PCIE，peripheral component interconnect express）、安全数字输入输出卡（SDIO，Secure Digital Input and Output）等等。
本申请实施例涉及的物理元器件可以包括:支持LTE-V通信数据的 RFIC芯片、GPS定位芯片、数据传输总线控制器、计算处理器、内存存储器、闪存存储器、图像处理器、视频取景器、电子罗盘等,还可以包括:WIFI芯片、以太网控制器等。所述图像处理器可以为DA/AD转换器;所述视频取景器可以为摄像头,所述数据传输总线控制器可以基于以太网或者控制器局域网络(CAN,Controller Area Network)。
在一种可能的实现方式中,通信信息处理系统存在于车载TBOX设备中,AR设备是车载车机娱乐系统,车载TBOX通过以太网或者USB或者WIFI同AR设备进行通信;通信信息处理系统具备LTE-V通信功能,负责与路面上的周边物体例如车辆进行V2X通信,据此计算出周边物体相对本车的距离和预期水平偏角,并且计算出目标物体,将上述信息发送给AR设备;AR设备根据摄像头拍摄的图像识别物体图像,并计算物体图像的绝对水平偏角,预期水平偏角与绝对水平偏角进行匹配,找到AR画面中目标物体的物体图像,对物体图像进行标记并提示出相关预警信息,如“X秒后碰撞”等。
图11为本申请目标物体的预警装置一种实施例的结构图,如图11所示,该装置110可以包括:
预期偏角获取单元111,用于获取至少一个对第一物体具有威胁的目标物体,并获取所述目标物体的预期水平偏角;所述目标物体的预期水平偏角是对所述目标物体在AR画面中图像的绝对水平偏角的预测值;所述AR画面是所述第一物体的AR设备展示的AR画面;
绝对偏角获取单元112,用于获取所述AR画面中各个物体图像的绝对水平偏角;所述物体图像的绝对水平偏角是AR画面的拍摄点指向物体图像中心点的方向与所述拍摄点指向AR画面中心点的方向在水平方向上的夹角;
图像获得单元113,用于根据所述预期偏角获取单元获取的所述目标物体的预期水平偏角、以及所述绝对偏角获取单元获取的各个物体图像的绝对水平偏角,获得所述目标物体在所述AR画面中对应的物体图像,所述目标物体的预期水平偏角与所述目标物体对应的物体图像的绝对水平偏角之间的差值满足第一差值要求;
显示单元114,用于在所述AR画面中对所述图像获得单元获得的所述目标物体对应的物体图像进行预警显示。
其中,所述绝对偏角获取单元112具体可以用于:
从所述AR画面中识别出物体图像;
对于每个物体图像,通过以下公式计算该物体图像的绝对水平偏角:
y=arctan(2x*tan(m/2)/L)
其中,y为物体图像的绝对水平偏角的角度值,L为AR画面水平方向的总像素数,m为AR设备的摄像头的水平视野角度范围,x为物体图像 的中心点与AR画面的中心点之间的线段在水平方向上占用的像素个数。
其中,所述预期偏角获取单元111具体可以用于:
从V2X设备获取所述目标物体以及所述目标物体的预期水平偏角,所述目标物体以及所述目标物体的预期水平偏角由所述V2X设备根据所述目标物体的运行数据以及所述第一物体的运行数据确定,所述V2X设备设置于所述第一物体上。
或者,所述预期偏角获取单元111可以包括:
数据获取子单元,用于从V2X设备获取周边物体的运行数据,并且,从GNSS设备获取所述第一物体的运行数据;所述V2X设备以及GNSS设备设置于所述第一物体上;
计算子单元,用于根据所述周边物体的运行数据、以及所述第一物体的运行数据,从所述周边物体中获取至少一个所述目标物体,计算所述目标物体的预期水平偏角。
其中,所述计算子单元具体可以用于:
对于每个所述目标物体,根据该目标物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该目标物体的中心点的方向与所述AR设备摄像头的安装方向之间的夹角,将该夹角作为该目标物体的预期水平偏角。
其中,所述计算子单元具体可以用于:
根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000037
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000038
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000039
为所述目标物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为所述AR设备的中心点O的位置坐标,∠NOB为第一物体的航向角,O 1O为第一物体的GNSS设备中心点O 1与所述AR设备的中心点O之间的距离。
或者,所述计算子单元具体可以用于:
根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000040
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水 平安装角度,
Figure PCTCN2020135113-appb-000041
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000042
为所述目标物体的GNSS设备中心点O 2的位置坐标。
其中,所述计算子单元具体可以用于:
根据所述周边物体的运行数据、以及第一物体的运行数据,计算每个周边物体与第一物体按照所述运行数据运行时发生碰撞的时长;
按照时长的从小到大的顺序从所述周边物体中获取至少一个所述目标物体。
其中,所述计算子单元具体可以用于:
根据所述周边物体的运行数据、以及第一物体的运行数据,从所述周边物体中选择与第一物体关联的周边物体;从与第一物体关联的周边物体中获取至少一个所述目标物体。
其中,所述预期偏角获取单元111还可以用于:获取所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角;所述周边物体的预期水平偏角是对所述周边物体在AR画面中图像的绝对水平偏角的预测值;
相应的,图像获得单元113还可以用于:选择预期水平偏角与该目标物体的预期水平偏角之间的差值满足第二差值要求的周边物体;根据选择的所述周边物体与所述第一物体之间的距离、以及该目标物体与所述第一物体之间的距离,对选择的所述周边物体和该目标物体按照距离从小到大进行排序,得到该目标物体的排序位次;获取该目标物体对应的物体图像在所述AR画面中Y轴方向上的坐标值,对该目标物体对应的物体图像按照所述坐标值从小到大进行排序;选择排序位次与该目标物体的排序位次相同的物体图像,作为该目标物体对应的物体图像。
其中,所述预期偏角获取单元111具体可以用于:
从V2X设备获取所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角，所述距离以及所述预期水平偏角由所述V2X设备根据所述目标物体的运行数据以及所述第一物体的运行数据确定。
其中,所述预期偏角获取单元111具体可以用于:
从V2X设备获取所述周边物体的运行数据,并且,从GNSS设备获取所述第一物体的运行数据;所述V2X设备以及所述GNSS设备设置于所述第一物体上;
根据所述周边物体的运行数据、以及所述第一物体的运行数据,计算所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角。
其中,所述预期偏角获取单元111具体可以用于:
对于每个所述周边物体,根据该周边物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该周边物体中心点的方向 与所述AR设备摄像头的安装方向之间的夹角,将该夹角作为该周边物体的预期水平偏角。
其中,所述预期偏角获取单元111具体可以用于:根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000043
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000044
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000045
为所述周边物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为所述AR设备的中心点O的位置坐标,∠NOB为所述第一物体的航向角,O 1O为所述第一物体的GNSS设备中心点O 1与所述AR设备的中心点O之间的距离。
其中,所述预期偏角获取单元111具体可以用于:根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000046
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000047
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000048
为所述周边物体的GNSS设备中心点O 2的位置坐标。
其中,所述预期偏角获取单元111具体可以用于:根据所述周边物体的运行数据、以及第一物体的运行数据,从所述周边物体中选择与所述第一物体关联的周边物体;计算与所述第一物体关联的周边物体与所述第一物体之间的距离、以及与所述第一物体关联的周边物体的预期水平偏角。
图11所示的装置,获取至少一个对第一物体具有威胁的目标物体,通过偏角比对,得到目标物体在AR画面中对应的物体图像,对物体图像进行预警显示,从而能够为用户提供更大范围内的危险预警,提升用户体验。
图11所示实施例提供的装置110可用于执行本申请图3~图9所示方法实施例的技术方案,其实现原理和技术效果可以进一步参考方法实施例中的相关描述。
应理解以上图11所示的桌面文件夹的操作装置的各个单元的划分仅仅是一种逻辑功能的划分,实际实现时可以全部或部分集成到一个物理实体上,也可以物理上分开。且这些单元可以全部以软件通过处理元件调用的形式实现;也可以全部以硬件的形式实现;还可以部分单元以软件通过 处理元件调用的形式实现,部分单元通过硬件的形式实现。例如,预期偏角获取单元可以为单独设立的处理元件,也可以集成在电子设备的某一个芯片中实现。其它单元的实现与之类似。此外这些单元全部或部分可以集成在一起,也可以独立实现。在实现过程中,上述方法的各步骤或以上各个单元可以通过处理器元件中的硬件的集成逻辑电路或者软件形式的指令完成。
例如,以上这些单元可以是被配置成实施以上方法的一个或多个集成电路,例如:一个或多个特定集成电路(Application Specific Integrated Circuit;以下简称:ASIC),或,一个或多个微处理器(Digital Singnal Processor;以下简称:DSP),或,一个或者多个现场可编程门阵列(Field Programmable Gate Array;以下简称:FPGA)等。再如,这些单元可以集成在一起,以片上系统(System-On-a-Chip;以下简称:SOC)的形式实现。
图12为本申请电子设备一个实施例的结构示意图,如图12所示,上述电子设备可以包括:触控屏;一个或多个处理器;存储器;多个应用程序;以及一个或多个计算机程序。
其中,上述触控屏可以包括车载计算机(移动数据中心Mobile Data Center)的触控屏;上述电子设备可以为电子设备(手机),智慧屏,无人机,智能网联车(Intelligent Connected Vehicle;以下简称:ICV),智能(汽)车(smart/intelligent car)或车载设备等设备。
其中上述一个或多个计算机程序被存储在上述存储器中,上述一个或多个计算机程序包括指令,当上述指令被上述设备执行时,使得上述设备执行以下步骤:
获取至少一个对第一物体具有威胁的目标物体,并获取所述目标物体的预期水平偏角;所述目标物体的预期水平偏角是对所述目标物体在AR画面中图像的绝对水平偏角的预测值;所述AR画面是所述第一物体的AR设备展示的AR画面;
获取所述AR画面中各个物体图像的绝对水平偏角;所述物体图像的绝对水平偏角是AR画面的拍摄点指向物体图像中心点的方向与所述拍摄点指向AR画面中心点的方向在水平方向上的夹角;
根据所述目标物体的预期水平偏角、以及各个物体图像的绝对水平偏角,获得所述目标物体在所述AR画面中对应的物体图像,所述目标物体的预期水平偏角与所述目标物体对应的物体图像的绝对水平偏角之间的差值满足第一差值要求;
在所述AR画面中对所述目标物体对应的物体图像进行预警显示。
所述指令被所述设备执行时,使得所述获取所述AR画面中各个物体图像的绝对水平偏角的步骤包括:
从所述AR画面中识别出物体图像;
对于每个物体图像,通过以下公式计算该物体图像的绝对水平偏角:
y=arctan(2x*tan(m/2)/L)
其中,y为物体图像的绝对水平偏角的角度值,L为AR画面水平方向的总像素数,m为AR设备的摄像头的水平视野角度范围,x为物体图像的中心点与AR画面的中心点之间的线段在水平方向上占用的像素个数。
所述指令被所述设备执行时,使得所述获取至少一个对第一物体具有威胁的目标物体,并获取所述目标物体的预期水平偏角的步骤包括:
从V2X设备获取所述目标物体以及所述目标物体的预期水平偏角,所述目标物体以及所述目标物体的预期水平偏角由所述V2X设备根据所述目标物体的运行数据以及所述第一物体的运行数据确定,所述V2X设备设置于所述第一物体上。
所述指令被所述设备执行时,使得所述获取至少一个对第一物体具有威胁的目标物体,并获取所述目标物体的预期水平偏角的步骤包括:
从V2X设备获取周边物体的运行数据,并且,从GNSS设备获取所述第一物体的运行数据;所述V2X设备以及GNSS设备设置于所述第一物体上;
根据所述周边物体的运行数据、以及所述第一物体的运行数据,从所述周边物体中获取至少一个所述目标物体,计算所述目标物体的预期水平偏角。
所述指令被所述设备执行时,使得所述计算所述目标物体的预期水平偏角的步骤包括:
对于每个所述目标物体,根据该目标物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该目标物体的中心点的方向与所述AR设备摄像头的安装方向之间的夹角,将该夹角作为该目标物体的预期水平偏角。
所述指令被所述设备执行时,使得所述根据该目标物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该目标物体的中心点的方向与所述AR设备摄像头的安装方向之间的夹角的步骤包括:
根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000049
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000050
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000051
为所述目标物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为所述AR设备的中心点O的位置坐标,∠NOB为第一物体的航向角, O 1O为第一物体的GNSS设备中心点O 1与所述AR设备的中心点O之间的距离。
所述指令被所述设备执行时,使得所述根据该目标物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备的中心点指向该目标物体中GNSS设备中心点的方向与所述AR设备摄像头的安装方向之间的夹角的步骤包括:
根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000052
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000053
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000054
为所述目标物体的GNSS设备中心点O 2的位置坐标。
所述指令被所述设备执行时,使得所述根据所述周边物体的运行数据、以及第一物体的运行数据,从所述周边物体中获取至少一个所述目标物体的步骤包括:
根据所述周边物体的运行数据、以及第一物体的运行数据,计算每个周边物体与第一物体按照所述运行数据运行时发生碰撞的时长;
按照时长的从小到大的顺序从所述周边物体中获取至少一个所述目标物体。
所述指令被所述设备执行时,使得所述从所述周边物体中获取至少一个所述目标物体的步骤之前,还执行以下步骤:
根据所述周边物体的运行数据、以及第一物体的运行数据,从所述周边物体中选择与第一物体关联的周边物体;相应的,
所述从所述周边物体中获取至少一个所述目标物体,包括:
从与第一物体关联的周边物体中获取至少一个所述目标物体。
所述指令被所述设备执行时,使得所述对于一个目标物体,如果获得的该目标物体在所述AR画面中对应的物体图像为至少两个,所述在所述AR画面中对所述目标物体对应的物体图像进行预警显示的步骤之前,还执行以下步骤:
获取所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角;所述周边物体的预期水平偏角是对所述周边物体在AR画面中图像的绝对水平偏角的预测值;
相应的,所述获得所述目标物体在所述AR画面中对应的物体图像的步骤与所述进行预警显示的步骤之间,还执行以下步骤:
选择预期水平偏角与该目标物体的预期水平偏角之间的差值满足第 二差值要求的周边物体;
根据选择的所述周边物体与所述第一物体之间的距离、以及该目标物体与所述第一物体之间的距离,对选择的所述周边物体和该目标物体按照距离从小到大进行排序,得到该目标物体的排序位次;
获取该目标物体对应的物体图像在所述AR画面中Y轴方向上的坐标值,对该目标物体对应的物体图像按照所述坐标值从小到大进行排序;
选择排序位次与该目标物体的排序位次相同的物体图像,作为该目标物体对应的物体图像。
所述指令被所述设备执行时,使得所述获取所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角的步骤包括:
从V2X设备获取所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角，所述距离以及所述预期水平偏角由所述V2X设备根据所述目标物体的运行数据以及所述第一物体的运行数据确定。
所述指令被所述设备执行时,使得所述获取所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角的步骤包括:
从V2X设备获取所述周边物体的运行数据,并且,从GNSS设备获取所述第一物体的运行数据;所述V2X设备以及所述GNSS设备设置于所述第一物体上;
根据所述周边物体的运行数据、以及所述第一物体的运行数据,计算所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角。
所述指令被所述设备执行时,使得所述计算所述周边物体的预期水平偏角的步骤包括:
对于每个所述周边物体,根据该周边物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该周边物体中心点的方向与所述AR设备摄像头的安装方向之间的夹角,将该夹角作为该周边物体的预期水平偏角。
所述指令被所述设备执行时,使得所述根据该周边物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备的中心点指向该周边物体中GNSS设备中心点的方向与所述AR设备摄像头的安装方向之间的夹角的步骤包括:
根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000055
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水 平安装角度,
Figure PCTCN2020135113-appb-000056
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000057
为所述周边物体的GNSS设备中心点O 2的位置坐标,(X O,Y O)为所述AR设备的中心点O的位置坐标,∠NOB为所述第一物体的航向角,O 1O为所述第一物体的GNSS设备中心点O 1与所述AR设备的中心点O之间的距离。
所述指令被所述设备执行时,使得所述根据该周边物体的运行数据、以及所述第一物体的运行数据,计算所述AR设备中心点指向该周边物体中心点的方向与所述AR设备摄像头的安装方向之间的夹角的步骤包括:
根据以下公式计算所述夹角:
Figure PCTCN2020135113-appb-000058
其中,∠O 2OA为所述夹角,∠NOA为所述摄像头相对于正北方向的水平安装角度,
Figure PCTCN2020135113-appb-000059
为所述第一物体的GNSS设备中心点O 1的位置坐标,
Figure PCTCN2020135113-appb-000060
为所述周边物体的GNSS设备中心点O 2的位置坐标。
所述指令被所述设备执行时,使得所述计算所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角的步骤之前,还执行以下步骤:
根据所述周边物体的运行数据、以及第一物体的运行数据,从所述周边物体中选择与所述第一物体关联的周边物体;
相应的,所述计算所述周边物体与所述第一物体之间的距离、以及所述周边物体的预期水平偏角的步骤包括:
计算与所述第一物体关联的周边物体与所述第一物体之间的距离、以及与所述第一物体关联的周边物体的预期水平偏角。
图12所示的电子设备可以是终端设备也可以是内置于上述终端设备的电路设备。该设备可以用于执行本申请图3~图9所示实施例提供的方法中的功能/步骤。
电子设备1200可以包括处理器1210,外部存储器接口1220,内部存储器1221,通用串行总线(universal serial bus,USB)接口1230,充电管理模块1240,电源管理模块1241,电池1242,天线1,天线2,移动通信模块1250,无线通信模块1260,音频模块1270,扬声器1270A,受话器1270B,麦克风1270C,耳机接口1270D,传感器模块1280,按键1290,马达1291,指示器1292,摄像头1293,显示屏1294,以及用户标识模块(subscriber identification module,SIM)卡接口1295等。其中传感器模块1280可以包括压力传感器1280A,陀螺仪传感器1280B,气压传感器1280C,磁传感器1280D,加速度传感器1280E,距离传感器1280F,接近光传感器1280G,指纹传感器1280H,温度传感器1280J,触摸传感器1280K,环境光传感器 1280L,骨传导传感器1280M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备1200的具体限定。在本申请另一些实施例中,电子设备1200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器1210可以包括一个或多个处理单元,例如:处理器1210可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器1210中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器1210中的存储器为高速缓冲存储器。该存储器可以保存处理器1210刚用过或循环使用的指令或数据。如果处理器1210需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器1210的等待时间,因而提高了系统的效率。
在一些实施例中,处理器1210可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器1210可以包含多组I2C总线。处理器1210可以通过不同的I2C总线接口分别耦合触摸传感器1280K,充电器,闪光灯,摄像头1293等。例如:处理器1210可以通过I2C接口耦合触摸传感器1280K,使处理器1210与触摸传感器1280K通过I2C总线接口通信,实现电子设备1200的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器1210可以包含多组I2S总线。处理器1210可以通过I2S总线与音频模块1270耦合,实现处理器1210与音频模块1270之间的通信。在一些实施例中,音频模块1270可以通过I2S接口向无线通信模块1260传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块1270与无线通信模块1260可以通过PCM总线接口耦合。在一些实施例中,音频模块1270也可以通过PCM接口向无线通信模块1260传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S 接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器1210与无线通信模块1260。例如:处理器1210通过UART接口与无线通信模块1260中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块1270可以通过UART接口向无线通信模块1260传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器1210与显示屏1294,摄像头1293等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器1210和摄像头1293通过CSI接口通信,实现电子设备1200的拍摄功能。处理器1210和显示屏1294通过DSI接口通信,实现电子设备1200的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器1210与摄像头1293,显示屏1294,无线通信模块1260,音频模块1270,传感器模块1280等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口1230是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口1230可以用于连接充电器为电子设备1200充电,也可以用于电子设备1200与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备1200的结构限定。在本申请另一些实施例中,电子设备1200也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块1240用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块1240可以通过USB接口1230接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块1240可以通过电子设备1200的无线充电线圈接收无线充电输入。充电管理模块1240为电池1242充电的同时,还可以通过电源管理模块1241为电子设备供电。
电源管理模块1241用于连接电池1242,充电管理模块1240与处理器1210。电源管理模块1241接收电池1242和/或充电管理模块1240的输入,为处理器1210,内部存储器1221,显示屏1294,摄像头1293,和无线通信模块1260等供电。电源管理模块1241还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块1241也可以设置于处理器1210中。在另一些实施例中,电源管理模块1241和充电管理模块1240也可以设置于同一个器件中。
电子设备1200的无线通信功能可以通过天线1,天线2,移动通信模块1250,无线通信模块1260,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备1200中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块1250可以提供应用在电子设备1200上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块1250可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块1250可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块1250还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块1250的至少部分功能模块可以被设置于处理器1210中。在一些实施例中,移动通信模块1250的至少部分功能模块可以与处理器1210的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器1270A,受话器1270B等)输出声音信号,或通过显示屏1294显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器1210,与移动通信模块1250或其他功能模块设置在同一个器件中。
无线通信模块1260可以提供应用在电子设备1200上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块1260可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块1260经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器1210。无线通信模块1260还可以从处理器1210接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备1200的天线1和移动通信模块1250耦合,天线2和无线通信模块1260耦合,使得电子设备1200可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强 系统(satellite based augmentation systems,SBAS)。
电子设备1200通过GPU,显示屏1294,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏1294和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器1210可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏1294用于显示图像,视频等。显示屏1294包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备1200可以包括1个或N个显示屏1294,N为大于1的正整数。
电子设备1200可以通过ISP,摄像头1293,视频编解码器,GPU,显示屏1294以及应用处理器等实现拍摄功能。
ISP用于处理摄像头1293反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头1293中。
摄像头1293用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备1200可以包括1个或N个摄像头1293,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备1200在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备1200可以支持一种或多种视频编解码器。这样,电子设备1200可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备1200的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口1220可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备1200的存储能力。外部存储卡通过外部存储器接口1220与处理器1210通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器1221可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器1221可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备1200使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器1221可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器1210通过运行存储在内部存储器1221的指令,和/或存储在设置于处理器中的存储器的指令,执行电子设备1200的各种功能应用以及数据处理。
The electronic device 1200 may implement audio functions, for example, music playback and recording, through the audio module 1270, the speaker 1270A, the receiver 1270B, the microphone 1270C, the headset jack 1270D, the application processor, and the like.
The audio module 1270 is configured to convert digital audio information into an analog audio signal output, and is also configured to convert an analog audio input into a digital audio signal. The audio module 1270 may also be configured to encode and decode audio signals. In some embodiments, the audio module 1270 may be disposed in the processor 1210, or some functional modules of the audio module 1270 may be disposed in the processor 1210.
The speaker 1270A, also referred to as a "loudspeaker", is configured to convert an audio electrical signal into a sound signal. The electronic device 1200 may be used to listen to music or a hands-free call through the speaker 1270A.
The receiver 1270B, also referred to as an "earpiece", is configured to convert an audio electrical signal into a sound signal. When the electronic device 1200 answers a call or a voice message, the voice can be heard by placing the receiver 1270B close to the ear.
The microphone 1270C, also referred to as a "mic" or "mike", is configured to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user may speak with the mouth close to the microphone 1270C to input the sound signal into the microphone 1270C. The electronic device 1200 may be provided with at least one microphone 1270C. In some other embodiments, the electronic device 1200 may be provided with two microphones 1270C, which can implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 1200 may alternatively be provided with three, four, or more microphones 1270C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and the like.
The headset jack 1270D is configured to connect a wired headset. The headset jack 1270D may be the USB interface 1230, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 1280A is configured to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 1280A may be disposed on the display 1294. There are many types of pressure sensors 1280A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 1280A, the capacitance between the electrodes changes, and the electronic device 1200 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display 1294, the electronic device 1200 detects the intensity of the touch operation through the pressure sensor 1280A. The electronic device 1200 may also calculate the touch position according to the detection signal of the pressure sensor 1280A. In some embodiments, touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing a short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
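A minimal Python sketch of the pressure-threshold dispatch described above; the threshold value, function name, and action strings are illustrative assumptions rather than anything specified in this application.

```python
# Hypothetical sketch: mapping touch-operation intensity on a messaging app icon
# to a command. Threshold and action names are assumed for illustration only.

FIRST_PRESSURE_THRESHOLD = 0.5  # normalized pressure units (assumed)

def handle_touch_on_sms_icon(pressure: float) -> str:
    """Return the command triggered by a touch of the given intensity."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"   # light press: open the message list
    return "new_sms"        # firm press: create a new message

print(handle_touch_on_sms_icon(0.2))  # view_sms
print(handle_touch_on_sms_icon(0.8))  # new_sms
```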
The gyro sensor 1280B may be configured to determine the motion posture of the electronic device 1200. In some embodiments, the angular velocities of the electronic device 1200 about three axes (namely, the x, y, and z axes) may be determined through the gyro sensor 1280B. The gyro sensor 1280B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 1280B detects the angle at which the electronic device 1200 shakes, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to cancel the shake of the electronic device 1200 through reverse motion, thereby implementing image stabilization. The gyro sensor 1280B may also be used in navigation and somatosensory game scenarios.
The barometric pressure sensor 1280C is configured to measure barometric pressure. In some embodiments, the electronic device 1200 calculates the altitude based on the barometric pressure value measured by the barometric pressure sensor 1280C, to assist positioning and navigation.
The magnetic sensor 1280D includes a Hall sensor. The electronic device 1200 may use the magnetic sensor 1280D to detect the opening and closing of a flip leather case. In some embodiments, when the electronic device 1200 is a flip phone, the electronic device 1200 may detect the opening and closing of the flip cover according to the magnetic sensor 1280D, and then set features such as automatic unlocking upon flip-open according to the detected open/close state of the leather case or of the flip cover.
The acceleration sensor 1280E may detect the magnitude of acceleration of the electronic device 1200 in various directions (generally along three axes). When the electronic device 1200 is stationary, the magnitude and direction of gravity can be detected. It may also be used to identify the posture of the electronic device and be applied to applications such as landscape/portrait switching and pedometers.
The distance sensor 1280F is configured to measure distance. The electronic device 1200 may measure distance by infrared or laser. In some embodiments, in a photographing scenario, the electronic device 1200 may use the distance sensor 1280F to measure distance to implement fast focusing.
The proximity light sensor 1280G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 1200 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 1200. When insufficient reflected light is detected, the electronic device 1200 may determine that there is no object near the electronic device 1200. The electronic device 1200 may use the proximity light sensor 1280G to detect that the user holds the electronic device 1200 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 1280G may also be used for automatic unlocking and screen locking in leather case mode and pocket mode.
The ambient light sensor 1280L is configured to sense ambient light brightness. The electronic device 1200 may adaptively adjust the brightness of the display 1294 according to the sensed ambient light brightness. The ambient light sensor 1280L may also be used to automatically adjust the white balance during photographing. The ambient light sensor 1280L may further cooperate with the proximity light sensor 1280G to detect whether the electronic device 1200 is in a pocket, to prevent accidental touches.
The fingerprint sensor 1280H is configured to collect fingerprints. The electronic device 1200 may use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, fingerprint photographing, fingerprint-based call answering, and the like.
The temperature sensor 1280J is configured to detect temperature. In some embodiments, the electronic device 1200 executes a temperature processing policy based on the temperature detected by the temperature sensor 1280J. For example, when the temperature reported by the temperature sensor 1280J exceeds a threshold, the electronic device 1200 reduces the performance of a processor located near the temperature sensor 1280J, so as to reduce power consumption and implement thermal protection. In some other embodiments, when the temperature is lower than another threshold, the electronic device 1200 heats the battery 1242 to avoid abnormal shutdown of the electronic device 1200 caused by low temperature. In some other embodiments, when the temperature is lower than still another threshold, the electronic device 1200 boosts the output voltage of the battery 1242 to avoid abnormal shutdown caused by low temperature.
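A minimal sketch of the layered temperature policy described above; the specific threshold values and action names are assumptions for illustration and are not taken from this application.

```python
# Hypothetical sketch of a layered temperature policy: throttle when hot,
# heat the battery when cold, boost battery output when very cold.
# All thresholds are illustrative assumptions.

T_HOT = 45.0          # degrees C: throttle the nearby processor above this
T_COLD_HEAT = 0.0     # degrees C: heat the battery below this
T_COLD_BOOST = -10.0  # degrees C: boost battery output voltage below this

def thermal_policy(temp_c: float) -> list:
    actions = []
    if temp_c > T_HOT:
        actions.append("reduce_cpu_performance")  # lower power, thermal protection
    if temp_c < T_COLD_HEAT:
        actions.append("heat_battery")            # avoid low-temperature shutdown
    if temp_c < T_COLD_BOOST:
        actions.append("boost_battery_output")    # avoid abnormal shutdown
    return actions

print(thermal_policy(50.0))   # ['reduce_cpu_performance']
print(thermal_policy(-15.0))  # ['heat_battery', 'boost_battery_output']
```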
The touch sensor 1280K is also referred to as a "touch device". The touch sensor 1280K may be disposed on the display 1294, and the touch sensor 1280K and the display 1294 form a touchscreen, also referred to as a "touch screen". The touch sensor 1280K is configured to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display 1294. In some other embodiments, the touch sensor 1280K may also be disposed on the surface of the electronic device 1200 at a position different from that of the display 1294.
The bone conduction sensor 1280M may acquire vibration signals. In some embodiments, the bone conduction sensor 1280M may acquire the vibration signal of the vibrating bone of the human vocal part. The bone conduction sensor 1280M may also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 1280M may also be disposed in a headset to form a bone conduction headset. The audio module 1270 may parse out a speech signal based on the vibration signal of the vocal-part vibrating bone acquired by the bone conduction sensor 1280M, to implement a speech function. The application processor may parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 1280M, to implement a heart rate detection function.
The keys 1290 include a power key, volume keys, and the like. The keys 1290 may be mechanical keys or touch keys. The electronic device 1200 may receive key input and generate key signal input related to user settings and function control of the electronic device 1200.
The motor 1291 may generate vibration prompts. The motor 1291 may be used for incoming-call vibration prompts as well as for touch vibration feedback. For example, touch operations acting on different applications (for example, photographing and audio playback) may correspond to different vibration feedback effects. The motor 1291 may also produce different vibration feedback effects for touch operations acting on different areas of the display 1294. Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effects may also be customized.
The indicator 1292 may be an indicator light, and may be used to indicate the charging status and battery level changes, and may also be used to indicate messages, missed calls, notifications, and the like.
The SIM card interface 1295 is configured to connect a SIM card. The SIM card can be inserted into the SIM card interface 1295 or pulled out of the SIM card interface 1295 to achieve contact with and separation from the electronic device 1200. The electronic device 1200 may support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 1295 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 1295 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 1295 may also be compatible with different types of SIM cards. The SIM card interface 1295 may also be compatible with external memory cards. The electronic device 1200 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 1200 uses an eSIM, namely an embedded SIM card. The eSIM card may be embedded in the electronic device 1200 and cannot be separated from the electronic device 1200.
It should be understood that the electronic device 1200 shown in FIG. 12 can implement the processes of the methods provided in the embodiments shown in FIG. 3 to FIG. 9 of this application. The operations and/or functions of the modules in the electronic device 1200 are respectively intended to implement the corresponding procedures in the foregoing method embodiments. For details, refer to the descriptions in the method embodiments shown in FIG. 3 to FIG. 9 of this application; to avoid repetition, detailed descriptions are appropriately omitted here.
It should be understood that the processor 1210 in the electronic device 1200 shown in FIG. 12 may be a system on chip (SoC). The processor 1210 may include a central processing unit (CPU), and may further include other types of processors, for example, a graphics processing unit (GPU).
In summary, the processors or processing units inside the processor 1210 may cooperate to implement the foregoing method procedures, and the software programs corresponding to these processors or processing units may be stored in the internal memory 1221.
This application further provides an electronic device. The device includes a storage medium and a central processing unit. The storage medium may be a non-volatile storage medium, a computer-executable program is stored in the storage medium, and the central processing unit is connected to the non-volatile storage medium and executes the computer-executable program to implement the methods provided in the embodiments shown in FIG. 3 to FIG. 9 of this application.
In the foregoing embodiments, the processors involved may include, for example, a CPU, a DSP, a microcontroller, or a digital signal processor, and may further include a GPU, an embedded neural-network processing unit (NPU), and an image signal processor (ISP). The processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the program execution of the technical solutions of this application. In addition, the processor may have the function of operating one or more software programs, and the software programs may be stored in a storage medium.
An embodiment of this application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium. When the computer program is run on a computer, the computer is caused to execute the methods provided in the embodiments shown in FIG. 3 to FIG. 9 of this application.
An embodiment of this application further provides a computer program product, where the computer program product includes a computer program. When the computer program is run on a computer, the computer is caused to execute the methods provided in the embodiments shown in FIG. 3 to FIG. 9 of this application.
In the embodiments of this application, "at least one" means one or more, and "multiple" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may indicate the following cases: only A exists, both A and B exist, and only B exists, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" and similar expressions refer to any combination of these items, including a single item or any combination of multiple items. For example, at least one of a, b, and c may represent: a, b, c, a and b, a and c, b and c, or a and b and c, where a, b, and c may be single or multiple.
A person of ordinary skill in the art may be aware that the units and algorithm steps described in the embodiments disclosed herein can be implemented by electronic hardware or a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of this application.
A person skilled in the art may clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, if any function is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this application essentially, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or various other media that can store program code.
The foregoing descriptions are merely specific implementations of this application. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. The protection scope of this application shall be subject to the protection scope of the claims.

Claims (34)

  1. An early warning method for a target object, comprising:
    obtaining at least one target object that poses a threat to a first object, and obtaining an expected horizontal deflection angle of the target object, where the expected horizontal deflection angle of the target object is a predicted value of the absolute horizontal deflection angle of the image of the target object in an augmented reality (AR) frame, and the AR frame is an AR frame displayed by an AR device of the first object;
    obtaining absolute horizontal deflection angles of object images in the AR frame, where the absolute horizontal deflection angle of an object image is the angle, in the horizontal direction, between the direction from the shooting point of the AR frame to the center point of the object image and the direction from the shooting point to the center point of the AR frame;
    obtaining, according to the expected horizontal deflection angle of the target object and the absolute horizontal deflection angles of the object images, the object image corresponding to the target object in the AR frame, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets a first difference requirement; and
    performing early warning display on the object image corresponding to the target object in the AR frame.
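A minimal sketch of the matching step in claim 1: pairing a target object's expected horizontal deflection angle with the object image whose absolute horizontal deflection angle is closest, subject to a tolerance standing in for the "first difference requirement". The tolerance value, function name, and data layout are assumptions for illustration, not part of the claim.

```python
# Match one target object's expected angle against the per-image absolute angles.
# The "first difference requirement" is modeled here as a simple tolerance (assumed).

def match_target_to_image(expected_angle_deg, image_angles_deg, tolerance_deg=3.0):
    """Return the index of the best-matching object image, or None if none qualifies."""
    best_idx, best_diff = None, tolerance_deg
    for idx, angle in enumerate(image_angles_deg):
        diff = abs(expected_angle_deg - angle)
        if diff <= best_diff:
            best_idx, best_diff = idx, diff
    return best_idx

print(match_target_to_image(12.4, [-8.1, 3.0, 11.7, 25.2]))  # -> 2 (the 11.7 degree image)
```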
  2. The method according to claim 1, wherein the obtaining absolute horizontal deflection angles of object images in the AR frame comprises:
    recognizing object images from the AR frame; and
    for each object image, calculating the absolute horizontal deflection angle of the object image by the following formula:
    y = arctan(2x*tan(m/2)/L)
    where y is the angle value of the absolute horizontal deflection angle of the object image, L is the total number of pixels in the horizontal direction of the AR frame, m is the horizontal field-of-view angle range of the camera of the AR device, and x is the number of pixels occupied in the horizontal direction by the line segment between the center point of the object image and the center point of the AR frame.
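A direct transcription of the formula in claim 2 as a small Python function; the example numbers and the sign convention for left/right of the frame center are assumptions for illustration.

```python
# y = arctan(2*x*tan(m/2) / L), as given in claim 2.
import math

def absolute_horizontal_deflection(x_pixels, total_pixels_l, hfov_deg_m):
    """Angle between the ray to the object-image center and the ray to the frame center."""
    half_fov = math.radians(hfov_deg_m) / 2.0
    return math.degrees(math.atan(2.0 * x_pixels * math.tan(half_fov) / total_pixels_l))

# Example (assumed values): 1920-pixel-wide frame, 90 degree horizontal field of view,
# object-image center 300 pixels from the frame center.
print(round(absolute_horizontal_deflection(300, 1920, 90), 2))
```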
  3. The method according to claim 1 or 2, wherein the obtaining at least one target object that poses a threat to a first object and obtaining an expected horizontal deflection angle of the target object comprises:
    obtaining the target object and the expected horizontal deflection angle of the target object from a vehicle-to-everything (V2X) device, where the target object and the expected horizontal deflection angle of the target object are determined by the V2X device according to operating data of the target object and operating data of the first object, and the V2X device is disposed on the first object.
  4. The method according to claim 1 or 2, wherein the obtaining at least one target object that poses a threat to a first object and obtaining an expected horizontal deflection angle of the target object comprises:
    obtaining operating data of peripheral objects from a V2X device, and obtaining operating data of the first object from a global navigation satellite system (GNSS) device, where the V2X device and the GNSS device are disposed on the first object; and
    obtaining at least one target object from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object, and calculating the expected horizontal deflection angle of the target object.
  5. The method according to claim 4, wherein the calculating the expected horizontal deflection angle of the target object comprises:
    for each target object, calculating, according to the operating data of the target object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the target object and the installation direction of the camera of the AR device, and using the angle as the expected horizontal deflection angle of the target object.
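The exact formulas for this angle appear only as images in claims 6 and 7 of the original publication and are not reproduced here. The following is merely a plausible sketch of the angle defined in claim 5 under a local east/north planar approximation; the coordinate convention, function name, and use of a bearing difference are assumptions, not the patent's formula.

```python
# Plausible sketch (NOT the formula from claims 6/7, which is only available as an image):
# the angle between (a) the direction from the AR device center O to the target's GNSS
# center O2 and (b) the camera installation direction, given the installation angle
# from true north (the angle called "NOA" in the claims). Coordinates are assumed to be
# local east/north offsets in meters.
import math

def expected_horizontal_deflection(o_xy, o2_xy, install_angle_noa_deg):
    """Signed angle (degrees) between O->O2 and the camera installation direction."""
    de, dn = o2_xy[0] - o_xy[0], o2_xy[1] - o_xy[1]        # east, north offsets
    bearing_o_to_o2 = math.degrees(math.atan2(de, dn)) % 360.0  # bearing from true north
    diff = (bearing_o_to_o2 - install_angle_noa_deg + 180.0) % 360.0 - 180.0
    return diff

print(round(expected_horizontal_deflection((0.0, 0.0), (5.0, 20.0), 10.0), 2))
```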
  6. The method according to claim 5, wherein the calculating, according to the operating data of the target object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the target object and the installation direction of the camera of the AR device comprises:
    calculating the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100001]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the target object, (X_O, Y_O) is the position coordinate of the center point O of the AR device, ∠NOB is the heading angle of the first object, and O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  7. The method according to claim 5, wherein the calculating, according to the operating data of the target object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the GNSS device of the target object and the installation direction of the camera of the AR device comprises:
    calculating the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100004]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, and (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the target object.
  8. The method according to claim 4, wherein the obtaining at least one target object from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object comprises:
    calculating, according to the operating data of the peripheral objects and the operating data of the first object, the time until each peripheral object collides with the first object when moving according to the operating data; and
    obtaining at least one target object from the peripheral objects in ascending order of the time.
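A hedged sketch of the selection in claim 8: estimate a time-to-collision for each peripheral object and keep the objects with the smallest times. The constant-velocity model, the closest-approach formula, and the data layout are assumptions for illustration; the claim itself does not specify how the collision time is computed.

```python
# Rank peripheral objects by an estimated time-to-collision (constant-velocity assumption)
# and keep the k most urgent ones as target objects.
import math

def time_to_collision(rel_pos, rel_vel):
    """Time at which the relative distance is smallest; inf for non-closing objects."""
    speed_sq = rel_vel[0] ** 2 + rel_vel[1] ** 2
    if speed_sq == 0.0:
        return math.inf
    t = -(rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]) / speed_sq
    return t if t > 0.0 else math.inf   # receding objects are not threats here

def pick_targets(peripherals, k=3):
    """peripherals: list of (object_id, rel_pos, rel_vel); return the k most urgent ids."""
    ranked = sorted(peripherals, key=lambda p: time_to_collision(p[1], p[2]))
    return [obj_id for obj_id, _, _ in ranked[:k]]

print(pick_targets([("car_a", (30.0, 0.0), (-10.0, 0.0)),
                    ("car_b", (80.0, 5.0), (-5.0, 0.0))], k=1))
```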
  9. The method according to claim 4, wherein before the obtaining at least one target object from the peripheral objects, the method further comprises:
    selecting, from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object, peripheral objects associated with the first object; and correspondingly,
    the obtaining at least one target object from the peripheral objects comprises:
    obtaining at least one target object from the peripheral objects associated with the first object.
  10. The method according to claim 1 or 2, wherein, for a target object, if at least two object images corresponding to the target object are obtained in the AR frame, before the performing early warning display on the object image corresponding to the target object in the AR frame, the method further comprises:
    obtaining distances between peripheral objects and the first object, and expected horizontal deflection angles of the peripheral objects, where the expected horizontal deflection angle of a peripheral object is a predicted value of the absolute horizontal deflection angle of the image of the peripheral object in the AR frame; and
    correspondingly, between the obtaining the object image corresponding to the target object in the AR frame and the performing early warning display, the method further comprises:
    selecting peripheral objects whose expected horizontal deflection angles differ from the expected horizontal deflection angle of the target object by a difference that meets a second difference requirement;
    sorting, according to the distances between the selected peripheral objects and the first object and the distance between the target object and the first object, the selected peripheral objects and the target object in ascending order of distance, to obtain the ranking position of the target object;
    obtaining the coordinate values, in the Y-axis direction of the AR frame, of the object images corresponding to the target object, and sorting the object images corresponding to the target object in ascending order of the coordinate values; and
    selecting the object image whose ranking position is the same as the ranking position of the target object as the object image corresponding to the target object.
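A hedged sketch of the disambiguation in claim 10: when several object images match one target object, rank the target among the selected same-direction peripheral objects by distance, rank the candidate images by their Y coordinate in the AR frame, and pick the image at the matching rank. The function name and data layout are assumptions; the sketch also assumes the number of candidate images lines up with the number of ranked objects, as the claim implies.

```python
# Pick the object image whose Y-coordinate rank equals the target's distance rank.
def pick_image_for_target(target_dist, neighbour_dists, candidate_images):
    """candidate_images: list of (image_id, y_coordinate) pairs for this target."""
    # Rank of the target when all distances are sorted in ascending order.
    rank = sorted(neighbour_dists + [target_dist]).index(target_dist)
    # Candidate images sorted by ascending Y coordinate in the AR frame.
    by_y = sorted(candidate_images, key=lambda img: img[1])
    return by_y[rank][0]

print(pick_image_for_target(35.0, [20.0], [("imgA", 210), ("imgB", 480)]))
# -> "imgB": the target is the 2nd nearest, so it maps to the 2nd-smallest Y coordinate.
```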
  11. The method according to claim 10, wherein the obtaining distances between the peripheral objects and the first object and expected horizontal deflection angles of the peripheral objects comprises:
    obtaining the distances between the peripheral objects and the first object and the expected horizontal deflection angles of the peripheral objects from the V2X device, where the distances and the expected horizontal deflection angles are determined by the V2X device according to the operating data of the target object and the operating data of the first object.
  12. The method according to claim 10, wherein the obtaining distances between the peripheral objects and the first object and expected horizontal deflection angles of the peripheral objects comprises:
    obtaining the operating data of the peripheral objects from the V2X device, and obtaining the operating data of the first object from the GNSS device, where the V2X device and the GNSS device are disposed on the first object; and
    calculating, according to the operating data of the peripheral objects and the operating data of the first object, the distances between the peripheral objects and the first object and the expected horizontal deflection angles of the peripheral objects.
  13. The method according to claim 12, wherein the calculating the expected horizontal deflection angles of the peripheral objects comprises:
    for each peripheral object, calculating, according to the operating data of the peripheral object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the peripheral object and the installation direction of the camera of the AR device, and using the angle as the expected horizontal deflection angle of the peripheral object.
  14. The method according to claim 13, wherein the calculating, according to the operating data of the peripheral object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the GNSS device of the peripheral object and the installation direction of the camera of the AR device comprises:
    calculating the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100007]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the peripheral object, (X_O, Y_O) is the position coordinate of the center point O of the AR device, ∠NOB is the heading angle of the first object, and O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  15. The method according to claim 13, wherein the calculating, according to the operating data of the peripheral object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the peripheral object and the installation direction of the camera of the AR device comprises:
    calculating the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100010]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, and (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the peripheral object.
  16. The method according to claim 12, wherein before the calculating the distances between the peripheral objects and the first object and the expected horizontal deflection angles of the peripheral objects, the method further comprises:
    selecting, from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object, peripheral objects associated with the first object; and
    correspondingly, the calculating the distances between the peripheral objects and the first object and the expected horizontal deflection angles of the peripheral objects comprises:
    calculating the distances between the peripheral objects associated with the first object and the first object, and the expected horizontal deflection angles of the peripheral objects associated with the first object.
  17. An early warning apparatus for a target object, comprising:
    an expected deflection angle obtaining unit, configured to obtain at least one target object that poses a threat to a first object and obtain an expected horizontal deflection angle of the target object, where the expected horizontal deflection angle of the target object is a predicted value of the absolute horizontal deflection angle of the image of the target object in an augmented reality (AR) frame, and the AR frame is an AR frame displayed by an AR device of the first object;
    an absolute deflection angle obtaining unit, configured to obtain absolute horizontal deflection angles of object images in the AR frame, where the absolute horizontal deflection angle of an object image is the angle, in the horizontal direction, between the direction from the shooting point of the AR frame to the center point of the object image and the direction from the shooting point to the center point of the AR frame;
    an image obtaining unit, configured to obtain, according to the expected horizontal deflection angle of the target object obtained by the expected deflection angle obtaining unit and the absolute horizontal deflection angles of the object images obtained by the absolute deflection angle obtaining unit, the object image corresponding to the target object in the AR frame, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets a first difference requirement; and
    a display unit, configured to perform early warning display, in the AR frame, on the object image corresponding to the target object obtained by the image obtaining unit.
  18. The apparatus according to claim 17, wherein the absolute deflection angle obtaining unit is specifically configured to:
    recognize object images from the AR frame; and
    for each object image, calculate the absolute horizontal deflection angle of the object image by the following formula:
    y = arctan(2x*tan(m/2)/L)
    where y is the angle value of the absolute horizontal deflection angle of the object image, L is the total number of pixels in the horizontal direction of the AR frame, m is the horizontal field-of-view angle range of the camera of the AR device, and x is the number of pixels occupied in the horizontal direction by the line segment between the center point of the object image and the center point of the AR frame.
  19. The apparatus according to claim 17 or 18, wherein the expected deflection angle obtaining unit is specifically configured to:
    obtain the target object and the expected horizontal deflection angle of the target object from a vehicle-to-everything (V2X) device, where the target object and the expected horizontal deflection angle of the target object are determined by the V2X device according to operating data of the target object and operating data of the first object, and the V2X device is disposed on the first object.
  20. The apparatus according to claim 17 or 18, wherein the expected deflection angle obtaining unit comprises:
    a data obtaining subunit, configured to obtain operating data of peripheral objects from a V2X device and obtain operating data of the first object from a global navigation satellite system (GNSS) device, where the V2X device and the GNSS device are disposed on the first object; and
    a calculation subunit, configured to obtain at least one target object from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object, and calculate the expected horizontal deflection angle of the target object.
  21. The apparatus according to claim 20, wherein the calculation subunit is specifically configured to:
    for each target object, calculate, according to the operating data of the target object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the target object and the installation direction of the camera of the AR device, and use the angle as the expected horizontal deflection angle of the target object.
  22. The apparatus according to claim 21, wherein the calculation subunit is specifically configured to:
    calculate the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100013]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the target object, (X_O, Y_O) is the position coordinate of the center point O of the AR device, ∠NOB is the heading angle of the first object, and O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  23. The apparatus according to claim 21, wherein the calculation subunit is specifically configured to:
    calculate the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100016]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, and (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the target object.
  24. The apparatus according to claim 20, wherein the calculation subunit is specifically configured to:
    calculate, according to the operating data of the peripheral objects and the operating data of the first object, the time until each peripheral object collides with the first object when moving according to the operating data; and
    obtain at least one target object from the peripheral objects in ascending order of the time.
  25. The apparatus according to claim 20, wherein the calculation subunit is specifically configured to:
    select, from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object, peripheral objects associated with the first object, and obtain at least one target object from the peripheral objects associated with the first object.
  26. The apparatus according to claim 17 or 18, wherein the expected deflection angle obtaining unit is further configured to obtain distances between peripheral objects and the first object and expected horizontal deflection angles of the peripheral objects, where the expected horizontal deflection angle of a peripheral object is a predicted value of the absolute horizontal deflection angle of the image of the peripheral object in the AR frame; and
    correspondingly, the image obtaining unit is further configured to: select peripheral objects whose expected horizontal deflection angles differ from the expected horizontal deflection angle of the target object by a difference that meets a second difference requirement; sort, according to the distances between the selected peripheral objects and the first object and the distance between the target object and the first object, the selected peripheral objects and the target object in ascending order of distance, to obtain the ranking position of the target object; obtain the coordinate values, in the Y-axis direction of the AR frame, of the object images corresponding to the target object, and sort the object images corresponding to the target object in ascending order of the coordinate values; and select the object image whose ranking position is the same as the ranking position of the target object as the object image corresponding to the target object.
  27. The apparatus according to claim 26, wherein the expected deflection angle obtaining unit is specifically configured to:
    obtain the distances between the peripheral objects and the first object and the expected horizontal deflection angles of the peripheral objects from the V2X device, where the distances and the expected horizontal deflection angles are determined by the V2X device according to the operating data of the target object and the operating data of the first object.
  28. The apparatus according to claim 26, wherein the expected deflection angle obtaining unit is specifically configured to:
    obtain the operating data of the peripheral objects from the V2X device, and obtain the operating data of the first object from the GNSS device, where the V2X device and the GNSS device are disposed on the first object; and
    calculate, according to the operating data of the peripheral objects and the operating data of the first object, the distances between the peripheral objects and the first object and the expected horizontal deflection angles of the peripheral objects.
  29. The apparatus according to claim 28, wherein the expected deflection angle obtaining unit is specifically configured to:
    for each peripheral object, calculate, according to the operating data of the peripheral object and the operating data of the first object, the angle between the direction from the center point of the AR device to the center point of the peripheral object and the installation direction of the camera of the AR device, and use the angle as the expected horizontal deflection angle of the peripheral object.
  30. The apparatus according to claim 29, wherein the expected deflection angle obtaining unit is specifically configured to calculate the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100019]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the peripheral object, (X_O, Y_O) is the position coordinate of the center point O of the AR device, ∠NOB is the heading angle of the first object, and O1O is the distance between the center point O1 of the GNSS device of the first object and the center point O of the AR device.
  31. The apparatus according to claim 29, wherein the expected deflection angle obtaining unit is specifically configured to calculate the angle according to the following formula:
    [formula provided as an image in the original publication: PCTCN2020135113-appb-100022]
    where ∠O2OA is the angle, ∠NOA is the horizontal installation angle of the camera relative to true north, (X_O1, Y_O1) is the position coordinate of the center point O1 of the GNSS device of the first object, and (X_O2, Y_O2) is the position coordinate of the center point O2 of the GNSS device of the peripheral object.
  32. The apparatus according to claim 28, wherein the expected deflection angle obtaining unit is specifically configured to: select, from the peripheral objects according to the operating data of the peripheral objects and the operating data of the first object, peripheral objects associated with the first object; and calculate the distances between the peripheral objects associated with the first object and the first object, and the expected horizontal deflection angles of the peripheral objects associated with the first object.
  33. An electronic device, comprising:
    a display; one or more processors; a memory; multiple applications; and one or more computer programs, where the one or more computer programs are stored in the memory, the one or more computer programs include instructions, and when the instructions are executed by the device, the device is caused to perform the following steps:
    obtaining at least one target object that poses a threat to a first object, and obtaining an expected horizontal deflection angle of the target object, where the expected horizontal deflection angle of the target object is a predicted value of the absolute horizontal deflection angle of the image of the target object in an augmented reality (AR) frame, and the AR frame is an AR frame displayed by an AR device of the first object;
    obtaining absolute horizontal deflection angles of object images in the AR frame, where the absolute horizontal deflection angle of an object image is the angle, in the horizontal direction, between the direction from the shooting point of the AR frame to the center point of the object image and the direction from the shooting point to the center point of the AR frame;
    obtaining, according to the expected horizontal deflection angle of the target object and the absolute horizontal deflection angles of the object images, the object image corresponding to the target object in the AR frame, where the difference between the expected horizontal deflection angle of the target object and the absolute horizontal deflection angle of the object image corresponding to the target object meets a first difference requirement; and
    performing early warning display on the object image corresponding to the target object in the AR frame.
  34. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and when the computer program is run on a computer, the computer is caused to perform the method according to any one of claims 1 to 16.
PCT/CN2020/135113 2020-02-20 2020-12-10 目标物体的预警方法、装置和电子设备 WO2021164387A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010105033.8 2020-02-20
CN202010105033.8A CN111323042B (zh) 2020-02-20 2020-02-20 目标物体的预警方法、装置和电子设备

Publications (1)

Publication Number Publication Date
WO2021164387A1 true WO2021164387A1 (zh) 2021-08-26

Family

ID=71167949

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/135113 WO2021164387A1 (zh) 2020-02-20 2020-12-10 目标物体的预警方法、装置和电子设备

Country Status (2)

Country Link
CN (1) CN111323042B (zh)
WO (1) WO2021164387A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111323042B (zh) * 2020-02-20 2023-10-10 华为技术有限公司 目标物体的预警方法、装置和电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012101318A1 (en) * 2011-01-28 2012-08-02 Wärtsilä Finland Oy An arrangement and a method for synchronizing a generator set to an electric network
WO2017113403A1 (zh) * 2015-12-31 2017-07-06 华为技术有限公司 一种影像信息处理方法及增强现实ar设备
CN107102736A (zh) * 2017-04-25 2017-08-29 上海唱风信息科技有限公司 实现增强现实的方法
CN108243332A (zh) * 2016-12-23 2018-07-03 深圳点石创新科技有限公司 车载抬头显示系统影像调节方法及车载抬头显示系统
CN109792543A (zh) * 2016-09-27 2019-05-21 深圳市大疆创新科技有限公司 根据可移动物捕获的图像数据创建视频抽象的方法和系统
CN111323042A (zh) * 2020-02-20 2020-06-23 华为技术有限公司 目标物体的预警方法、装置和电子设备

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004212232A (ja) * 2003-01-06 2004-07-29 Alpine Electronics Inc 風景動画表示ナビゲーション装置
CN101763640B (zh) * 2009-12-31 2011-10-19 无锡易斯科电子技术有限公司 车载多目摄像机环视系统的在线标定处理方法
KR102406489B1 (ko) * 2014-12-01 2022-06-10 현대자동차주식회사 전자 장치, 전자 장치의 제어 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체
CN105667496B (zh) * 2016-03-15 2018-04-24 江苏大学 一种汽车盘山公路防坠控制方法
US20200004269A1 (en) * 2017-02-09 2020-01-02 Sony Semiconductor Solutions Corporation Traveling assistance device, traveling assistance management device, methods of same devices, and traveling assistance system
KR20180123354A (ko) * 2017-05-08 2018-11-16 엘지전자 주식회사 차량용 사용자 인터페이스 장치 및 차량
CN110031010A (zh) * 2019-04-09 2019-07-19 百度在线网络技术(北京)有限公司 车辆引导路线绘制方法、装置及设备
CN110588510B (zh) * 2019-08-26 2021-09-07 华为技术有限公司 一种对本车的预警方法及装置


Also Published As

Publication number Publication date
CN111323042A (zh) 2020-06-23
CN111323042B (zh) 2023-10-10

Similar Documents

Publication Publication Date Title
WO2021213120A1 (zh) 投屏方法、装置和电子设备
WO2020238741A1 (zh) 图像处理方法、相关设备及计算机存储介质
WO2021258321A1 (zh) 一种图像获取方法以及装置
WO2020244623A1 (zh) 一种空鼠模式实现方法及相关设备
WO2021208723A1 (zh) 全屏显示方法、装置和电子设备
CN114119758B (zh) 获取车辆位姿的方法、电子设备和计算机可读存储介质
US20220262035A1 (en) Method, apparatus, and system for determining pose
WO2021023035A1 (zh) 一种镜头切换方法及装置
TWI818211B (zh) 眼部定位裝置、方法及3d顯示裝置、方法
WO2021180089A1 (zh) 界面切换方法、装置和电子设备
WO2021180085A1 (zh) 拾音方法、装置和电子设备
US20230005277A1 (en) Pose determining method and related device
US20220245778A1 (en) Image bloom processing method and apparatus, and storage medium
WO2021057626A1 (zh) 图像处理方法、装置、设备及计算机存储介质
US12027112B2 (en) Always on display method and mobile device
WO2021175266A1 (zh) 身份验证方法、装置和电子设备
WO2022022319A1 (zh) 一种图像处理方法、电子设备、图像处理系统及芯片系统
US20240224357A1 (en) Data Download Method, Apparatus, and Terminal Device
CN111030719A (zh) 车载装置和数据处理的方法
WO2021164387A1 (zh) 目标物体的预警方法、装置和电子设备
CN113468929A (zh) 运动状态识别方法、装置、电子设备和存储介质
CN116405758A (zh) 一种数据传输方法及电子设备
CN114079886B (zh) V2x报文发送方法、v2x通信设备及电子设备
EP4435720A1 (en) Image processing method and related device
CN116708317B (zh) 数据包mtu的调整方法、装置和终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20919822; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20919822; Country of ref document: EP; Kind code of ref document: A1)