WO2018119701A1 - Method and device for displaying a navigation interface - Google Patents

Method and device for displaying a navigation interface

Info

Publication number
WO2018119701A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual image
navigation
interface display
real
distance
Prior art date
Application number
PCT/CN2016/112464
Other languages
English (en)
Chinese (zh)
Inventor
王斌
王凤周
马世奎
邵煜璇
郭丽媛
廉士国
王恺
Original Assignee
深圳前海达闼云端智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳前海达闼云端智能科技有限公司 filed Critical 深圳前海达闼云端智能科技有限公司
Priority to CN201680013203.1A priority Critical patent/CN107466357B/zh
Priority to PCT/CN2016/112464 priority patent/WO2018119701A1/fr
Publication of WO2018119701A1 publication Critical patent/WO2018119701A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes

Definitions

  • the present application relates to the field of navigation, and in particular, to a navigation interface display method and apparatus.
  • In the process of intelligent navigation, intelligent terminals usually need to obtain a variety of environmental information for navigation decisions through various sensors. For example, a terminal located at the site acquires real-time images of the surrounding environment, the distance to nearby obstacles, the current position indicated by a GPS (Global Positioning System) signal, and the current position obtained by three-point positioning of base-station signals. This environmental information is then displayed on the navigation interface so that navigation decisions can be made based on it.
  • Real-time images of the surrounding environment are displayed on the navigation interface, and navigation decisions (e.g., steering, moving forward, stopping) can be made accordingly.
  • The inventors found, in the process of studying the prior art, that other environmental information, such as the distance to nearby obstacles and the current location, is also necessary for making more accurate navigation decisions.
  • If this environmental information is displayed directly on the navigation interface, it is not intuitive enough for making navigation decisions quickly. For example, if the distance to the nearest obstacle is displayed directly on the navigation interface as "5 m", the real-time image of the surrounding environment must still be consulted to judge the direction or location of that obstacle.
  • Embodiments of the present application provide a navigation interface display method and apparatus for visually displaying various environmental information on a navigation interface.
  • a navigation interface display method including:
  • acquiring a real-time image of the current location and corresponding auxiliary navigation information, where the auxiliary navigation information is measured from the surrounding environment of the current location; converting the auxiliary navigation information into a virtual image; and
  • the real-time image and the virtual image are displayed on a navigation interface, wherein the virtual image is superimposed on the real-time image.
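The three steps above can be sketched as a minimal pipeline. All class and helper names here are hypothetical stand-ins for the terminal's camera, its sensors, and the server-side renderer; the patent does not prescribe any particular API.

```python
class Camera:
    """Hypothetical stand-in for the terminal's camera."""
    def capture(self):
        return [[0] * 4 for _ in range(3)]  # placeholder 3x4 grayscale frame

class Sensors:
    """Hypothetical stand-in for the distance/GPS sensors."""
    def measure(self):
        return {"front_distance_m": 5.0, "gps": (22.54, 114.05)}

def render(aux_nav_info):
    # Step 2: convert numeric readings into a drawable overlay description.
    return [("marker", "front", aux_nav_info["front_distance_m"]),
            ("map_pin", aux_nav_info["gps"])]

def display(real_time_image, virtual_image):
    # Step 3: the virtual image is superimposed on the real-time image.
    return {"background": real_time_image, "overlay": virtual_image}

# Step 1: acquire the real-time image and the auxiliary navigation information.
frame = display(Camera().capture(), render(Sensors().measure()))
```

The point of the sketch is only the data flow: raw sensor values never reach the interface directly; they are first converted into a drawable layer and then composited over the live frame.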
  • a navigation interface display device including:
  • an acquiring unit configured to acquire a real-time image of the current location and corresponding auxiliary navigation information, where the auxiliary navigation information is measured from the surrounding environment of the current location;
  • a converting unit configured to convert the auxiliary navigation information into a virtual image
  • a display unit configured to display the real-time image and the virtual image on a navigation interface, wherein the virtual image is superimposed on the real-time image.
  • a computer storage medium for storing computer software instructions for use in a navigation interface display device, comprising program code designed to perform the navigation interface display method of the first aspect.
  • a computer program product that can be directly loaded into the internal memory of a computer and includes software code; when loaded and executed by the computer, the program implements the navigation interface display method according to the first aspect.
  • an electronic device comprising: a memory for storing computer-executable code, a processor for executing the code to perform the navigation interface display method according to the first aspect, and a communication interface for data transmission between the navigation interface display device and an external device.
  • According to the navigation interface display method and device provided above, the real-time image of the current location and the corresponding auxiliary navigation information are obtained, and the auxiliary navigation information is converted into a virtual image and superimposed on the real-time image for display, so that the auxiliary navigation information is displayed more intuitively on the navigation interface, making it easy to make navigation decisions quickly together with the real-time image.
  • FIG. 1 is a schematic structural diagram of a navigation interface display system according to an embodiment of the present disclosure
  • FIG. 2 is a schematic structural diagram of a server according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a navigation interface display method according to an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram in which the auxiliary navigation information provided by the embodiment of the present application is the distance to nearby obstacles;
  • FIG. 5 is a schematic diagram in which the auxiliary navigation information provided by the embodiment of the present application is position information;
  • FIG. 6 is a schematic diagram of displaying real-time images and virtual images on a navigation interface according to an embodiment of the present application
  • FIG. 7 is a schematic diagram of another real-time image and a virtual image displayed on a navigation interface according to an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a navigation interface display device according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of another navigation interface display device according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the embodiment of the present application provides a navigation interface display system, which is shown in FIG. 1 and includes: a server 1 and a corresponding display device 2 located in the cloud, and a terminal 3 located at the site.
  • the server 1 includes a navigation interface display device 11 for controlling content displayed in the display device 2.
  • the display device 2 may include a device having a display function such as a display, a tablet, or the like.
  • Terminal 3 can be a smart device that combines information collection and presentation (such as a mobile phone, smart glasses, or a smart helmet). It can collect target information and send it to the server 1 through a wired connection (for example, cable or network cable) or a wireless connection (for example, Wi-Fi or Bluetooth); the content is then displayed on the display device 2 under the control of the navigation interface display device 11 of the server.
  • Target information includes, but is not limited to, sound, image, distance, light intensity, three-dimensional data, and the like.
  • The server 1 provided by the embodiment of the present application, as shown in FIG. 2, includes a processor 1101, a memory 1102, and a communication interface 1103, which are connected by a bus.
  • the memory 1102 is configured to store code and data and the like for execution by the processor 1101, and the communication interface 1103 is configured to communicate with the terminal 3 in a wired or wireless manner.
  • Embodiments of the present application are particularly applicable to blind-guidance helmet products.
  • Environmental information such as images and distances is collected by the guide helmet located in the field, transmitted to the server located in the cloud, and displayed by the server on the navigation interface; a human customer-service operator then makes guidance decisions for the blind user according to the environmental information shown on the navigation interface.
  • the embodiment of the present application provides a navigation interface display method, as shown in FIG. 3, including:
  • The server acquires a real-time image of the current location and corresponding auxiliary navigation information, where the auxiliary navigation information is measured from the surrounding environment of the current location and is used to provide assistance when making a navigation decision for the terminal.
  • The auxiliary navigation information described in the embodiments of the present application may include: the distance to nearby obstacles measured by a distance sensor such as an ultrasonic radar or a laser radar; position information corresponding to a GPS signal measured by a GPS sensor; or position information obtained by three-point positioning of base-station signals received by a transceiver.
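The three-point positioning mentioned here can be illustrated by solving for the intersection of three distance circles (trilateration). This is a planar sketch with assumed base-station coordinates, not the algorithm any particular transceiver actually uses: subtracting the circle equations pairwise eliminates the quadratic terms and leaves a 2x2 linear system.

```python
def trilaterate(stations):
    """stations: three (x, y, distance) tuples; returns the (x, y) position."""
    (x1, y1, d1), (x2, y2, d2), (x3, y3, d3) = stations
    # Subtracting circle equations pairwise yields a linear system:
    # 2(x2-x1)x + 2(y2-y1)y = d1^2 - d2^2 + x2^2 - x1^2 + y2^2 - y1^2, etc.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a1 * b2 - a2 * b1  # non-zero when the stations are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Example: stations at (0,0), (10,0), (0,10) and a terminal at (3,4).
position = trilaterate([(0, 0, 5.0), (10, 0, 65 ** 0.5), (0, 10, 45 ** 0.5)])
```

In practice, measured distances are noisy, so real systems solve this in a least-squares sense over more than three stations; the closed form above is only the idealized case.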
  • The terminal may be a blind-guidance device.
  • the server converts the auxiliary navigation information into a virtual image.
  • When the auxiliary navigation information is the distance to nearby obstacles, a plurality of distance sensors is generally used, each detecting a different direction.
  • As shown in FIG. 4, the actual mounting position of each distance sensor can be represented by a different position marker 401 in the virtual image, and the measurement direction of each sensor is shown by a black arrow in the figure.
  • Different colors can be used to represent the distance value measured by the sensor in each direction: the first color indicates safety, meaning there is no obstacle nearby; the second color indicates a normal alarm, meaning there is an obstacle at a moderate distance; and the third color indicates a severe alarm, meaning there is an obstacle at close range or an obstacle approaching quickly.
  • In this way, the alarm information of the distance sensors is made more salient, so that the human operator can easily judge the dangerous direction and its urgency.
  • the first color is green
  • the second color is yellow
  • the third color is red.
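The green/yellow/red scheme amounts to a threshold function over the measured distance. The 2 m and 5 m cut-offs and the approach-speed check below are illustrative assumptions only; the application names the three alarm levels but gives no concrete values.

```python
GREEN, YELLOW, RED = "green", "yellow", "red"

def marker_color(distance_m, approach_speed_mps=0.0):
    """Map a sensor reading to the three-level alarm color (thresholds assumed)."""
    if distance_m < 2.0 or approach_speed_mps > 1.5:
        return RED     # severe alarm: obstacle at close range or closing fast
    if distance_m < 5.0:
        return YELLOW  # normal alarm: obstacle at a moderate distance
    return GREEN       # safe: no obstacle nearby

color = marker_color(1.2)  # a reading from one sensor direction
```

Each position marker 401 in the virtual image would be filled with the color returned for its sensor's latest reading.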
  • When the auxiliary navigation information is position information, as shown in FIG. 5, the current position 501 and the forward direction 502 corresponding to the position information may be displayed on a map in the virtual image.
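Placing the current position 501 on a rendered map requires projecting latitude/longitude into pixel coordinates. A common choice (an assumption here, since the application does not specify a projection) is the Web Mercator tiling formula used by slippy-map tile servers:

```python
import math

def latlon_to_pixel(lat_deg, lon_deg, zoom, tile_size=256):
    """Project a WGS84 lat/lon to global Web Mercator pixel coordinates."""
    n = tile_size * (2 ** zoom)          # world width in pixels at this zoom
    x = (lon_deg + 180.0) / 360.0 * n    # longitude maps linearly
    lat = math.radians(lat_deg)
    # Mercator compresses latitude via asinh(tan(lat)), clamped near the poles.
    y = (1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n
    return x, y

px = latlon_to_pixel(0.0, 0.0, zoom=0)  # center of the single zoom-0 tile
```

The forward direction 502 would then be drawn as an arrow rotated by the device's heading at that pixel position.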
  • the server displays the real-time image and the virtual image on the navigation interface, wherein the virtual image is superimposed on the real-time image.
  • the virtual image 601 representing the distance sensor may be superimposed on the real-time image 602.
  • The solid marker in the virtual image 601 indicates that the distance sensor in that direction has measured a nearby obstacle; combined with the real-time image 602, it can be seen that an object 6021 exists in that direction, so a navigation decision can be made accordingly.
  • The virtual image 701, representing the position information in the form of a map, may be superimposed on the real-time image 602.
  • In this way, the current location on the map can be seen at a glance, facilitating further navigation decisions.
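Superimposing the virtual image on the real-time image, as in FIG. 6 and FIG. 7, amounts to ordinary alpha compositing. A sketch assuming NumPy arrays, with an RGBA virtual layer over an RGB camera frame:

```python
import numpy as np

def superimpose(real_time_rgb, virtual_rgba):
    """Composite an RGBA virtual image over an RGB real-time image."""
    alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = virtual_rgba[..., :3] * alpha + real_time_rgb * (1.0 - alpha)
    return blended.astype(np.uint8)

frame = np.full((480, 640, 3), 100, dtype=np.uint8)   # mock camera frame
overlay = np.zeros((480, 640, 4), dtype=np.uint8)     # transparent layer
overlay[10:40, 10:40] = (255, 0, 0, 255)              # one opaque red marker
out = superimpose(frame, overlay)
```

Fully transparent overlay pixels (alpha 0) leave the camera frame untouched, so the operator keeps the live view everywhere except where markers or the map are drawn.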
  • The navigation interface display method provided by the embodiment of the present application obtains the real-time image of the current location and the corresponding auxiliary navigation information, converts the auxiliary navigation information into a virtual image, and superimposes it on the real-time image for display, so that the auxiliary navigation information is displayed more intuitively on the navigation interface, making it easy to make navigation decisions quickly together with the real-time image.
  • the embodiments of the present application may divide the functional modules of each device according to the foregoing method example.
  • each functional module may be divided according to each function, or two or more functions may be integrated into one processing module.
  • The above integrated module can be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiment of the present application is schematic and is only a logical functional division; an actual implementation may use another division.
  • FIG. 8 is a schematic diagram showing a possible structure of the navigation interface display device involved in the above embodiment.
  • The navigation interface display device 11 includes: an acquisition unit 1111, a conversion unit 1112, and a display unit 1113.
  • The obtaining unit 1111 is configured to support the navigation interface display device in performing the process S101 in FIG. 3; the converting unit 1112 is configured to support the process S102 in FIG. 3; and the display unit 1113 is configured to support the process S103 in FIG. 3.
  • The acquiring unit 1111 is configured to acquire a real-time image of the current location and corresponding auxiliary navigation information, where the auxiliary navigation information is measured from the surrounding environment of the current location; the converting unit 1112 is configured to convert the auxiliary navigation information into a virtual image; and the display unit 1113 is configured to display the real-time image and the virtual image on a navigation interface, with the virtual image superimposed on the real-time image. For all related content of the steps involved in the foregoing method embodiment, refer to the functional descriptions of the corresponding functional modules; details are not repeated here.
  • the auxiliary navigation information is distance information
  • the virtual image contains markers at different positions
  • the markers at different positions are used to identify the actual distribution position of the distance sensor
  • The markers at different positions represent, by different colors, the distance values measured by the distance sensors.
  • The first color of a marker indicates safety, meaning no obstacle nearby; the second color indicates a normal alarm, meaning an obstacle at a moderate distance; and the third color indicates a severe alarm, meaning an obstacle at close range or an obstacle moving quickly.
  • When the auxiliary navigation information is the current location, the virtual image displays the current location on a map.
  • The terminal may be a blind-guidance device.
  • FIG. 9 shows a possible structural diagram of the navigation interface display device involved in the above embodiment.
  • Navigation interface display device 11 includes a processing module 1122 and a communication module 1123.
  • the processing module 1122 is configured to control and manage the actions of the navigation interface display device.
  • the processing module 1122 is configured to support the navigation interface display device to perform the processes S101-S103 in FIG. 3.
  • The communication module 1123 is used to support communication between the navigation interface display device and other entities, such as the functional modules or network entities shown in the figure.
  • the navigation interface display device 11 may further include a storage module 1121 for storing program codes and data of the navigation interface display device.
  • The processing module 1122 may be a processor or a controller, for example, a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, capable of implementing or carrying out the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure.
  • the processor may also be a combination of computing functions, for example, including one or more microprocessor combinations, a combination of a DSP and a microprocessor, and the like.
  • the communication module 1123 can be a transceiver, a transceiver circuit, a communication interface, or the like.
  • the storage module 1121 can be a memory.
  • the navigation interface display device may be the electronic device shown in FIG.
  • the electronic device 1 includes a processor 1132, a transceiver 1133, a memory 1131, and a bus 1134.
  • the transceiver 1133, the processor 1132, and the memory 1131 are connected to each other through a bus 1134;
  • The bus 1134 may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like.
  • The bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
  • The steps of a method or algorithm described in connection with the present disclosure may be implemented in hardware or by a processor executing software instructions.
  • The embodiment of the present application also provides a storage medium, which may include the memory 1131, for storing computer software instructions including program code designed to execute the navigation interface display method described above.
  • The software instructions may be composed of corresponding software modules, and the software modules may be stored in a random access memory (RAM), a flash memory, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor to enable the processor to read information from, and write information to, the storage medium.
  • the storage medium can also be an integral part of the processor.
  • the processor and the storage medium can be located in an ASIC.
  • The ASIC can be located in a terminal.
  • the processor and the storage medium may also be present as discrete components.
  • The embodiment of the present application further provides a computer program, which can be directly loaded into the memory 1131 and contains software code; after being loaded and executed by the computer, the computer program implements the navigation interface display method described above.

Abstract

The present invention relates to a method and a device for displaying a navigation interface, which relate to the field of navigation and are used to visually display various types of environmental information on a navigation interface. The navigation interface display method comprises: acquiring a real-time image of a current position and corresponding auxiliary navigation information, the auxiliary navigation information being obtained by measuring the environment surrounding the current position; converting the auxiliary navigation information into a virtual image; and displaying, on a navigation interface, the real-time image and the virtual image, the virtual image being superimposed on the real-time image. The embodiments of the present invention are applied to a guidance helmet for blind persons.
PCT/CN2016/112464 2016-12-27 2016-12-27 Procédé et dispositif d'affichage d'interface de navigation WO2018119701A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201680013203.1A CN107466357B (zh) 2016-12-27 2016-12-27 导航界面显示方法和装置
PCT/CN2016/112464 WO2018119701A1 (fr) 2016-12-27 2016-12-27 Procédé et dispositif d'affichage d'interface de navigation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/112464 WO2018119701A1 (fr) 2016-12-27 2016-12-27 Procédé et dispositif d'affichage d'interface de navigation

Publications (1)

Publication Number Publication Date
WO2018119701A1 true WO2018119701A1 (fr) 2018-07-05

Family

ID=60545986

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/112464 WO2018119701A1 (fr) 2016-12-27 2016-12-27 Procédé et dispositif d'affichage d'interface de navigation

Country Status (2)

Country Link
CN (1) CN107466357B (fr)
WO (1) WO2018119701A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111125582A (zh) * 2019-12-16 2020-05-08 北京明略软件系统有限公司 信息图标显示方法、装置、电子设备及可读存储介质
CN112184919A (zh) * 2020-10-12 2021-01-05 中国联合网络通信集团有限公司 生成ar设备视野信息的方法、装置及存储介质

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114577233B (zh) * 2022-05-05 2022-07-29 腾讯科技(深圳)有限公司 一种车辆导航方法、装置及计算机设备、存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101976460A (zh) * 2010-10-18 2011-02-16 胡振程 车载多目摄像机环视系统的虚拟视点图像生成方法
US20120249794A1 (en) * 2011-03-31 2012-10-04 Fujitsu Ten Limited Image display system
CN103150759A (zh) * 2013-03-05 2013-06-12 腾讯科技(深圳)有限公司 一种对街景图像进行动态增强的方法和装置
CN103201731A (zh) * 2010-12-02 2013-07-10 英派尔科技开发有限公司 增强现实系统
CN104359487A (zh) * 2014-11-13 2015-02-18 沈阳美行科技有限公司 一种实景导航系统
CN105513389A (zh) * 2015-11-30 2016-04-20 小米科技有限责任公司 增强现实的方法及装置

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101976460A (zh) * 2010-10-18 2011-02-16 胡振程 车载多目摄像机环视系统的虚拟视点图像生成方法
CN103201731A (zh) * 2010-12-02 2013-07-10 英派尔科技开发有限公司 增强现实系统
US20120249794A1 (en) * 2011-03-31 2012-10-04 Fujitsu Ten Limited Image display system
CN102740056A (zh) * 2011-03-31 2012-10-17 富士通天株式会社 图像显示系统
CN103150759A (zh) * 2013-03-05 2013-06-12 腾讯科技(深圳)有限公司 一种对街景图像进行动态增强的方法和装置
CN104359487A (zh) * 2014-11-13 2015-02-18 沈阳美行科技有限公司 一种实景导航系统
CN105513389A (zh) * 2015-11-30 2016-04-20 小米科技有限责任公司 增强现实的方法及装置

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111125582A (zh) * 2019-12-16 2020-05-08 北京明略软件系统有限公司 信息图标显示方法、装置、电子设备及可读存储介质
CN112184919A (zh) * 2020-10-12 2021-01-05 中国联合网络通信集团有限公司 生成ar设备视野信息的方法、装置及存储介质
CN112184919B (zh) * 2020-10-12 2023-12-01 中国联合网络通信集团有限公司 生成ar设备视野信息的方法、装置及存储介质

Also Published As

Publication number Publication date
CN107466357A (zh) 2017-12-12
CN107466357B (zh) 2019-12-03

Similar Documents

Publication Publication Date Title
KR101622028B1 (ko) 차량 통신을 이용한 차량 제어 장치 및 제어 방법
KR102348127B1 (ko) 전자 장치 및 그의 제어 방법
CN107223200B (zh) 一种导航方法、装置及终端设备
US20230048230A1 (en) Method for displaying lane information and apparatus for executing the method
EP3871936B1 (fr) Procédé d'avertissement de distance de sécurité pour stationnement automatique et terminal embarqué
US20200180656A1 (en) Method, apparatus, electronic device, computer program and computer readable recording medium for measuring inter-vehicle distance using driving image
KR102383425B1 (ko) 전자 장치, 전자 장치의 제어 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체
KR20150140449A (ko) 전자 장치, 전자 장치의 제어 방법 및 컴퓨터 판독 가능한 기록 매체
KR102406491B1 (ko) 전자 장치, 전자 장치의 제어 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체
EP3715162B1 (fr) Dispositif et procédé d'affichage de contenu
WO2018119701A1 (fr) Procédé et dispositif d'affichage d'interface de navigation
KR102233391B1 (ko) 전자 장치, 전자 장치의 제어 방법 및 컴퓨터 판독 가능한 기록 매체
CN115427832A (zh) 自动驾驶车辆的激光雷达和图像校准
CN110962744A (zh) 车辆盲区检测方法和车辆盲区检测系统
KR20170007146A (ko) 반도체 장치, 제어 시스템 및 관측 방법
US20200166346A1 (en) Method and Apparatus for Constructing an Environment Model
KR102406490B1 (ko) 전자 장치, 전자 장치의 제어 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체
KR101601644B1 (ko) 타워 크레인 충돌 방지 시스템
KR20200084955A (ko) 차량 및 그 제어방법
KR20200070101A (ko) 차선 표시 방법 및 이를 수행하는 전자 기기
KR102299501B1 (ko) 전자 장치, 전자 장치의 제어 방법 및 컴퓨터 판독 가능한 기록 매체
KR102371620B1 (ko) 전자 장치, 전자 장치의 제어 방법 및 컴퓨터 판독 가능한 기록 매체
JP2014170378A (ja) 情報処理センターおよび車載機
CN116046003A (zh) 路径规划方法、装置、车辆以及存储介质
CN114511840A (zh) 自动驾驶的感知数据处理方法、装置及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16925163

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16/10/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16925163

Country of ref document: EP

Kind code of ref document: A1