CN105301585A - Information display method and device - Google Patents


Info

Publication number
CN105301585A
CN105301585A
Authority
CN
China
Prior art keywords
data
radar
view data
information
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510736035.6A
Other languages
Chinese (zh)
Other versions
CN105301585B (en)
Inventor
鲍协浩
万钰臻
杨万坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Technology Co Ltd
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc filed Critical Xiaomi Inc
Priority to CN201510736035.6A
Publication of CN105301585A
Application granted
Publication of CN105301585B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an information display method and device, belonging to the technical field of near-field recognition. The method is used in a terminal that includes a radar component and a camera component, and includes the steps of: obtaining radar data corresponding to each object in a designated area, the radar data being obtained by the radar component detecting the designated area; obtaining image data corresponding to each object, the image data being obtained by the camera component shooting the designated area; associating the radar data and image data that correspond to the same object into one group of data; and displaying each associated group of radar data and image data in aggregated form. By aggregating the radar data and image data of the same object into a single display, the method lets a user intuitively perceive both an object's image and its position relative to the terminal, thereby expanding the ways in which near-field recognition results can be presented and improving the display effect and user experience.

Description

Information display method and device
Technical field
The disclosure relates to the field of near-field recognition technology, and in particular to an information display method and device.
Background technology
Recognition technology that combines radar with a camera is a kind of near-field recognition technology and is widely applied in parking assistance systems.
In the related art, a parking assistance system generally includes a reversing radar and a rear-view camera mounted at the rear of a car. When the car reverses, the image captured by the rear-view camera is shown on a display screen inside the car, while the reversing radar scans for objects within a certain angle behind the car. When the reversing radar detects an obstacle behind the car and the distance between the obstacle and the car falls below a certain threshold, the system alerts the user to reversing safety by voice or buzzer.
Summary of the invention
The present disclosure provides an information display method and device. The technical scheme is as follows:
According to a first aspect of the present disclosure, an information display method is provided for use in a terminal that includes a radar component and a camera component. The method includes:
obtaining radar data corresponding to each object in a designated area, the radar data being obtained by the radar component detecting the designated area;
obtaining image data corresponding to each object, the image data being obtained by the camera component shooting the designated area;
associating the radar data and image data that correspond to the same object into one group of data; and
displaying each associated group of radar data and image data in aggregated form.
Optionally, the radar data and the image data each include positional information of the corresponding object relative to the terminal, and associating the radar data and image data that correspond to the same object into one group of data includes:
associating a group of radar data and image data whose positional information relative to the terminal is identical into one group of data.
Optionally, the radar data includes the shape and size of the corresponding object and its positional information relative to the terminal, and the image data includes an image of the corresponding object; displaying each associated group of radar data and image data in aggregated form includes:
displaying, in a visualization interface, an outline figure of the corresponding object according to its shape, size, and positional information relative to the terminal; and
displaying, according to the associated radar data and image data, the image of the corresponding object alongside its outline figure.
Optionally, the radar data includes positional information of the corresponding object relative to the terminal, and the image data includes an image of the corresponding object; displaying each associated group of radar data and image data in aggregated form includes:
performing image recognition on the image of the corresponding object to determine the object's name information; and
presenting, according to the associated radar data and image data, the object's positional information relative to the terminal together with its name information in a single speech message.
Optionally, the image data includes an image of the corresponding object, and the method further includes:
upon receiving an instruction to query detail information of at least one of the objects, performing image recognition on the image corresponding to the at least one object;
querying the network side for detail information of the at least one object according to the result of the image recognition; and
displaying the detail information of the at least one object.
Optionally, at least one of the radar data and the image data includes positional information of the corresponding object relative to the terminal, the image data includes an image of the corresponding object, and the method further includes:
recognizing the corresponding object according to its image;
determining a hidden object associated in advance with the corresponding object;
obtaining hidden information corresponding to the hidden object, the hidden information including the hidden object's positional information relative to the corresponding object and the hidden object's name information;
determining the hidden object's positional information relative to the terminal from the corresponding object's positional information relative to the terminal and the hidden object's positional information relative to the corresponding object; and
presenting, in a single message, the hidden object's positional information relative to the terminal together with its name information.
Optionally, the method further includes:
determining an unassociated object, the unassociated object being an object in the designated area that is detected or photographed by only one of the radar component and the camera component; and
displaying information of the unassociated object;
where, when the unassociated object is detected by the radar component, its information includes the radar data of the unassociated object, and when the unassociated object is photographed by the camera component, its information includes the image data of the unassociated object.
According to a second aspect of the present disclosure, an information display device is provided for use in a terminal that includes a radar component and a camera component. The device includes:
a radar data acquisition module, configured to obtain radar data corresponding to each object in a designated area, the radar data being obtained by the radar component detecting the designated area;
an image data acquisition module, configured to obtain image data corresponding to each object, the image data being obtained by the camera component shooting the designated area;
an association module, configured to associate the radar data and image data that correspond to the same object into one group of data; and
a first display module, configured to display each associated group of radar data and image data in aggregated form.
Optionally, the association module is configured to associate a group of radar data and image data whose positional information relative to the terminal is identical into one group of data;
where the radar data and the image data each include positional information of the corresponding object relative to the terminal.
Optionally, the first display module includes:
a first display submodule, configured to display, in a visualization interface, an outline figure of the corresponding object according to its shape, size, and positional information relative to the terminal; and
a second display submodule, configured to display, according to the associated radar data and image data, the image of the corresponding object alongside its outline figure;
where the radar data includes the shape and size of the corresponding object and its positional information relative to the terminal, and the image data includes an image of the corresponding object.
Optionally, the first display module includes:
a recognition submodule, configured to perform image recognition on the image of the corresponding object to determine the object's name information; and
a third display submodule, configured to present, according to the associated radar data and image data, the object's positional information relative to the terminal together with its name information in a single speech message;
where the radar data includes positional information of the corresponding object relative to the terminal, and the image data includes an image of the corresponding object.
Optionally, the device further includes:
a first recognition module, configured to, upon receiving an instruction to query detail information of at least one of the objects, perform image recognition on the image corresponding to the at least one object;
a query module, configured to query the network side for detail information of the at least one object according to the result of the image recognition; and
a second display module, configured to display the detail information of the at least one object;
where the image data includes an image of the corresponding object.
Optionally, the device further includes:
a second recognition module, configured to recognize the corresponding object according to its image;
a hidden object determination module, configured to determine a hidden object associated in advance with the corresponding object;
an information obtaining module, configured to obtain hidden information corresponding to the hidden object, the hidden information including the hidden object's positional information relative to the corresponding object and the hidden object's name information;
a position determination module, configured to determine the hidden object's positional information relative to the terminal from the corresponding object's positional information relative to the terminal and the hidden object's positional information relative to the corresponding object; and
a third display module, configured to present, in a single message, the hidden object's positional information relative to the terminal together with its name information;
where at least one of the radar data and the image data includes positional information of the corresponding object relative to the terminal, and the image data includes an image of the corresponding object.
Optionally, the device further includes:
an unassociated object determination module, configured to determine an unassociated object, the unassociated object being an object in the designated area that is detected or photographed by only one of the radar component and the camera component; and
a fourth display module, configured to display information of the unassociated object;
where, when the unassociated object is detected by the radar component, its information includes the radar data of the unassociated object, and when the unassociated object is photographed by the camera component, its information includes the image data of the unassociated object.
According to a third aspect of the present disclosure, an information display device is provided for use in a terminal that includes a radar component and a camera component. The device includes:
a processor; and
a memory for storing processor-executable instructions;
where the processor is configured to:
obtain radar data corresponding to each object in a designated area, the radar data being obtained by the radar component detecting the designated area;
obtain image data corresponding to each object, the image data being obtained by the camera component shooting the designated area;
associate the radar data and image data that correspond to the same object into one group of data; and
display each associated group of radar data and image data in aggregated form.
The technical scheme provided by the embodiments of the present disclosure may include the following beneficial effects:
Radar data corresponding to each object in a designated area is obtained from the radar component's detection of the area; image data corresponding to each object is obtained from the camera component's shooting of the area; the radar data and image data that correspond to the same object are associated into one group of data; and each associated group is displayed in aggregated form. Aggregating the radar data and image data of the same object into a single display lets a user intuitively perceive both an object's image and its position relative to the terminal, thereby expanding the ways in which near-field recognition results can be presented and improving the display effect and user experience.
It should be understood that the above general description and the following detailed description are exemplary only and do not limit the disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated into and form part of this specification, illustrate embodiments consistent with the invention and, together with the specification, serve to explain the principles of the invention.
Fig. 1 is a flowchart of an information display method according to an exemplary embodiment;
Fig. 2A is a flowchart of an information display method according to an exemplary embodiment;
Fig. 2B is a schematic diagram of an aggregated display according to the embodiment shown in Fig. 2A;
Fig. 3A is a flowchart of an information display method according to an exemplary embodiment;
Fig. 3B is an application-scenario diagram of a blind-assistance terminal according to the embodiment shown in Fig. 3A;
Fig. 4 is a block diagram of an information display device according to an exemplary embodiment;
Fig. 5 is a block diagram of an information display device according to an exemplary embodiment;
Fig. 6 is a block diagram of a device according to an exemplary embodiment.
Detailed description
Exemplary embodiments will now be explained in detail, examples of which are shown in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the invention; rather, they are merely examples of apparatuses and methods consistent with some aspects of the invention as detailed in the appended claims.
Fig. 1 is a flowchart of an information display method according to an exemplary embodiment. The method is for use in a terminal that includes a radar component and a camera component. As shown in Fig. 1, the method may include the following steps.
In step 101, radar data corresponding to each object in a designated area is obtained, the radar data being obtained by the radar component detecting the designated area.
In step 102, image data corresponding to each object is obtained, the image data being obtained by the camera component shooting the designated area.
In step 103, the radar data and image data that correspond to the same object are associated into one group of data.
In step 104, each associated group of radar data and image data is displayed in aggregated form.
Optionally, the radar data and the image data each include positional information of the corresponding object relative to the terminal, and associating the radar data and image data that correspond to the same object into one group of data includes:
associating a group of radar data and image data whose positional information relative to the terminal is identical into one group of data.
Optionally, the radar data includes the shape and size of the corresponding object and its positional information relative to the terminal, and the image data includes an image of the corresponding object; displaying each associated group of radar data and image data in aggregated form includes:
for each group of radar data and image data, displaying, in a visualization interface, an outline figure of the object according to its shape, size, and positional information relative to the terminal; and
displaying the image of the object alongside its outline figure.
Optionally, the radar data includes positional information of the corresponding object relative to the terminal, and the image data includes an image of the corresponding object; displaying each associated group of radar data and image data in aggregated form includes:
for each group of radar data and image data, performing image recognition on the image of the object to determine the object's name information; and
presenting the object's positional information relative to the terminal together with its name information in a single speech message.
Optionally, the image data includes an image of the corresponding object, and the method further includes:
upon receiving an instruction to query detail information of at least one of the objects, performing image recognition on the image corresponding to the at least one object;
querying the network side for detail information of the at least one object according to the result of the image recognition; and
displaying the detail information of the at least one object.
Optionally, at least one of the radar data and the image data includes positional information of the corresponding object relative to the terminal, the image data includes an image of the corresponding object, and the method further includes:
recognizing the corresponding object according to its image;
determining a hidden object associated in advance with the corresponding object;
obtaining hidden information corresponding to the hidden object, the hidden information including the hidden object's positional information relative to the corresponding object and the hidden object's name information;
determining the hidden object's positional information relative to the terminal from the corresponding object's positional information relative to the terminal and the hidden object's positional information relative to the corresponding object; and
presenting, in a single message, the hidden object's positional information relative to the terminal together with its name information.
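The hidden-object step above amounts to simple vector addition of relative positions. A minimal sketch, assuming positions are expressed as car-centred (x, y) coordinates in metres (a representation the patent does not mandate):

```python
def hidden_object_position(visible_rel_terminal, hidden_rel_visible):
    """Position of a hidden object relative to the terminal: the recognised
    visible object's position relative to the terminal plus the hidden
    object's pre-stored offset relative to that visible object.
    Both arguments are (x, y) tuples in a car-centred frame, in metres."""
    return (visible_rel_terminal[0] + hidden_rel_visible[0],
            visible_rel_terminal[1] + hidden_rel_visible[1])
```

For instance, if a recognised signpost sits at (3, 4) relative to the terminal and its associated hidden object is stored as being at (1, -1) relative to the signpost, the hidden object is at (4, 3) relative to the terminal.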
Optionally, the method further includes:
determining an unassociated object, the unassociated object being an object in the designated area that is detected or photographed by only one of the radar component and the camera component; and
displaying information of the unassociated object;
where, when the unassociated object is detected by the radar component, its information includes the radar data of the unassociated object, and when the unassociated object is photographed by the camera component, its information includes the image data of the unassociated object.
In summary, in the information display method shown in this embodiment of the disclosure, radar data corresponding to each object in a designated area is obtained from the radar component's detection of the area; image data corresponding to each object is obtained from the camera component's shooting of the area; the radar data and image data that correspond to the same object are associated into one group of data; and each associated group is displayed in aggregated form. Aggregating the radar data and image data of the same object into a single display lets a user intuitively perceive both an object's image and its position relative to the terminal, thereby expanding the ways in which near-field recognition results can be presented and improving the display effect and user experience.
Fig. 2A is a flowchart of an information display method according to an exemplary embodiment. The method may be used in a terminal that includes a radar component and a camera component. In one embodiment, the terminal may be a near-field recognition terminal built into a car, such as a parking assistance system. As shown in Fig. 2A, the method may include the following steps.
In step 201, radar data corresponding to each object in a designated area is obtained, the radar data being obtained by the radar component detecting the designated area.
The radar component is fixedly installed in the terminal, so its scanning direction is constant relative to the terminal. The radar data obtained by the radar component's scanning includes, for each object in the designated area, the object's shape, size, positional information relative to the terminal, and so on.
For example, when the radar component is applied in an in-car near-field recognition terminal, a coordinate system may be established along the direction of the car's head, with the center of the car as the origin; each object's positional information relative to the terminal can then be the object's coordinates in this coordinate system. When the radar component scans, it first obtains an object's angle and distance relative to the radar component, converts these into the object's angle and distance relative to the car's center, and further calculates the object's coordinates in the coordinate system whose origin is the car's center.
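The coordinate conversion described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the frame conventions (x to the right, y toward the front of the car) and the radar mounting offset are assumptions for the example.

```python
import math

def radar_to_car_coords(obj_angle_deg, obj_dist, radar_offset=(0.0, -2.0)):
    """Convert an object's bearing/distance as seen by the radar component
    into coordinates in a car-centred frame (x right, y forward, metres).
    radar_offset is the radar's hypothetical mounting point in that frame,
    e.g. 2 m behind the car's centre."""
    rad = math.radians(obj_angle_deg)
    # Position relative to the radar component itself.
    x_r = obj_dist * math.sin(rad)
    y_r = obj_dist * math.cos(rad)
    # Shift by the radar's fixed mounting offset to reach the car's centre.
    return (x_r + radar_offset[0], y_r + radar_offset[1])

def car_coords_to_polar(x, y):
    """Angle (degrees from the forward axis) and distance of a car-centred
    point, i.e. the object's position relative to the terminal."""
    return (math.degrees(math.atan2(x, y)), math.hypot(x, y))
```

An object seen by the rear-mounted radar at bearing 0 degrees and 5 m thus lands at (0, 3) in car-centred coordinates, i.e. 3 m directly ahead of the car's centre.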
In step 202, image data corresponding to each object is obtained, the image data being obtained by the camera component shooting the designated area.
In one embodiment, when shooting an object's image, the camera component can also measure the distance between itself and the object by focusing, and record the shooting angle.
For example, when the camera component is applied in an in-car near-field recognition terminal, the camera component is, like the radar component, fixedly installed in the terminal. From the distance to each object obtained by focusing and the shooting angle, it can calculate the object's angle and distance relative to the car's center, and further calculate the object's coordinates in the coordinate system whose origin is the car's center.
In one embodiment, for each object photographed, the camera component can segment out and store the image corresponding to that object.
In step 203, a group of radar data and image data whose positional information relative to the terminal is identical is associated into one group of data.
According to one embodiment of the disclosure, both the radar component scanning an object and the camera component shooting it yield the object's positional information (angle and distance) relative to the terminal. The terminal can therefore associate a group of radar data and image data with identical positional information relative to the terminal as the radar data and image data of the same object.
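The association step can be sketched as a nearest-position match. In practice the two sensors' position estimates will not be bit-identical, so this sketch pairs detections whose positions agree within a tolerance; the tolerance, field names, and greedy matching strategy are illustrative assumptions, not taken from the patent.

```python
import math

def associate(radar_items, image_items, tol=0.5):
    """Pair radar and camera detections whose car-centred positions agree
    within `tol` metres into one group of data per object. Each item is a
    dict with a 'pos' key holding (x, y) in metres plus sensor-specific
    fields. Unpaired detections are kept as unassociated objects."""
    groups, remaining = [], list(image_items)
    for r in radar_items:
        best, best_d = None, tol
        for img in remaining:
            d = math.dist(r["pos"], img["pos"])
            if d <= best_d:
                best, best_d = img, d
        if best is not None:
            remaining.remove(best)
            groups.append({**r, **best})        # one associated group of data
        else:
            groups.append(dict(r, image=None))  # radar-only object
    # Camera-only detections are also kept, as unassociated objects.
    groups += [dict(img, radar=None) for img in remaining]
    return groups
```

A radar return at (0, 3) and a segmented image at (0.1, 3.0) would thus be merged into a single record carrying both the radar fields and the image.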
In step 204, an outline figure of the corresponding object is displayed in a visualization interface according to the object's shape, size, and positional information relative to the terminal.
In step 205, according to the associated radar data and image data, the image of the corresponding object is displayed alongside its outline figure.
When displaying radar data and image data for the same object, the outline figure of the object can first be shown in the visualization interface. The center of the visualization interface can represent the center of the terminal; the object's outline figure is drawn at the corresponding position in the interface according to the object's positional information relative to the terminal, and the object's image is displayed alongside the outline figure.
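The placement rule above, with the interface centre standing for the terminal, reduces to a world-to-screen mapping. A minimal sketch; the resolution and metres-per-pixel scale are assumptions for illustration:

```python
def world_to_screen(pos, screen_size=(800, 600), metres_per_px=0.1):
    """Map a car-centred position (x right, y forward, metres) to pixel
    coordinates in the visualization interface. The screen centre represents
    the terminal (the car); screen y grows downward, so 'forward' maps to a
    smaller y value."""
    cx, cy = screen_size[0] // 2, screen_size[1] // 2
    return (round(cx + pos[0] / metres_per_px),
            round(cy - pos[1] / metres_per_px))
```

The terminal itself lands at the screen centre (400, 300), and an object 2 m ahead and 1 m to the right lands above and to the right of it, where its outline figure and image would be drawn.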
In step 206, upon receiving an instruction to query detail information of at least one of the objects, image recognition is performed on the image corresponding to the at least one object.
In step 207, detail information of the at least one object is queried from the network side according to the result of the image recognition, and the detail information of the at least one object is displayed.
When a user wants to check the detail information of some object, the user can select the object's outline figure or image in the visualization interface. The terminal then performs image recognition on the object's image to obtain the object's type information, name information, related text, and so on, searches the network side for the recognized information, and shows the search results in the visualization interface or presents them by voice or other means.
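The query flow in steps 206-207 can be sketched as a two-stage pipeline. The patent names no concrete recognition or search service, so both are modelled here as caller-supplied callables; everything in this sketch is a hypothetical stand-in:

```python
def query_details(object_image, recognise, search):
    """Detail-information lookup: run image recognition on the selected
    object's image, then query the network side with the recognised label.
    `recognise` and `search` are placeholders for services the patent does
    not specify."""
    label = recognise(object_image)          # e.g. name/type of the object
    return {"label": label, "details": search(label)}
```

A caller would plug in its own recogniser and network query, e.g. `query_details(img, my_recogniser, my_search)`, and display the returned details in the interface or by voice.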
Optionally, the near-field recognition terminal can also monitor the change in distance between each object and the terminal; when the distance between one or more of the objects and the terminal falls below a preset distance threshold, it issues a sound-and-light reminder to alert the user to take evasive action.
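The distance-monitoring check above is a simple threshold filter over the tracked objects. A minimal sketch; the object names and the 1.5 m default threshold are illustrative, not values from the patent:

```python
def proximity_alerts(tracked, threshold=1.5):
    """Return the names of tracked objects whose latest distance to the
    terminal has dropped below the preset threshold, so the caller can
    trigger the sound-and-light reminder. `tracked` holds (name, distance)
    pairs with distance in metres."""
    return [name for name, distance in tracked if distance < threshold]
```

In a real system this would run on every sensor update, firing the reminder whenever the returned list is non-empty.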
Taking as an example the application of the method of this embodiment in a car, where the terminal is the car's near-field recognition terminal: the near-field recognition terminal scans, through radar components installed around the car, objects within 360 degrees around the car and within a radius of 50 meters, obtaining radar data such as each object's shape, size, and positional information relative to the car. In addition, the near-field recognition terminal shoots images within 360 degrees around the car through camera components installed around the car, and segments out the images of the photographed objects. When shooting, it can focus on each object to obtain the distance between the object and the terminal, combine this with the shooting angle to obtain each object's positional information relative to the car, and store each object's image and positional information relative to the car as image data. The near-field recognition terminal associates a group of radar data and image data whose positional information relative to the car is identical, and displays the associated radar data and image data in aggregated form on the car's central control display screen.
For example, please refer to Fig. 2B, which shows a schematic diagram of aggregated display provided by an embodiment of the present disclosure. As shown in Fig. 2B, the center of the automobile's center-console display screen shows a schematic figure 20 of the automobile, and the near-field identification terminal of the automobile has scanned and photographed three objects, numbered 21, 22, and 23. Around the schematic figure 20 of the automobile, the contour patterns and images of these three objects are displayed according to the relative positions between the three scanned and photographed objects and the automobile. Taking object 21 as an example, the contour pattern 21a of object 21 is displayed according to the scanned shape of object 21, and the image 21b of object 21 is displayed in the form of a label frame whose arrow points to the contour pattern 21a.
When the user wants to check the detailed information of object 21, the user can click the contour pattern 21a or the image 21b. The near-field identification terminal performs image recognition on image 21b and searches the network side for related information according to the recognition result. For example, the near-field identification terminal recognizes that image 21b is a billboard and recognizes the advertisement text in it; the terminal then searches the network side for information related to the advertisement text, such as the price of the goods or services corresponding to the advertisement text, and the distance, position, and navigation information of the merchant.
Optionally, in the disclosed embodiments, the terminal can also determine an unassociated object and display the information of the unassociated object, where the unassociated object is an object in the designated area that is detected or photographed by only one of the radar component and the camera assembly. When the unassociated object is an object detected by the radar component, the information of the unassociated object is the radar data of the unassociated object; when the unassociated object is an object photographed by the camera assembly, the information of the unassociated object is the image data of the unassociated object.
In some scenarios, because the radar component and the camera assembly differ in detection angle or detection principle, some objects are detected or photographed by only one of the two components. For example, because of the shooting angle, an object may not be detected by the radar component but may still be photographed by the camera assembly; or, because it is blocked by an object in front, an object may not be photographed by the camera assembly but can still be detected by the radar component. In such cases the object has only radar data or only image data, and no association can be made; in the disclosed embodiments, such an object is called an unassociated object. For an unassociated object, the terminal can still display its related information: when the object is detected only by the radar component, the terminal displays only its radar data, and when the object is photographed only by the camera assembly, the terminal displays only its image data. The present disclosure is not limited thereto, however. When the object is detected only by the radar component, besides displaying the object's information in the radar data, the terminal can also indicate the object in the image data according to the relative position between the object and the terminal and other related information of the object. Conversely, when the object is detected only by the camera assembly, besides displaying the object's information in the image data, the terminal can also indicate the object in the radar data according to information such as the relative position between the object and the terminal and the image.
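The association step and the detection of unassociated objects can be sketched together. This is an illustrative interpretation only: the function names, the use of 2-D coordinates, and the position tolerance are assumptions not specified by the present disclosure, which only requires that position information be "identical".

```python
# Sketch: pair radar and image detections whose positions relative to the
# terminal coincide (within a tolerance), and collect the leftovers on each
# side as "unassociated objects". All names and the tolerance are assumptions.
import math

def associate(radar_detections, image_detections, tol_m=0.5):
    """radar_detections / image_detections: dicts mapping an id to an
    (x, y) position relative to the terminal.
    Returns (pairs, radar_only, image_only)."""
    pairs = []
    unmatched_images = dict(image_detections)
    for r_id, r_pos in radar_detections.items():
        match = None
        for i_id, i_pos in unmatched_images.items():
            if math.dist(r_pos, i_pos) <= tol_m:
                match = i_id
                break
        if match is not None:
            pairs.append((r_id, match))
            del unmatched_images[match]
    matched_radar = [p[0] for p in pairs]
    radar_only = [r for r in radar_detections if r not in matched_radar]
    image_only = list(unmatched_images)
    return pairs, radar_only, image_only

radar = {"r1": (3.0, 4.0), "r2": (-2.0, 1.0)}   # radar-only object r2
images = {"i1": (3.1, 4.0), "i2": (8.0, 8.0)}   # image-only object i2
pairs, radar_only, image_only = associate(radar, images)
```

In this sketch, each pair would be displayed aggregated (contour plus image), while `radar_only` and `image_only` entries would be displayed from their single data source, as described above.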
In summary, in the information display method shown in the embodiments of the present disclosure, radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area, is acquired; image data corresponding to each object, obtained by the camera assembly photographing the designated area, is acquired; the radar data and image data corresponding to the same object are associated as the same group of data; and each group of associated radar data and image data is displayed in an aggregated manner. Because the radar data and image data of the same object are aggregated for display, the user can intuitively perceive both the image of an object and the object's position relative to the terminal, which expands the presentation of near-field recognition results and improves the display effect and the user experience.
Fig. 3A is a flow chart of an information display method according to another exemplary embodiment. This information display method is used in a terminal comprising a radar component and a camera assembly. In one embodiment, the terminal can be a portable near-field identification terminal, such as a blind-assistance terminal. As shown in Fig. 3A, the information display method can comprise steps 301 to 305.
In step 301, radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area, is acquired.
In one embodiment, the radar component is fixedly arranged in the terminal, and its scanning direction is constant relative to the terminal; the present disclosure is not limited thereto, however, and the radar component can also be fixedly arranged at other positions. The radar data obtained by the scanning of the radar component comprises the shape, size, and position information relative to the terminal of each object in the designated area.
For example, when the method is applied in a blind-assistance terminal, the user can hold the terminal in a specified posture, and a coordinate system can be established according to the orientation of the user, with the user as the origin; the position information of each object relative to the terminal can then be the coordinates of the object in this coordinate system. When the radar component scans, it first obtains the angle and distance of an object relative to the radar component, calculates from these the angle and distance of the object relative to the user, and further calculates the coordinates of the object in the coordinate system with the user as the origin.
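The conversion from an (angle, distance) radar reading to coordinates in the user-centred system can be sketched as below. The axis convention (y pointing in the user's facing direction, bearing measured clockwise from it) is an illustrative assumption, and the sketch assumes the radar component and the user coincide, omitting the radar-to-user offset described above.

```python
# Sketch: convert a radar reading (bearing, range) into coordinates in the
# coordinate system with the user as origin. Axis convention is an assumption.
import math

def to_user_coordinates(angle_deg, distance_m):
    """angle_deg: bearing of the object, clockwise from the user's facing
    direction; distance_m: range to the object. Returns (x, y) with x to
    the user's right and y straight ahead."""
    angle = math.radians(angle_deg)
    x = distance_m * math.sin(angle)   # lateral offset (right positive)
    y = distance_m * math.cos(angle)   # forward offset
    return x, y

x, y = to_user_coordinates(90.0, 5.0)  # an object 5 m directly to the right
```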
In step 302, image data corresponding to each object, obtained by the camera assembly photographing the designated area, is acquired.
In one embodiment, when photographing the image of an object, the camera assembly can measure the distance between the camera assembly and the object by focusing, and record the shooting angle.
Similarly to the radar component, the camera assembly is fixedly arranged in the terminal. From the distance between each object and the camera assembly obtained by focusing, together with the shooting angle, the camera assembly can calculate the angle and distance of the object relative to the user, and further calculate the coordinates of the object in the coordinate system with the user as the origin.
For example, for each photographed object, the camera assembly can segment out and store the image corresponding to the object.
In step 303, a group of radar data and image data whose position information relative to the terminal is identical is associated as the same group of data.
In step 304, image recognition is performed according to the image of the corresponding object to determine the name information of the corresponding object.
In step 305, according to the associated radar data and image data, the position information of the corresponding object relative to the terminal and the name information of the corresponding object are presented in the same speech message.
Because a blind or visually impaired user cannot see a visual image clearly, in order to better present near-field identification information to such a user, the position and name of each object scanned or photographed by the radar component and the camera assembly can be presented to the user in one speech message, so that the user knows which objects are nearby.
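Composing the single speech message from the recognized names and positions can be sketched as follows. The message wording and function name are assumptions for illustration; a real terminal would pass the resulting string to a text-to-speech engine.

```python
# Sketch: build one speech message announcing each object's name and its
# position relative to the terminal, as described above. Wording is an
# illustrative assumption.

def compose_speech_message(objects):
    """objects: list of (name, position_description) tuples."""
    parts = [f"{name}, {position}" for name, position in objects]
    return "Nearby objects: " + "; ".join(parts) + "."

message = compose_speech_message([
    ("sofa", "2 meters ahead"),
    ("table", "1 meter to the left"),
])
```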
As shown in Fig. 3A, besides the above steps 301 to 305, the information display method shown in an exemplary embodiment of the present disclosure can further comprise the following steps.
In step 306, the corresponding object is recognized according to the image of the corresponding object, and the hidden object associated in advance with the corresponding object is determined.
In step 307, the hiding information corresponding to the hidden object is acquired, the hiding information comprising the position information of the hidden object relative to the corresponding object and the name information of the hidden object; and the position information of the hidden object relative to the terminal is determined according to the position information of the corresponding object relative to the terminal and the position information of the hidden object relative to the corresponding object.
In actual use, there may be objects that the radar component and the camera assembly cannot easily identify. For example, some objects are blocked by other objects and therefore are not scanned or photographed by the radar component or the camera assembly, or an object is too small or is embedded in another object (such as a socket or a switch on a wall) and is not scanned by the radar component. Such objects that are not scanned or photographed can be called hidden objects, and in some cases the user may want to know related information about these hidden objects.
In the solution shown in one embodiment of the present disclosure, the user or a maintainer of the equipment can set, in advance in the terminal, objects in a fixed space that are not easily identified as hidden objects, and associate these hidden objects with nearby objects that are easily identified. In one embodiment, for each hidden object, the position information of the hidden object relative to its associated object and the name information of the hidden object can also be stored in the blind-assistance terminal. After the blind-assistance terminal scans and photographs an object through the radar component and the camera assembly, it can recognize the scanned object by image recognition, determine the hidden object associated with it, acquire the position information of the hidden object relative to that object and the name information of the hidden object, and then determine the relative position between the hidden object and the blind-assistance terminal.
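The position derivation described above is a simple composition of two relative positions. The sketch below assumes both offsets are expressed as (x, y) vectors in the same terminal-centred coordinate frame; names and the example values are illustrative, not from the present disclosure.

```python
# Sketch: the hidden object's position relative to the terminal is the
# recognized object's position relative to the terminal plus the stored
# position of the hidden object relative to that object.

def hidden_object_position(object_rel_terminal, hidden_rel_object):
    ox, oy = object_rel_terminal
    hx, hy = hidden_rel_object
    return ox + hx, oy + hy

# Example: sofa 3 m ahead of the terminal; socket stored as 1 m further
# ahead and 0.5 m to the right of the sofa.
socket_pos = hidden_object_position((0.0, 3.0), (0.5, 1.0))
```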
In step 308, the position information of the hidden object relative to the terminal and the name information of the hidden object are presented in the same message. The message can be, for example, a speech message.
After the hidden object is identified, the blind-assistance terminal can announce the name and position of the hidden object in the same speech message.
Optionally, the blind-assistance terminal can also receive a navigation request sent by the user by voice, perform speech analysis on the navigation request to obtain the object name information contained in the navigation request, and present, by voice, navigation information for travelling to the object corresponding to the object name information.
For example, please refer to Fig. 3B, which shows an application scenario of a blind-assistance terminal provided by an embodiment of the present disclosure. The scene shown in Fig. 3B comprises an indoor space 30, which comprises a sofa 31 and a partition wall 32, on which a socket 33 is arranged. A maintainer of the blind-assistance terminal 34 has associated the sofa 31 with the socket 33 in the blind-assistance terminal 34 in advance, and has set the position information of the socket 33 relative to the sofa 31 and the name of the socket 33. When the user wearing the blind-assistance terminal 34 stands at the position shown in Fig. 3B, the sofa 31 is between the partition wall 32 and the blind-assistance terminal 34; the blind-assistance terminal 34 can scan and photograph the sofa 31, while the socket 33 is blocked by the sofa 31. The blind-assistance terminal 34 then recognizes the sofa 31, determines that the hidden object associated with the sofa 31 is the socket 33, acquires the position information of the socket 33 relative to the sofa 31 and the name of the socket 33, and determines the position information of the socket 33 relative to the blind-assistance terminal 34 according to the position information of the socket 33 relative to the sofa 31 and the position information of the blind-assistance terminal 34 relative to the sofa 31. When the user later wants to find a socket to charge a mobile phone, the user can, for example, send a voice instruction to find a socket; the blind-assistance terminal 34 can then announce, for example by speech message, the name of the socket 33 (for example, "living-room socket 1") and the position of the socket 33 relative to the blind-assistance terminal 34 (for example, "5 meters ahead on the right"). When the user sends a voice instruction for navigation, the blind-assistance terminal 34 automatically plans a guidance path 35 and guides the user by voice along the path 35 to the socket 33.
In summary, in the information display method shown in the embodiments of the present disclosure, radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area, is acquired; image data corresponding to each object, obtained by the camera assembly photographing the designated area, is acquired; the radar data and image data corresponding to the same object are associated as the same group of data; and each group of associated radar data and image data is displayed in an aggregated manner. Because the radar data and image data of the same object are aggregated for display, the user can intuitively perceive both the image of an object and the object's position relative to the terminal, which expands the presentation of near-field recognition results and improves the display effect and the user experience.
The following are device embodiments of the present disclosure, which can be used to perform the method embodiments of the present disclosure. For details not disclosed in the device embodiments, please refer to the method embodiments of the present disclosure.
Fig. 4 is a block diagram of an information display device according to an exemplary embodiment. The information display device can be used in a terminal comprising a radar component and a camera assembly. As shown in Fig. 4, the information display device includes, but is not limited to: a radar data acquisition module 401, an image data acquisition module 402, an association module 403, and a first display module 404.
The radar data acquisition module 401 is configured to acquire radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area.
The image data acquisition module 402 is configured to acquire image data corresponding to each object, obtained by the camera assembly photographing the designated area.
The association module 403 is configured to associate the radar data and image data corresponding to the same object as the same group of data.
The first display module 404 is configured to display each group of associated radar data and image data in an aggregated manner.
In summary, in the information display device shown in the embodiments of the present disclosure, radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area, is acquired; image data corresponding to each object, obtained by the camera assembly photographing the designated area, is acquired; the radar data and image data corresponding to the same object are associated as the same group of data; and each group of associated radar data and image data is displayed in an aggregated manner. Because the radar data and image data of the same object are aggregated for display, the user can intuitively perceive both the image of an object and the object's position relative to the terminal, which expands the presentation of near-field recognition results and improves the display effect and the user experience.
Fig. 5 is a block diagram of an information display device according to another exemplary embodiment. The information display device can be used in a terminal comprising a radar component and a camera assembly. As shown in Fig. 5, the information display device includes, but is not limited to: a radar data acquisition module 401, an image data acquisition module 402, an association module 403, and a first display module 404.
The radar data acquisition module 401 is configured to acquire radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area.
The image data acquisition module 402 is configured to acquire image data corresponding to each object, obtained by the camera assembly photographing the designated area.
The association module 403 is configured to associate the radar data and image data corresponding to the same object as the same group of data.
The first display module 404 is configured to display each group of associated radar data and image data in an aggregated manner.
Optionally, the association module 403 is configured to associate a group of radar data and image data whose position information relative to the terminal is identical as the same group of data;
wherein the radar data and the image data each comprise the position information of the corresponding object relative to the terminal.
Optionally, the first display module 404 comprises a first display submodule 404a and a second display submodule 404b;
the first display submodule 404a is configured to display, in a visualization interface, the contour pattern of the corresponding object according to the shape, size, and position information relative to the terminal of the corresponding object;
the second display submodule 404b is configured to display, according to the associated radar data and image data, the image of the corresponding object in correspondence with the contour pattern of the corresponding object;
wherein the radar data comprises the shape, size, and position information relative to the terminal of the corresponding object, and the image data comprises the image of the corresponding object.
Optionally, the first display module 404 comprises a recognition submodule 404c and a third display submodule 404d;
the recognition submodule 404c is configured to perform image recognition according to the image of the corresponding object to determine the name information of the corresponding object;
the third display submodule 404d is configured to present, according to the associated radar data and image data, the position information of the corresponding object relative to the terminal and the name information of the corresponding object in the same speech message;
wherein the radar data comprises the position information of the corresponding object relative to the terminal, and the image data comprises the image of the corresponding object.
Optionally, the device further comprises: a first identification module 405, a query module 406, and a second display module 407;
the first identification module 405 is configured to perform, when an instruction to query the detailed information of at least one of the objects is received, image recognition according to the image corresponding to the at least one object;
the query module 406 is configured to query the network side for the detailed information of the at least one object according to the result of the image recognition;
the second display module 407 is configured to display the detailed information of the at least one object;
wherein the image data comprises the image of the corresponding object.
Optionally, the device further comprises: a second identification module 408, a hidden object determination module 409, an information acquisition module 410, a position determination module 411, and a third display module 412;
the second identification module 408 is configured to recognize the corresponding object according to the image of the corresponding object;
the hidden object determination module 409 is configured to determine the hidden object associated in advance with the corresponding object;
the information acquisition module 410 is configured to acquire the hiding information corresponding to the hidden object, the hiding information comprising the position information of the hidden object relative to the corresponding object and the name information of the hidden object;
the position determination module 411 is configured to determine the position information of the hidden object relative to the terminal according to the position information of the corresponding object relative to the terminal and the position information of the hidden object relative to the corresponding object;
the third display module 412 is configured to present the position information of the hidden object relative to the terminal and the name information of the hidden object in the same message;
wherein at least one of the radar data and the image data comprises the position information of the corresponding object relative to the terminal, and the image data comprises the image of the corresponding object.
In one embodiment, the message can be, for example, a speech message, a notification message, a short message, or the like; the present disclosure is not restricted in this regard.
Optionally, the device further comprises:
an unassociated object determination module 413, configured to determine an unassociated object, the unassociated object being an object in the designated area that is detected or photographed by only one of the radar component and the camera assembly; and
a fourth display module 414, configured to display the information of the unassociated object;
wherein, when the unassociated object is an object detected by the radar component, the information of the unassociated object is the radar data of the unassociated object, and when the unassociated object is an object photographed by the camera assembly, the information of the unassociated object is the image data of the unassociated object. The present disclosure is not limited thereto, however. When the object is detected only by the radar component, besides displaying the object's information in the radar data, the device can also indicate the object in the image data according to the relative position between the object and the terminal and other related information of the object. Conversely, when the object is detected only by the camera assembly, besides displaying the object's information in the image data, the device can also indicate the object in the radar data according to information such as the relative position between the object and the terminal and the image.
In summary, in the information display device shown in the embodiments of the present disclosure, radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area, is acquired; image data corresponding to each object, obtained by the camera assembly photographing the designated area, is acquired; the radar data and image data corresponding to the same object are associated as the same group of data; and each group of associated radar data and image data is displayed in an aggregated manner. Because the radar data and image data of the same object are aggregated for display, the user can intuitively perceive both the image of an object and the object's position relative to the terminal, which expands the presentation of near-field recognition results and improves the display effect and the user experience.
Fig. 6 is a block diagram of a device 600 according to an exemplary embodiment. For example, the device 600 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a routing device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, an intelligent control device, an intelligent household appliance, an intelligent wearable device, or the like.
Referring to Fig. 6, the device 600 can comprise one or more of the following components: a processing component 602, a memory 604, a power supply component 606, a multimedia component 608, an audio component 610, a sensor component 614, and a communication component 616.
The processing component 602 generally controls the overall operation of the device 600, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 602 can comprise one or more processors 618 to execute instructions, so as to complete all or part of the steps of the above method. In addition, the processing component 602 can comprise one or more modules to facilitate interaction between the processing component 602 and other components; for example, the processing component 602 can comprise a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support the operation of the device 600. Examples of such data include instructions for any application or method operated on the device 600. The memory 604 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The memory 604 also stores one or more modules configured to be executed by the one or more processors 618 to complete all or part of the steps of any of the methods shown in Figs. 1 to 3.
The power supply component 606 provides power to the various components of the device 600. The power supply component 606 can comprise a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 600.
The multimedia component 608 comprises a screen providing an output interface between the device 600 and the user. In some embodiments, the screen can comprise a liquid crystal display (LCD) and a touch panel (TP). If the screen comprises a touch panel, the screen can be implemented as a touch screen to receive input signals from the user. The touch panel comprises one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors can sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe action.
The audio component 610 is configured to output and/or input audio signals. For example, the audio component 610 comprises a microphone (MIC), which is configured to receive an external audio signal when the device 600 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signal can be further stored in the memory 604 or transmitted via the communication component 616. In some embodiments, the audio component 610 further comprises a loudspeaker for outputting audio signals.
The sensor component 614 comprises one or more sensors for providing state assessments of various aspects of the device 600. For example, the sensor component 614 can detect the open/closed state of the device 600 and the relative positioning of components; the sensor component 614 can also detect a change in the position of the device 600 or of a component of the device 600, and a change in the temperature of the device 600. In some embodiments, the sensor component 614 can further comprise a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate wired or wireless communication between the device 600 and other devices. The device 600 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 616 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 616 further comprises a near-field communication (NFC) module to facilitate short-range communication; for example, the NFC module can be implemented based on radio-frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 600 can be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 604 comprising instructions, which can be executed by the processor 618 of the device 600 to perform the above method. For example, the non-transitory computer-readable storage medium can be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
With regard to the devices in the above embodiments, the specific manner in which each module performs operations has been described in detail in the embodiments relating to the method, and will not be elaborated here.
It should be understood that the present invention is not limited to the precise structures described above and illustrated in the accompanying drawings, and various modifications and changes can be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (15)

1. An information display method, characterized in that it is used in a terminal comprising a radar component and a camera assembly, the method comprising:
acquiring radar data corresponding to each object in a designated area, obtained by the radar component detecting the designated area;
acquiring image data corresponding to each object, obtained by the camera assembly photographing the designated area;
associating the radar data and image data corresponding to the same object as the same group of data; and
displaying each group of associated radar data and image data in an aggregated manner.
2. The method according to claim 1, characterized in that the radar data and the image data each comprise position information of the corresponding object relative to the terminal, and associating the radar data and image data corresponding to the same object as the same group of data comprises:
associating a group of radar data and image data whose position information relative to the terminal is identical as the same group of data.
3. The method according to claim 1, characterized in that the radar data comprises the shape, size, and position information relative to the terminal of the corresponding object, and the image data comprises the image of the corresponding object; and displaying each group of associated radar data and image data in an aggregated manner comprises:
displaying, in a visualization interface, the contour pattern of the corresponding object according to the shape, size, and position information relative to the terminal of the corresponding object; and
displaying, according to the associated radar data and image data, the image of the corresponding object in correspondence with the contour pattern of the corresponding object.
4. The method according to claim 1, characterized in that the radar data comprises position information of the corresponding object relative to the terminal, and the image data comprises an image of the corresponding object; and aggregating and displaying each group of associated radar data and image data comprises:
performing image recognition according to the image of the corresponding object to determine name information of the corresponding object; and
displaying, according to the associated radar data and image data, the position information of the corresponding object relative to the terminal and the name information of the corresponding object in a same message.
5. The method according to claim 1, characterized in that the image data comprises an image of the corresponding object, and the method further comprises:
upon receiving an instruction to query detail information of at least one of the objects, performing image recognition according to the image corresponding to the at least one object;
querying the detail information of the at least one object from a network side according to a result of the image recognition; and
displaying the detail information of the at least one object.
6. The method according to claim 1, characterized in that at least one of the radar data and the image data comprises position information of the corresponding object relative to the terminal, the image data comprises an image of the corresponding object, and the method further comprises:
recognizing the corresponding object according to the image of the corresponding object;
determining a hidden object pre-associated with the corresponding object;
obtaining hidden information corresponding to the hidden object, the hidden information comprising position information of the hidden object relative to the corresponding object and name information of the hidden object;
determining position information of the hidden object relative to the terminal according to the position information of the corresponding object relative to the terminal and the position information of the hidden object relative to the corresponding object; and
displaying the position information of the hidden object relative to the terminal and the name information of the hidden object in a same message.
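The position computation in claim 6 is a simple composition of relative positions: terminal-to-object plus object-to-hidden-object gives terminal-to-hidden-object. A minimal sketch, assuming positions are represented as coordinate tuples (a representation not specified by the patent):

```python
def hidden_object_position(obj_rel_terminal, hidden_rel_obj):
    """Claim 6: the hidden object's position relative to the terminal is
    the vector sum of (object relative to terminal) and
    (hidden object relative to that object)."""
    return tuple(a + b for a, b in zip(obj_rel_terminal, hidden_rel_obj))

# e.g. a wall detected at (4, 0) from the terminal, with a pipe
# recorded as lying 0.5 m behind it
print(hidden_object_position((4.0, 0.0), (0.5, 0.0)))  # (4.5, 0.0)
```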
7. The method according to claim 1, characterized in that the method further comprises:
determining an unassociated object, the unassociated object being an object in the designated area that is detected or photographed by only one of the radar component and the camera component;
displaying information of the unassociated object;
wherein, when the unassociated object is an object detected by the radar component, the information of the unassociated object comprises the radar data of the unassociated object; and when the unassociated object is an object photographed by the camera component, the information of the unassociated object comprises the image data of the unassociated object.
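The unassociated-object determination of claim 7 amounts to a set difference between the objects each sensor reports. An illustrative sketch, assuming each detected object carries a hypothetical identifier (not specified by the patent):

```python
def find_unassociated(radar_ids, camera_ids):
    """Claim 7: objects seen by only one of the two components.
    Returns (radar-only, camera-only) identifier sets; radar-only
    objects are shown with their radar data, camera-only objects
    with their image data."""
    radar_only = set(radar_ids) - set(camera_ids)
    camera_only = set(camera_ids) - set(radar_ids)
    return radar_only, camera_only

r_only, c_only = find_unassociated({1, 2, 3}, {2, 3, 4})
print(sorted(r_only), sorted(c_only))  # [1] [4]
```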
8. An information display device, characterized in that the device is applied to a terminal comprising a radar component and a camera component, and comprises:
a radar data obtaining module, configured to obtain radar data corresponding to each object in a designated area, the radar data being obtained by the radar component detecting the designated area;
an image data obtaining module, configured to obtain image data corresponding to each of the objects, the image data being obtained by the camera component photographing the designated area;
an association module, configured to associate the radar data and the image data corresponding to a same object as a same group of data; and
a first display module, configured to aggregate and display each group of associated radar data and image data.
9. The device according to claim 8, characterized in that:
the association module is configured to associate a group of radar data and image data whose position information relative to the terminal is identical as a same group of data;
wherein the radar data and the image data each comprise position information of the corresponding object relative to the terminal.
10. The device according to claim 8, characterized in that the first display module comprises:
a first display submodule, configured to display, in a visualization interface, an outline graphic of the corresponding object according to the shape and size of the corresponding object and its position information relative to the terminal; and
a second display submodule, configured to display, according to the associated radar data and image data, the image of the corresponding object on the outline graphic of the corresponding object;
wherein the radar data comprises the shape and size of the corresponding object and its position information relative to the terminal, and the image data comprises an image of the corresponding object.
11. The device according to claim 8, characterized in that the first display module comprises:
a recognition submodule, configured to perform image recognition according to the image of the corresponding object to determine name information of the corresponding object; and
a third display submodule, configured to display, according to the associated radar data and image data, the position information of the corresponding object relative to the terminal and the name information of the corresponding object in a same message;
wherein the radar data comprises position information of the corresponding object relative to the terminal, and the image data comprises an image of the corresponding object.
12. The device according to claim 8, characterized in that the device further comprises:
a first recognition module, configured to, upon receiving an instruction to query detail information of at least one of the objects, perform image recognition according to the image corresponding to the at least one object;
a query module, configured to query the detail information of the at least one object from a network side according to a result of the image recognition; and
a second display module, configured to display the detail information of the at least one object;
wherein the image data comprises an image of the corresponding object.
13. The device according to claim 8, characterized in that the device further comprises:
a second recognition module, configured to recognize the corresponding object according to the image of the corresponding object;
a hidden object determination module, configured to determine a hidden object pre-associated with the corresponding object;
an information obtaining module, configured to obtain hidden information corresponding to the hidden object, the hidden information comprising position information of the hidden object relative to the corresponding object and name information of the hidden object;
a position determination module, configured to determine position information of the hidden object relative to the terminal according to the position information of the corresponding object relative to the terminal and the position information of the hidden object relative to the corresponding object; and
a third display module, configured to display the position information of the hidden object relative to the terminal and the name information of the hidden object in a same message;
wherein at least one of the radar data and the image data comprises position information of the corresponding object relative to the terminal, and the image data comprises an image of the corresponding object.
14. The device according to claim 8, characterized in that the device further comprises:
an unassociated object determination module, configured to determine an unassociated object, the unassociated object being an object in the designated area that is detected or photographed by only one of the radar component and the camera component; and
a fourth display module, configured to display information of the unassociated object;
wherein, when the unassociated object is an object detected by the radar component, the information of the unassociated object comprises the radar data of the unassociated object; and when the unassociated object is an object photographed by the camera component, the information of the unassociated object comprises the image data of the unassociated object.
15. An information display device, characterized in that the device is applied to a terminal comprising a radar component and a camera component, and comprises:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
obtain radar data corresponding to each object in a designated area, the radar data being obtained by the radar component detecting the designated area;
obtain image data corresponding to each of the objects, the image data being obtained by the camera component photographing the designated area;
associate the radar data and the image data corresponding to a same object as a same group of data; and
aggregate and display each group of associated radar data and image data.
CN201510736035.6A 2015-11-02 2015-11-02 Information displaying method and device Active CN105301585B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510736035.6A CN105301585B (en) 2015-11-02 2015-11-02 Information displaying method and device


Publications (2)

Publication Number Publication Date
CN105301585A 2016-02-03
CN105301585B 2017-12-29

Family

ID=55199065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510736035.6A Active CN105301585B (en) 2015-11-02 2015-11-02 Information displaying method and device

Country Status (1)

Country Link
CN (1) CN105301585B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07125567A (en) * 1993-11-04 1995-05-16 Mitsubishi Motors Corp Preceding car detecting mechanism of car traveling controller
US20100295718A1 (en) * 2009-03-26 2010-11-25 Tialinx, Inc. Person-Borne Improvised Explosive Device Detection
CN103576154A (en) * 2012-08-01 2014-02-12 通用汽车环球科技运作有限责任公司 Fusion of obstacle detection using radar and camera
CN103890606A (en) * 2011-10-20 2014-06-25 罗伯特·博世有限公司 Methods and systems for creating maps with radar-optical imaging fusion
KR20140078436A (en) * 2012-12-17 2014-06-25 현대자동차주식회사 Sensor fusion system and method thereof
WO2015005001A1 (en) * 2013-07-08 2015-01-15 本田技研工業株式会社 Object recognition device
DE102014012250A1 (en) * 2014-08-19 2015-03-26 Daimler Ag Method for image processing and display


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171107A (en) * 2017-11-25 2018-06-15 深圳市元征科技股份有限公司 A kind of image-recognizing method and device
CN109189885A (en) * 2018-08-31 2019-01-11 广东小天才科技有限公司 A kind of real-time control method and smart machine based on smart machine camera
CN113534281A (en) * 2020-04-14 2021-10-22 深圳市博利凌科技有限公司 Scanner and method for sensing position of hidden object behind surface of object
CN115214629A (en) * 2022-07-13 2022-10-21 小米汽车科技有限公司 Automatic parking method, device, storage medium, vehicle and chip
CN115214629B (en) * 2022-07-13 2024-06-04 小米汽车科技有限公司 Automatic parking method, device, storage medium, vehicle and chip
CN115104874A (en) * 2022-07-26 2022-09-27 深圳市西昊智能家具有限公司 Control method and device of intelligent chair, computer equipment and storage medium
CN115104874B (en) * 2022-07-26 2023-01-03 深圳市西昊智能家具有限公司 Control method and device of intelligent chair, computer equipment and storage medium

Also Published As

Publication number Publication date
CN105301585B (en) 2017-12-29

Similar Documents

Publication Publication Date Title
CN105301585A (en) Information display method and device
JP6392991B2 (en) Spatial parameter identification method, apparatus, program, recording medium, and terminal device using image
CN104697533A (en) Navigation method and device
CN104133956B (en) Handle the method and device of picture
CN109064277B (en) Commodity display method and device
CN105222802A (en) navigation, navigation video generation method and device
CN105376355A (en) Mobile terminal having smart measuring tape and length measuring method thereof
CN105488111A (en) Image search method and device
CN105511631A (en) Gesture recognition method and device
CN105487863A (en) Interface setting method and device based on scene
CN105550224A (en) Article search method, apparatus and system
CN104699250A (en) Display control method, display control device and electronic equipment
CN105627920A (en) Method and device for displaying size
CN104918325A (en) Positioning guide and parking guide methods and device
CN104462418A (en) Page displaying method and device and electronic device
CN111105454A (en) Method, device and medium for acquiring positioning information
CN104965704A (en) Information display method and apparatus
CN105139378A (en) Card boundary detection method and apparatus
CN108108671A (en) Description of product information acquisition method and device
CN103886284A (en) Character attribute information identification method and device and electronic device
CN104156695A (en) Method and device for aligning face image
CN105301183A (en) Air quality detecting method and device
CN104536669A (en) Information displaying method and device
CN105512658A (en) Image recognition method and device for rectangular object
CN105517144A (en) Equipment positioning method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant