WO2016155227A1 - Method and device for displaying framing information - Google Patents

Method and device for displaying framing information

Info

Publication number
WO2016155227A1
Authority
WO
WIPO (PCT)
Prior art keywords
framing
information
display
module
smart glasses
Prior art date
Application number
PCT/CN2015/088686
Other languages
English (en)
French (fr)
Inventor
唐明勇
刘华一君
陈涛
Original Assignee
小米科技有限责任公司
Priority date
Filing date
Publication date
Application filed by 小米科技有限责任公司
Priority to MX2015015743A priority Critical patent/MX357218B/es
Priority to JP2017508736A priority patent/JP6259544B2/ja
Priority to RU2015151619A priority patent/RU2635873C2/ru
Priority to BR112015030257A priority patent/BR112015030257A2/pt
Priority to KR1020157034228A priority patent/KR101701814B1/ko
Priority to US14/955,313 priority patent/US20160295118A1/en
Publication of WO2016155227A1 publication Critical patent/WO2016155227A1/zh

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/775Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B1/00Optical elements characterised by the material of which they are made; Optical coatings for optical elements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0141Head-up displays characterised by optical features characterised by the informative content of the display
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/02Viewfinders
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/38Circuits or arrangements for blanking or otherwise eliminating unwanted parts of pictures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects

Definitions

  • the present disclosure relates to the field of electronic technologies, and in particular, to a method and apparatus for displaying framing information.
  • Smart wearable devices can not only facilitate the lives of users, but also better match existing electronic products.
  • taking smart glasses as an example, the related art presents the image currently captured by a digital camera on the smart glasses, so that the user can view the image currently captured by the digital camera without facing the camera's display screen, thereby reducing the power consumption of the digital camera.
  • when the digital camera needs to adjust its current framing range, however, the user still has to take the smart glasses off and adjust the shooting angle of the digital camera while watching the finder frame shown on the camera's display, so the operation of adjusting the framing range is cumbersome.
  • embodiments of the present disclosure provide a method and apparatus for displaying framing information to simplify the operation of viewing a framing range of a currently captured image frame of a photographing apparatus.
  • a method of displaying framing information is provided, comprising: determining framing information of a photographing device; and displaying the framing information on a display device.
  • the displaying of the framing information on the display device may include: converting the framing information into a parallel light beam, a boundary of the parallel light beam being determined by the framing information; and projecting the parallel light beam onto the display device.
  • the display device may be smart glasses, and the displaying of the framing information on the display device may include: determining a framing picture corresponding to the framing information; determining a percentage of repeated picture between the framing picture and an image picture captured by a camera of the smart glasses; and displaying the framing information on a lens of the smart glasses according to the percentage of repeated picture.
  • the displaying of the framing information on the lens of the smart glasses according to the percentage of repeated picture may include: if the percentage of repeated picture is greater than a first preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and if the percentage of repeated picture is less than the first preset threshold and greater than a second preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
  • the method may further include: if the percentage of repeated picture is less than the second preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
  • the display device may be smart glasses, and the displaying of the framing information on the display device may include: determining a framing picture corresponding to the framing information; determining a similarity value between a subject in the framing picture and a subject in an image picture captured by a camera of the smart glasses; and displaying the framing information on a lens of the smart glasses according to the similarity value.
  • the displaying of the framing information on the lens of the smart glasses according to the similarity value may include: if the similarity value is greater than a third preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and if the similarity value is less than the third preset threshold and greater than a fourth preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
  • the method may further include: if the similarity value is less than the fourth preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
  • a device for displaying framing information is provided, applied to smart glasses, the device for displaying framing information comprising:
  • a determining module configured to determine framing information of the photographing device
  • a display module configured to display the view information determined by the determining module on the display device.
  • the display module can include:
  • a beam conversion sub-module configured to convert the framing information determined by the determining module into a parallel beam, the boundary of the parallel beam being determined by the framing information
  • a projection sub-module configured to project the parallel beam of light converted by the beam conversion sub-module onto the display device.
  • the display module can include:
  • a first determining sub-module configured to determine a framing picture corresponding to the framing information determined by the determining module
  • a second determining submodule configured to determine a percentage of a repeated picture between the view screen and an image frame captured by the camera of the smart glasses
  • the first display sub-module is configured to display the framing information on the lens of the smart glasses according to the percentage of the repeated pictures determined by the second determining sub-module.
  • the first display submodule may include:
  • a second display sub-module configured to, if the percentage of repeated picture determined by the second determining sub-module is greater than a first preset threshold, display the framing information on the lens of the smart glasses in the form of a complete finder frame;
  • a third display sub-module configured to, if the percentage of repeated picture determined by the second determining sub-module is less than the first preset threshold and greater than a second preset threshold, display the framing information on the lens of the smart glasses in the form of a partial finder frame.
  • the apparatus may further include:
  • the first reminding module is configured to display prompt information for guiding the user to adjust the visual field direction on the lens if the percentage of the repeated picture determined by the second determining sub-module is smaller than the second preset threshold.
  • the display module can include:
  • a third determining sub-module configured to determine a framing picture corresponding to the framing information determined by the determining module
  • a fourth determining submodule configured to determine a similarity value between the subject in the framing picture determined by the third determining submodule and a subject in an image picture captured by the camera of the smart glasses;
  • a fourth display submodule configured to display the framing information on the lens of the smart glasses according to the similarity value determined by the fourth determining submodule.
  • the fourth display submodule may include:
  • a fifth display sub-module configured to, if the similarity value determined by the fourth determining sub-module is greater than a third preset threshold, display the framing information on the lens of the smart glasses in the form of a complete finder frame;
  • a sixth display sub-module configured to, if the similarity value determined by the fourth determining sub-module is less than the third preset threshold and greater than a fourth preset threshold, display the framing information on the lens in the form of a partial finder frame.
  • the apparatus may further include:
  • the second reminding module is configured to display prompt information for guiding the user to adjust the visual field direction on the lens if the similarity value determined by the fourth determining sub-module is smaller than the fourth preset threshold.
  • an apparatus for displaying framing information is provided, comprising: a processor; and a memory for storing instructions executable by the processor;
  • wherein the processor is configured to: determine framing information of a photographing device; and display the framing information on a display device.
  • the technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: by displaying the framing information of the photographing device on the display device, the user can directly view the framing range of the photographing device through the display device, which realizes the separation of the photographing device from the framing display and, for the user, simplifies the operation of adjusting the framing range of the digital camera.
  • FIG. 1A is a flowchart of a method of displaying framing information, according to an exemplary embodiment.
  • FIG. 1B is a schematic diagram of a scenario to which the embodiment of the present disclosure is applied.
  • FIG. 2 is a flowchart of a method of displaying framing information, according to an exemplary embodiment.
  • FIG. 3A is a system diagram of smart glasses and a photographing apparatus to which the embodiment of the present disclosure is applied.
  • FIG. 3B is a schematic diagram of a framing screen corresponding to the framing information shown in the embodiment of the present disclosure.
  • FIG. 3C is one of schematic diagrams of image screens taken by smart glasses according to an embodiment of the present disclosure.
  • FIG. 3D is a second schematic diagram of an image screen taken by the smart glasses according to an embodiment of the present disclosure.
  • FIG. 3E is a third schematic diagram of an image screen captured by the smart glasses according to an embodiment of the present disclosure.
  • FIG. 4 is a flowchart of a method of displaying framing information, according to an exemplary embodiment.
  • FIG. 5 is a flowchart of a method of displaying framing information according to an embodiment of the present invention.
  • FIG. 6 is a block diagram of an apparatus for displaying framing information, according to an exemplary embodiment.
  • FIG. 7 is a block diagram of another apparatus for displaying framing information, according to an exemplary embodiment.
  • FIG. 8 is a block diagram of still another apparatus for displaying framing information, according to an exemplary embodiment.
  • FIG. 9 is a block diagram of an apparatus suitable for displaying framing information, according to an exemplary embodiment.
  • FIG. 1A is a flowchart of a method for displaying framing information according to an exemplary embodiment
  • FIG. 1B is a schematic diagram of a scene applicable to an embodiment of the present disclosure.
  • the method for displaying framing information may be applied to devices having a display function, such as smart glasses and glass spectacles; this embodiment is exemplarily described with reference to FIG. 1B.
  • the method for displaying the framing information includes the following steps S101 to S102:
  • step S101 the framing information of the photographing device is determined.
  • the current view information of the photographing device can be acquired by the viewfinder of the photographing device.
  • the framing information may be transmitted to the display device by wireless transmission, such as WIFI or Bluetooth, or by wired transmission, such as a USB cable; the present disclosure does not limit the specific manner in which the framing information of the photographing device is transmitted (a transport sketch is given below).
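  • purely as an illustration (the disclosure does not prescribe any transport format), a minimal sketch of pushing the framing information from the photographing device to the display device over a plain TCP socket might look as follows; the payload fields, port number and length-prefix framing are hypothetical choices, not part of the disclosure:

      import json
      import socket

      def send_framing_info(host: str, port: int, frame_rect, focal_length_mm: float) -> None:
          """Send the current framing information (a hypothetical JSON payload)
          from the photographing device to the display device over TCP."""
          payload = json.dumps({
              "frame_rect": frame_rect,            # (left, top, width, height) in sensor pixels
              "focal_length_mm": focal_length_mm,  # illustrative extra field
          }).encode("utf-8")
          with socket.create_connection((host, port), timeout=2.0) as conn:
              conn.sendall(len(payload).to_bytes(4, "big"))  # simple length prefix
              conn.sendall(payload)

      # Example: push a 4000x3000 full-sensor framing rectangle to smart glasses at 192.168.1.20
      # send_framing_info("192.168.1.20", 5000, (0, 0, 4000, 3000), 35.0)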
  • step S102 the framing information is displayed on the display device.
  • the framing information may be displayed on the display device in a parallel beam manner; in another embodiment, the framing information may also be displayed on the display device as a finder frame.
  • the embodiment shown in FIG. 1A is exemplarily described below with reference to FIG. 1B.
  • the cube 10 is disposed at a position in space, and the photographing device 11 and the user are both located on side A of the cube 10.
  • when the photographing device 11 needs to photograph the cube 10, if the photographing device 11 is currently at a relatively high position, the user cannot directly view the framing information of the photographing device 11 on its display screen; in this case, the framing information is displayed on the display device 12, so that the user can view the current framing picture of the photographing device 11 on the display device 12.
  • in this embodiment, by displaying the framing information of the photographing device on the display device, the user can directly view the framing range of the photographing device through the display device, which realizes the separation of the photographing device from the framing display and, for the user, simplifies the operation of adjusting the framing range of the digital camera.
  • displaying the framing information on the display device may include: converting the framing information into a parallel light beam, the boundary of the parallel light beam being determined by the framing information; and projecting the parallel light beam onto the display device.
  • the display device may be smart glasses, and displaying the framing information on the display device may include: determining a framing picture corresponding to the framing information; determining a percentage of repeated picture between the framing picture and an image picture captured by a camera of the smart glasses; and displaying the framing information on a lens of the smart glasses according to the percentage of repeated picture.
  • displaying the framing information on the lens of the smart glasses according to the percentage of repeated picture may include: if the percentage of repeated picture is greater than a first preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and if the percentage of repeated picture is less than the first preset threshold and greater than a second preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
  • the method may further include: if the percentage of repeated picture is less than the second preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
  • the display device may be smart glasses, and displaying the framing information on the display device may include: determining a framing picture corresponding to the framing information; determining a similarity value between a subject in the framing picture and a subject in an image picture captured by a camera of the smart glasses; and displaying the framing information on a lens of the smart glasses according to the similarity value.
  • displaying the framing information on the lens of the smart glasses according to the similarity value may include: if the similarity value is greater than a third preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and if the similarity value is less than the third preset threshold and greater than a fourth preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
  • the method may further include: if the similarity value is less than the fourth preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
  • the above method provided by the embodiment of the present disclosure can realize the separation of the photographing device and the view display, and simplifies the operation of adjusting the view range of the digital camera for the user.
  • FIG. 2 is a flowchart illustrating a method for displaying framing information according to an exemplary embodiment. This embodiment is described by taking glass spectacles as an example of the display device. As shown in FIG. 2, the method includes the following steps:
  • step S201 the framing information of the photographing device is determined.
  • step S201 For the description of step S201, refer to the description of step S101 above, and details are not described herein.
  • step S202 the framing information is converted into a parallel beam, and the boundary of the parallel beam is determined by the framing information.
  • an electro-optical conversion device, such as a laser or a projector, may be used to convert the framing information into a parallel light beam.
  • step S203 a parallel beam is projected onto the glasses.
  • a micro-projection device may be provided on the glass spectacles, and the framing information of the photographing device is projected onto the lens of the spectacles by the micro-projection device, so that the lens achieves a see-through effect.
  • by projecting the framing information onto the glass spectacles in the form of a parallel light beam, the separation of the photographing device from the framing display is realized; in addition, since only the boundary of the framing information is projected onto the spectacles, the user's actual field of view is not affected.
  • the user can therefore view the current framing range of the photographing device through the spectacles while still seeing the real scene in front of them through the spectacles. One way the frame boundary could be derived from the framing information is sketched below.
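  • purely as an illustration of how the boundary of the projected parallel beam could be derived from the framing information, the sketch below converts a framing rectangle on the photographing device's sensor into horizontal and vertical half-angles; the pixel-pitch and focal-length parameters are assumptions for the example, and the disclosure does not specify this computation:

      import math

      def finder_frame_half_angles(frame_w_px: int, frame_h_px: int,
                                   pixel_pitch_mm: float, focal_length_mm: float):
          """Convert a framing rectangle on the photographing device's sensor into
          horizontal/vertical half-angles (degrees). These angles are one possible
          way to define the boundary of the projected finder frame; all parameter
          values here are illustrative, not taken from the disclosure."""
          half_w_mm = 0.5 * frame_w_px * pixel_pitch_mm
          half_h_mm = 0.5 * frame_h_px * pixel_pitch_mm
          return (math.degrees(math.atan2(half_w_mm, focal_length_mm)),
                  math.degrees(math.atan2(half_h_mm, focal_length_mm)))

      # e.g. a 4000x3000 framing area with 1.5 um pixels behind a 35 mm lens:
      # finder_frame_half_angles(4000, 3000, 0.0015, 35.0) -> (~4.9, ~3.7) degrees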
  • FIG. 3A is a system diagram of a smart glasses and a photographing apparatus to which the embodiments of the present disclosure are applied
  • FIG. 3B is a schematic diagram of a viewfinder screen corresponding to the view information shown in the embodiment of the present disclosure
  • FIG. 3C is a first schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure;
  • FIG. 3D is a second schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure;
  • FIG. 3E is a third schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure.
  • as shown in FIG. 3A, the smart glasses 31 and the photographing device 11 can communicate wirelessly, for example by WIFI or infrared, so that the photographing device 11 can transmit its framing information to the smart glasses 31.
  • the photographing device 11 may determine the framing information of its lens through a viewfinder provided on the photographing device 11 and transmit the framing information determined by the viewfinder to the smart glasses 31, and the smart glasses 31 display the framing information of the photographing device 11 on their lens 32 in the form of a finder frame.
  • the photographing device 11 may be a device capable of capturing digital images, such as a digital camera, an action camera, or an SLR camera.
  • as shown in FIG. 3B, for the current framing picture 321 of the photographing device 11, the viewfinder of the photographing device 11 can determine its framing information from this framing picture.
  • FIG. 3C shows an image picture 311 captured by the camera of the smart glasses 31 at one angle while the user wears the smart glasses 31, FIG. 3D shows an image picture 312 captured by the camera of the smart glasses 31 at another angle, and FIG. 3E shows an image picture 313 captured by the camera of the smart glasses 31 at yet another angle.
  • in one exemplary scenario, as shown in FIG. 3B and FIG. 3C, the framing picture is substantially consistent with the direction of the user's field of view, so the framing information of the photographing device 11 can be displayed on the lens 32 of the smart glasses 31 in the form of the finder frame 30; after putting on the smart glasses 31, the user can learn the current framing range of the photographing device 11 from the finder frame 30.
  • in another exemplary scenario, as shown in FIG. 3D, the user wears the smart glasses 31 at another angle and part of the cup in the image picture 312 falls outside the field of view of the smart glasses 31, indicating a deviation between the current framing directions of the smart glasses 31 and the photographing device 11; the framing picture of the photographing device 11 can then be displayed on the lens 32 in the form of a partial finder frame, prompting the user to make a small adjustment of the viewing direction.
  • in yet another exemplary scenario, as shown in FIG. 3E, the cup in the image picture 313 is essentially outside the user's current field of view, indicating a large deviation between the current framing directions of the smart glasses 31 and the photographing device 11; in this case, prompt information for guiding the user to adjust the direction of the field of view is displayed on the lens 32, so as to guide the user to adjust the viewing direction until the framing information is substantially consistent with the direction of the user's field of view, after which the framing information is displayed on the lens 32 in the form of the finder frame 30.
  • in this embodiment, the framing information of the photographing device is displayed on the display device, which enables the user to directly view the framing range of the photographing device through the display device, thereby realizing the separation of the photographing device from the framing display and simplifying, for the user, the operation of adjusting the framing range of the digital camera.
  • the display device is a smart glasses, since the image currently captured by the photographing device is not present on the lens of the smart glasses, it does not affect the actual field of view of the user, thereby enabling the user to view the current shooting range of the photographing device through the smart glasses. At the same time, it is also possible to see the real scene in front of it, improving the user experience of using smart glasses.
  • FIG. 4 is a flowchart of a method for displaying framing information according to an exemplary embodiment; using the above method provided by the embodiments of the present disclosure, this embodiment takes as an example displaying the framing information on the lens of the smart glasses according to the percentage of repeated picture between the framing picture corresponding to the framing information and the image picture captured by the camera of the smart glasses, and is described with reference to FIG. 3A to FIG. 3E above. As shown in FIG. 4, the following steps are included:
  • step S401 the framing information of the photographing device is determined.
  • step S401 For the description of step S401, refer to the description of step S101 above, and details are not described herein.
  • step S402 a framing screen corresponding to the framing information is determined.
  • the framing screen can also be transmitted to the smart glasses in the same manner as the above step S401.
  • step S403 the percentage of the repeated screen between the framing screen and the image screen acquired by the camera of the smart glasses is determined.
  • the percentage of repeated picture between the framing picture corresponding to the framing information and the image picture captured by the camera of the smart glasses may be determined by analysing the similarity and/or consistency of the correspondence between the grayscale information of the two pictures.
  • for example, the framing picture may be taken as the reference, and the overlapping range between the image picture captured by the camera of the smart glasses and the image picture captured by the photographing device may be determined; if the resolution of the image picture captured by the camera of the smart glasses is 640×480 and the overlapping range contains 320×150 pixels, the percentage of repeated picture between the two pictures is (320×150)/(640×480) = 0.156, as illustrated in the sketch below.
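  • a minimal sketch of the percentage calculation described above, assuming the offset of the framing picture inside the glasses image has already been estimated by some registration step (the disclosure does not specify how that offset is obtained):

      import numpy as np

      def repeated_picture_percentage(viewfinder: np.ndarray, glasses_img: np.ndarray,
                                      dx: int, dy: int, gray_tol: int = 20) -> float:
          """Estimate the percentage of repeated picture between the camera's framing
          picture and the grayscale image captured by the smart glasses.

          (dx, dy) is the estimated offset of the framing picture inside the glasses
          image; pixels whose grey levels agree within gray_tol inside the overlapping
          window count as repeated, normalised by the glasses image area."""
          h, w = glasses_img.shape
          vh, vw = viewfinder.shape
          # Overlapping window of the shifted framing picture inside the glasses image.
          x0, y0 = max(dx, 0), max(dy, 0)
          x1, y1 = min(dx + vw, w), min(dy + vh, h)
          if x1 <= x0 or y1 <= y0:
              return 0.0
          g = glasses_img[y0:y1, x0:x1].astype(int)
          v = viewfinder[y0 - dy:y1 - dy, x0 - dx:x1 - dx].astype(int)
          repeated = np.count_nonzero(np.abs(g - v) <= gray_tol)
          return repeated / float(h * w)

      # With a 640x480 glasses image and a repeated region of roughly 320x150 pixels,
      # the result is about (320*150)/(640*480) = 0.156, matching the example above.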
  • in step S404, the percentage of repeated picture is compared with the first preset threshold and the second preset threshold, and how the framing information is displayed on the lens of the smart glasses is determined from the comparison result; for example, if the percentage of repeated picture is greater than the first preset threshold, step S405 is performed; if it is less than the first preset threshold and greater than the second preset threshold, step S406 is performed; and if it is less than the second preset threshold, step S407 is performed.
  • the first preset threshold and the second preset threshold may be determined according to the resolution of the imaging device and the imaging device of the smart glasses, thereby ensuring the calculation accuracy of the percentage of the repeated pictures.
  • step S405 if the percentage of the repeated picture is greater than the first preset threshold, the framing information is displayed on the lens of the smart glasses in the form of a complete finder frame.
  • as shown in FIG. 3C, a cup appears in the image picture 311 currently captured by the smart glasses 31; when the percentage of repeated picture between the framing picture and the image picture for the scene containing the cup is determined to be greater than the first preset threshold, this indicates that the cup appears at substantially the same angle in each picture, and the framing picture can be regarded as consistent with the image picture.
  • in this case, the framing information can be displayed on the lens 32 of the smart glasses 31 in the form of the complete finder frame 30; after putting on the smart glasses 31, the user can learn the current framing range of the photographing device 11 from the complete finder frame 30.
  • the complete finder frame 30 can be displayed on the lens 32 in a colored frame.
  • the framing information 30 is presented on the lens 32 in the form of a green frame when the framing picture is substantially coincident with the image frame.
  • step S406 if the percentage of the repeated picture is smaller than the first preset threshold and greater than the second preset threshold, the framing information is displayed on the lens in a partial finder frame.
  • as shown in FIG. 3D, while wearing the smart glasses 31 the user captures, from another angle, an image picture 312 through the camera of the smart glasses 31; judging from the contents of the framing picture 321 of the photographing device 11 and the image picture 312, image detection techniques can determine that the percentage of repeated picture between the two pictures becomes smaller.
  • for example, when the percentage of repeated picture is less than the first preset threshold and greater than the second preset threshold, this indicates a deviation between the current framing directions of the smart glasses 31 and the photographing device 11, and the framing information 30 is displayed on the lens 32 in the form of a partial finder frame.
  • in step S407, if the percentage of repeated picture is less than the second preset threshold, prompt information for guiding the user to adjust the direction of the field of view is displayed on the lens.
  • as shown in FIG. 3E, while wearing the smart glasses 31 the user captures, from yet another angle, an image picture 313 through the camera of the smart glasses 31; judging from the contents of the picture 321 captured by the photographing device 11 and the image picture 313, image detection techniques can determine that the percentage of repeated picture becomes smaller still.
  • for example, when it is less than the second preset threshold, the cup in the image picture 313 is essentially beyond the field of view of the smart glasses 31, and there is a large deviation between the current framing directions of the smart glasses 31 and the photographing device 11; in this case, the direction in which the user needs to move can be presented on the lens 32, for example by an arrow prompting the user to adjust the current field of view, so as to guide the user until the framing picture is substantially consistent with the image picture, after which the framing information 30 is displayed on the lens 32, letting the user know the current field of view of the photographing device 11. The decision logic of steps S405 to S407 is sketched below.
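  • the threshold decisions of steps S404 to S407 can be summarised by the sketch below; the concrete threshold values are illustrative only, since the disclosure states merely that they depend on the resolutions of the two cameras:

      from enum import Enum

      class FramingDisplay(Enum):
          FULL_FRAME = "complete finder frame (e.g. green)"
          PARTIAL_FRAME = "partial finder frame (e.g. red)"
          ADJUST_PROMPT = "prompt guiding the user to adjust the field of view"

      def choose_display_mode(repeated_pct: float,
                              first_threshold: float = 0.6,
                              second_threshold: float = 0.2) -> FramingDisplay:
          """Map the repeated-picture percentage onto one of the three display
          behaviours of steps S405-S407; threshold values are illustrative."""
          if repeated_pct > first_threshold:
              return FramingDisplay.FULL_FRAME
          if repeated_pct > second_threshold:   # below the first, above the second threshold
              return FramingDisplay.PARTIAL_FRAME
          return FramingDisplay.ADJUST_PROMPT

      # choose_display_mode(0.156) -> FramingDisplay.ADJUST_PROMPT with these illustrative thresholds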
  • this embodiment determines, according to the percentage of repeated picture, how the framing information of the photographing device is displayed on the lens of the smart glasses, without affecting the user's actual field of view, so that the user can watch the current shooting range of the photographing device through the smart glasses while also viewing the real scene in front of them, which enhances the user experience with smart glasses.
  • FIG. 5 is a flowchart of a method for displaying framing information according to an exemplary embodiment; using the above method provided by the embodiments of the present disclosure, this embodiment takes as an example displaying the framing information of the photographing device on the lens of the smart glasses according to the similarity value between the subject in the framing picture corresponding to the framing information and the subject in the image picture captured by the camera of the smart glasses, and is described with reference to FIG. 3A to FIG. 3E. As shown in FIG. 5, the following steps are included:
  • step S501 the framing information of the photographing device is determined.
  • step S501 For the description of step S501, refer to the description of step S101 above, and details are not described herein.
  • step S502 a framing screen corresponding to the framing information is determined.
  • step S502 For the description of step S502, refer to the description of step S402 above, and details are not described herein.
  • in step S503, a similarity value between the subject of the framing picture and the subject in the image picture captured by the camera of the smart glasses is determined.
  • the similarity value between the two may be determined by performing a similarity analysis on the correspondence between the picture content (the subject, in the embodiments of the present disclosure) of the framing picture captured by the photographing device and of the image picture captured by the camera of the smart glasses.
  • in step S504, the similarity value is compared with the third preset threshold and the fourth preset threshold, and how the framing information is displayed on the lens of the smart glasses is determined from the comparison result; for example, if the similarity value is greater than the third preset threshold, step S505 is performed; if it is less than the third preset threshold and greater than the fourth preset threshold, step S506 is performed; and if it is less than the fourth preset threshold, step S507 is performed.
  • the third preset threshold and the fourth preset threshold may be determined according to the resolutions of the photographing device and the camera of the smart glasses, thereby ensuring the accuracy of the calculation of the similarity value. One possible similarity measure is sketched below.
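  • the disclosure does not name a specific "image detection technique" for the similarity value; one possible sketch, using ORB feature matching from OpenCV as an assumed stand-in, is the following:

      import cv2
      import numpy as np

      def subject_similarity(viewfinder: np.ndarray, glasses_img: np.ndarray) -> float:
          """Return a similarity value in [0, 1] between the subject in the framing
          picture and the subject in the glasses image, using ORB feature matching.
          Both inputs are expected to be 8-bit grayscale images."""
          orb = cv2.ORB_create(nfeatures=500)
          kp1, des1 = orb.detectAndCompute(viewfinder, None)
          kp2, des2 = orb.detectAndCompute(glasses_img, None)
          if des1 is None or des2 is None or len(kp1) == 0:
              return 0.0
          matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
          matches = matcher.match(des1, des2)
          # Keep only close descriptor matches as evidence of the same subject.
          good = [m for m in matches if m.distance < 40]
          return len(good) / float(len(kp1))

      # similarity = subject_similarity(viewfinder_gray, glasses_gray)
      # The result is then compared with the third and fourth preset thresholds (step S504).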
  • step S505 if the similarity value is greater than the third preset threshold, the framing information is displayed on the lens of the smart glasses in the form of a complete finder frame.
  • as shown in FIG. 3B and FIG. 3C, a cup appears both in the current framing picture 321 of the photographing device 11 and in the image picture 311 currently captured by the smart glasses 31; if image detection techniques determine that the similarity value between the two is greater than the third preset threshold and that the cup appears at substantially the same angle in each picture, the framing directions of the framing picture 321 and the image picture 311 can be regarded as substantially consistent.
  • in this case, the framing information of the photographing device 11 can be displayed on the lens 32 of the smart glasses 31 in the form of the finder frame 30; after putting on the smart glasses 31, the user can learn the current framing range of the photographing device 11 from the finder frame 30.
  • the finder frame 30 can be displayed on the lens 32 as a coloured frame; for example, when the framing directions of the framing picture 321 and the image picture 311 are substantially consistent, the finder frame 30 is presented on the lens 32 as a green frame.
  • step S506 if the similarity value is smaller than the third preset threshold and greater than the fourth preset threshold, the framing information is displayed on the lens in a partial finder frame.
  • as shown in FIG. 3D, while wearing the smart glasses 31 the user captures, from another angle, an image picture 312 through the camera of the smart glasses 31; judging from the contents of the framing picture 321 captured by the photographing device 11 and the image picture 312, image detection techniques can determine that the similarity value for the cup between the two pictures becomes smaller.
  • for example, when the similarity value for the cup is less than the third preset threshold and greater than the fourth preset threshold, this indicates a deviation between the current framing directions of the smart glasses 31 and the photographing device 11; the framing information can then be displayed on the lens 32 in the form of a partial finder frame, presented as a red frame, so as to prompt the user to make a small adjustment of the viewing direction.
  • once the framing picture is detected to be substantially consistent with the image picture captured by the smart glasses, the framing information is displayed on the lens 32 in the form of a complete finder frame.
  • step S507 if the similarity value is smaller than the fourth preset threshold, prompt information for guiding the user to adjust the visual field direction is displayed on the lens.
  • as shown in FIG. 3E, while wearing the smart glasses 31 the user captures, from yet another angle, an image picture 313 through the camera of the smart glasses 31; judging from the contents of the framing picture 321 captured by the photographing device 11 and the image picture 313, image detection techniques can determine that the similarity value for the cup becomes smaller still.
  • for example, when it is less than the fourth preset threshold, the cup in the image picture 313 is essentially beyond the field of view of the smart glasses 31, and there is a large deviation between the current framing directions of the smart glasses 31 and the photographing device 11.
  • in this case, the direction in which the user needs to move can be prompted on the lens 32, for example by an arrow prompting the user to adjust the current field of view, so as to guide the user until the framing picture is substantially consistent with the image picture captured by the smart glasses, after which the framing information is displayed on the lens 32 in the form of a complete finder frame, so that the user can know the current field of view of the photographing device 11.
  • this embodiment determines, according to the similarity value, how the framing information of the photographing device is displayed on the lens of the smart glasses, without affecting the user's actual field of view, so that the user can view the current shooting range of the photographing device through the smart glasses while still seeing the real scene in front of them, which enhances the user experience with smart glasses.
  • FIG. 6 is a block diagram of an apparatus for displaying framing information according to an exemplary embodiment. As shown in FIG. 6, the apparatus for displaying framing information includes:
  • a determining module 61 configured to determine framing information of the photographing device
  • the display module 62 is configured to display the framing information determined by the determining module 61 on the display device.
  • FIG. 7 is a block diagram of another apparatus for displaying framing information according to an embodiment of the present invention.
  • the display module 62 may include:
  • the beam conversion sub-module 621 is configured to convert the framing information determined by the determining module 61 into a parallel beam, and the boundary of the parallel beam is determined by the framing information;
  • the projection sub-module 622 is configured to project a parallel beam converted by the beam conversion sub-module 621 onto the display device.
  • FIG. 8 is a block diagram of another apparatus for displaying framing information according to an embodiment of the present invention.
  • the display module 62 may include:
  • the first determining sub-module 623 is configured to determine a framing picture corresponding to the framing information determined by the determining module 61;
  • a second determining sub-module 624 configured to determine a percentage of a repeated picture between the view screen determined by the first determining sub-module 623 and the image frame captured by the camera of the smart glasses;
  • the first display sub-module 625 is configured to display the framing information on the lens of the smart glasses according to the percentage of the repeated pictures determined by the second determining sub-module 624.
  • the first display sub-module 625 can include:
  • the second display sub-module 6251 is configured to, if the percentage of repeated picture determined by the second determining sub-module 624 is greater than the first preset threshold, display the framing information on the lens of the smart glasses in the form of a complete finder frame;
  • the third display sub-module 6252 is configured to, if the percentage of repeated picture determined by the second determining sub-module 624 is less than the first preset threshold and greater than the second preset threshold, display the framing information on the lens of the smart glasses in the form of a partial finder frame.
  • the apparatus may further include:
  • the first reminding module 63 is configured to display prompt information for guiding the user to adjust the visual field direction on the lens if the percentage of the repeated screen determined by the second determining sub-module 624 is less than the second preset threshold.
  • the display module 62 can include:
  • the third determining sub-module 626 is configured to determine a framing picture corresponding to the framing information determined by the determining module 61;
  • the fourth determining sub-module 627 is configured to determine a similarity value between the subject in the framing picture determined by the third determining sub-module 626 and the subject in the image picture captured by the camera of the smart glasses;
  • the fourth display sub-module 628 is configured to display the framing information on the lens of the smart glasses according to the similarity value determined by the fourth determining sub-module 627.
  • the fourth display sub-module 628 can include:
  • the fifth display sub-module 6281 is configured to display the framing information on the lens of the smart glasses in a complete framing frame if the similarity value determined by the fourth determining sub-module 627 is greater than the third preset threshold;
  • the sixth display sub-module 6282 is configured to display the framing information on the lens in a partial finder frame if the similarity value determined by the fourth determining sub-module 627 is less than the third preset threshold and greater than the fourth preset threshold.
  • the apparatus may further include:
  • the second reminding module 64 is configured to display prompt information for guiding the user to adjust the visual field direction on the lens if the similarity value determined by the fourth determining sub-module 627 is less than the fourth preset threshold.
  • FIG. 9 is a block diagram of an apparatus suitable for displaying framing information, according to an exemplary embodiment.
  • device 900 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a gaming console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
  • device 900 can include one or more of the following components: processing component 902, memory 904, power component 906, multimedia component 908, audio component 910, input/output (I/O) interface 912, sensor component 914, And a communication component 916.
  • Processing component 902 typically controls the overall operation of device 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
  • Processing component 902 can include one or more processors 920 to execute instructions to perform all or part of the steps described above. Additionally, processing component 902 can include one or more modules to facilitate interaction between the processing component 902 and other components. For example, processing component 902 can include a multimedia module to facilitate interaction between multimedia component 908 and processing component 902.
  • Memory 904 is configured to store various types of data to support operation at device 900. Examples of such data include instructions for any application or method operating on device 900, contact data, phone book data, messages, pictures, videos, and the like.
  • the memory 904 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
  • Power component 906 provides power to various components of device 900.
  • Power component 906 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 900.
  • the multimedia component 908 includes a screen between the device 900 and the user that provides an output interface.
  • the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the multimedia component 908 includes a front camera and/or a rear camera. When the device 900 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
  • the audio component 910 is configured to output and/or input an audio signal.
  • audio component 910 includes a microphone (MIC) that is configured to receive an external audio signal when device 900 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 904 or transmitted via communication component 916.
  • the audio component 910 also includes a speaker for outputting an audio signal.
  • the I/O interface 912 provides an interface between the processing component 902 and the peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • Sensor assembly 914 includes one or more sensors for providing device 900 with various aspects of status assessment.
  • sensor component 914 can detect an open/closed state of device 900 and the relative positioning of components, such as the display and keypad of device 900; sensor component 914 can also detect a change in position of device 900 or of a component of device 900, the presence or absence of user contact with device 900, the orientation or acceleration/deceleration of device 900, and a change in the temperature of device 900.
  • Sensor assembly 914 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor component 914 can also include an acceleration sensor, a gyro sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 916 is configured to facilitate wired or wireless communication between device 900 and other devices.
  • the device 900 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof.
  • the communication component 916 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel.
  • the communication component 916 also includes a near field communication (NFC) module to facilitate short range communication.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • device 900 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
  • in an exemplary embodiment, there is also provided a non-transitory computer readable storage medium comprising instructions, such as the memory 904 comprising instructions, executable by the processor 920 of the device 900 to perform the above method.
  • the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Studio Devices (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Eyeglasses (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and device for displaying framing information, used to indicate the framing range of a photographing device on smart glasses without affecting the user's actual field of view. The method comprises: determining framing information of the photographing device; and displaying the framing information on the display device. The method enables the user to view the framing range of the photographing device directly through the display device, realizes the separation of the photographing device from the framing display, and, for the user, simplifies the operation of adjusting the framing range of a digital camera.

Description

Method and device for displaying framing information
This application is based on and claims priority to Chinese patent application No. CN 201510150277.7, filed on March 31, 2015, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular to a method and device for displaying framing information.
Background
With the development of technology, more and more smart wearable devices have entered the lives of ordinary users. Smart wearable devices can not only make users' lives more convenient, but also work better with existing electronic products. Taking smart glasses as an example, the related art presents the image currently captured by a digital camera on the smart glasses, so that the user can view the image currently captured by the digital camera without facing the camera's display screen, thereby reducing the power consumption of the digital camera. However, when the digital camera needs to adjust its current framing range, the user still has to take the smart glasses off and adjust the shooting orientation of the digital camera while watching the finder frame shown on the camera's display screen; for the user, the operation of adjusting the framing range of the digital camera is therefore cumbersome.
Summary
To overcome the problems in the related art, embodiments of the present disclosure provide a method and device for displaying framing information, so as to simplify the operation of viewing the framing range of the image currently captured by a photographing device.
According to a first aspect of the embodiments of the present disclosure, there is provided a method for displaying framing information, comprising:
determining framing information of a photographing device; and
displaying the framing information on a display device.
In an embodiment, displaying the framing information on the display device may include:
converting the framing information into a parallel light beam, a boundary of the parallel light beam being determined by the framing information; and
projecting the parallel light beam onto the display device.
In an embodiment, the display device may be smart glasses, and displaying the framing information on the display device may include:
determining a framing picture corresponding to the framing information;
determining a percentage of repeated picture between the framing picture and an image picture captured by a camera of the smart glasses; and
displaying the framing information on a lens of the smart glasses according to the percentage of repeated picture.
In an embodiment, displaying the framing information on the lens of the smart glasses according to the percentage of repeated picture may include:
if the percentage of repeated picture is greater than a first preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and
if the percentage of repeated picture is less than the first preset threshold and greater than a second preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
In an embodiment, the method may further include:
if the percentage of repeated picture is less than the second preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
In an embodiment, the display device may be smart glasses, and displaying the framing information on the display device may include:
determining a framing picture corresponding to the framing information;
determining a similarity value between a subject in the framing picture and a subject in an image picture captured by a camera of the smart glasses; and
displaying the framing information on a lens of the smart glasses according to the similarity value.
In an embodiment, displaying the framing information on the lens of the smart glasses according to the similarity value may include:
if the similarity value is greater than a third preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and
if the similarity value is less than the third preset threshold and greater than a fourth preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
In an embodiment, the method may further include:
if the similarity value is less than the fourth preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
According to a second aspect of the embodiments of the present disclosure, there is provided a device for displaying framing information, applied to smart glasses, the device comprising:
a determining module configured to determine framing information of a photographing device; and
a display module configured to display the framing information determined by the determining module on a display device.
In an embodiment, the display module may include:
a beam conversion sub-module configured to convert the framing information determined by the determining module into a parallel light beam, a boundary of the parallel light beam being determined by the framing information; and
a projection sub-module configured to project the parallel light beam converted by the beam conversion sub-module onto the display device.
In an embodiment, the display module may include:
a first determining sub-module configured to determine a framing picture corresponding to the framing information determined by the determining module;
a second determining sub-module configured to determine a percentage of repeated picture between the framing picture and an image picture captured by a camera of the smart glasses; and
a first display sub-module configured to display the framing information on a lens of the smart glasses according to the percentage of repeated picture determined by the second determining sub-module.
In an embodiment, the first display sub-module may include:
a second display sub-module configured to, if the percentage of repeated picture determined by the second determining sub-module is greater than a first preset threshold, display the framing information on the lens of the smart glasses in the form of a complete finder frame; and
a third display sub-module configured to, if the percentage of repeated picture determined by the second determining sub-module is less than the first preset threshold and greater than a second preset threshold, display the framing information on the lens of the smart glasses in the form of a partial finder frame.
In an embodiment, the device may further include:
a first reminding module configured to, if the percentage of repeated picture determined by the second determining sub-module is less than the second preset threshold, display on the lens prompt information for guiding the user to adjust the direction of the field of view.
In an embodiment, the display module may include:
a third determining sub-module configured to determine a framing picture corresponding to the framing information determined by the determining module;
a fourth determining sub-module configured to determine a similarity value between the subject in the framing picture determined by the third determining sub-module and a subject in an image picture captured by a camera of the smart glasses; and
a fourth display sub-module configured to display the framing information on a lens of the smart glasses according to the similarity value determined by the fourth determining sub-module.
In an embodiment, the fourth display sub-module may include:
a fifth display sub-module configured to, if the similarity value determined by the fourth determining sub-module is greater than a third preset threshold, display the framing information on the lens of the smart glasses in the form of a complete finder frame; and
a sixth display sub-module configured to, if the similarity value determined by the fourth determining sub-module is less than the third preset threshold and greater than a fourth preset threshold, display the framing information on the lens in the form of a partial finder frame.
In an embodiment, the device may further include:
a second reminding module configured to, if the similarity value determined by the fourth determining sub-module is less than the fourth preset threshold, display on the lens prompt information for guiding the user to adjust the direction of the field of view.
According to a third aspect of the embodiments of the present disclosure, there is provided a device for displaying framing information, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
determine framing information of a photographing device; and
display the framing information on a display device.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: by displaying the framing information of the photographing device on the display device, the user can directly view the framing range of the photographing device through the display device, which realizes the separation of the photographing device from the framing display and, for the user, simplifies the operation of adjusting the framing range of a digital camera.
It should be understood that the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention and, together with the description, serve to explain the principles of the present invention.
FIG. 1A is a flowchart of a method for displaying framing information according to an exemplary embodiment.
FIG. 1B is a schematic diagram of a scenario to which an embodiment of the present disclosure is applied.
FIG. 2 is a flowchart of a method for displaying framing information according to exemplary embodiment one.
FIG. 3A is a system diagram of smart glasses and a photographing device to which an embodiment of the present disclosure is applied.
FIG. 3B is a schematic diagram of a framing picture corresponding to the framing information according to an embodiment of the present disclosure.
FIG. 3C is a first schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure.
FIG. 3D is a second schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure.
FIG. 3E is a third schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure.
FIG. 4 is a flowchart of a method for displaying framing information according to exemplary embodiment two.
FIG. 5 is a flowchart of a method for displaying framing information according to exemplary embodiment three.
FIG. 6 is a block diagram of a device for displaying framing information according to an exemplary embodiment.
FIG. 7 is a block diagram of another device for displaying framing information according to an exemplary embodiment.
FIG. 8 is a block diagram of still another device for displaying framing information according to an exemplary embodiment.
FIG. 9 is a block diagram of a device suitable for displaying framing information according to an exemplary embodiment.
Detailed Description
FIG. 1A is a flowchart of a method for displaying framing information according to an exemplary embodiment, and FIG. 1B is a schematic diagram of a scenario to which an embodiment of the present disclosure is applied. The method for displaying framing information may be applied to devices having a display function, such as smart glasses and glass spectacles. This embodiment is described with reference to FIG. 1B. As shown in FIG. 1A, the method for displaying framing information includes the following steps S101 to S102:
In step S101, framing information of a photographing device is determined.
In an embodiment, the current framing information of the photographing device may be acquired through the viewfinder of the photographing device. In an embodiment, the framing information may be transmitted to the display device by wireless transmission, such as WIFI or Bluetooth, or by wired transmission, such as a USB cable; the present disclosure does not limit the specific manner in which the framing information of the photographing device is transmitted.
In step S102, the framing information is displayed on a display device.
In an embodiment, the framing information may be displayed on the display device in the form of a parallel light beam; in another embodiment, the framing information may also be displayed on the display device in the form of a finder frame.
The embodiment shown in FIG. 1A is described below with reference to FIG. 1B. A cube 10 is located at a position in space, and the photographing device 11 and the user are both located on side A of the cube 10. When the photographing device 11 needs to photograph the cube 10, if the photographing device 11 is currently at a relatively high position, the user cannot directly view the framing information of the photographing device 11 on its display screen. In this case, the method of the present disclosure displays the current framing information of the photographing device 11 on a display device 12, so that the user can view the current framing picture of the photographing device 11 on the display device 12.
In this embodiment, by displaying the framing information of the photographing device on the display device, the user can directly view the framing range of the photographing device through the display device, which realizes the separation of the photographing device from the framing display and, for the user, simplifies the operation of adjusting the framing range of a digital camera.
In an embodiment, displaying the framing information on the display device may include:
converting the framing information into a parallel light beam, a boundary of the parallel light beam being determined by the framing information; and
projecting the parallel light beam onto the display device.
In an embodiment, the display device may be smart glasses, and displaying the framing information on the display device may include:
determining a framing picture corresponding to the framing information;
determining a percentage of repeated picture between the framing picture and an image picture captured by a camera of the smart glasses; and
displaying the framing information on a lens of the smart glasses according to the percentage of repeated picture.
In an embodiment, displaying the framing information on the lens of the smart glasses according to the percentage of repeated picture may include:
if the percentage of repeated picture is greater than a first preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and
if the percentage of repeated picture is less than the first preset threshold and greater than a second preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
In an embodiment, the method may further include:
if the percentage of repeated picture is less than the second preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
In an embodiment, the display device may be smart glasses, and displaying the framing information on the display device may include:
determining a framing picture corresponding to the framing information;
determining a similarity value between a subject in the framing picture and a subject in an image picture captured by a camera of the smart glasses; and
displaying the framing information on a lens of the smart glasses according to the similarity value.
In an embodiment, displaying the framing information on the lens of the smart glasses according to the similarity value may include:
if the similarity value is greater than a third preset threshold, displaying the framing information on the lens of the smart glasses in the form of a complete finder frame; and
if the similarity value is less than the third preset threshold and greater than a fourth preset threshold, displaying the framing information on the lens in the form of a partial finder frame.
In an embodiment, the method may further include:
if the similarity value is less than the fourth preset threshold, displaying on the lens prompt information for guiding the user to adjust the direction of the field of view.
For details of how the framing information of the photographing device is displayed, refer to the subsequent embodiments.
Thus, the above method provided by the embodiments of the present disclosure can realize the separation of the photographing device from the framing display and, for the user, simplifies the operation of adjusting the framing range of a digital camera.
FIG. 2 is a flowchart of a method for displaying framing information according to exemplary embodiment one. This embodiment is described by taking glass spectacles as an example of the display device. As shown in FIG. 2, the method includes the following steps:
In step S201, framing information of a photographing device is determined.
For the description of step S201, refer to the description of step S101 above, which is not repeated here.
In step S202, the framing information is converted into a parallel light beam, a boundary of the parallel light beam being determined by the framing information.
In an embodiment, an electro-optical conversion device, such as a laser or a projector, may be used to convert the framing information into a parallel light beam.
In step S203, the parallel light beam is projected onto the glass spectacles.
In an embodiment, a micro-projection device may be provided on the glass spectacles, and the framing information of the photographing device is projected onto the lenses of the glass spectacles by the micro-projection device, so that the lenses achieve a see-through effect.
In this embodiment, the framing information of the photographing device is projected onto the glass spectacles in the form of a parallel light beam, which realizes the separation of the photographing device from the framing display; in addition, since only the boundary of the framing information is projected onto the glass spectacles, the user can view the current framing range of the photographing device through the spectacles while still seeing the real scene in front of them through the spectacles, so the user's actual field of view is not affected.
Those skilled in the art will understand that, in the above embodiments of the present disclosure, the framing information may also be converted into a parallel light beam and projected onto the display device by a projector.
FIG. 3A is a system diagram of smart glasses and a photographing device to which an embodiment of the present disclosure is applied; FIG. 3B is a schematic diagram of a framing picture corresponding to the framing information according to an embodiment of the present disclosure; FIG. 3C is a first schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure; FIG. 3D is a second schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure; and FIG. 3E is a third schematic diagram of an image picture captured by the smart glasses according to an embodiment of the present disclosure. As shown in FIG. 3A, the smart glasses 31 and the photographing device 11 can communicate wirelessly, for example by WIFI or infrared, so the photographing device 11 can send its framing information to the smart glasses 31. The photographing device 11 may determine the framing information of its lens through a viewfinder provided on the photographing device 11 and send the framing information determined by the viewfinder to the smart glasses 31, and the smart glasses 31 display the framing information of the photographing device 11 on their lens 32 in the form of a finder frame. In an embodiment, the photographing device 11 may be a device capable of capturing digital images, such as a digital camera, an action camera, or an SLR camera.
As shown in FIG. 3B, for the current framing picture 321 of the photographing device 11, the viewfinder of the photographing device 11 can determine its framing information from this framing picture. FIG. 3C shows an image picture 311 captured by the camera of the smart glasses 31 at one angle while the user wears the smart glasses 31; FIG. 3D shows an image picture 312 captured by the camera of the smart glasses 31 at another angle while the user wears the smart glasses 31; and FIG. 3E shows an image picture 313 captured by the camera of the smart glasses 31 at yet another angle while the user wears the smart glasses 31.
For example, in one exemplary scenario, as shown in FIG. 3B and FIG. 3C, a cup appears both in the current framing picture 321 of the photographing device 11 and in the image picture 311 currently captured by the smart glasses 31, and the cup appears at substantially the same angle in each picture. In this case it can be determined that the framing picture is substantially consistent with the direction of the user's field of view, and the framing information of the photographing device 11 can be displayed on the lens 32 of the smart glasses 31 in the form of a finder frame 30; after putting on the smart glasses 31, the user can learn the current framing range of the photographing device 11 from the finder frame 30. In another exemplary scenario, as shown in FIG. 3D, while wearing the smart glasses 31 the user captures, from another angle, an image picture 312 through the camera of the smart glasses 31; judging from the contents of the framing picture 321 of the photographing device 11 and the image picture 312, part of the cup in the image picture 312 already falls outside the field of view of the smart glasses 31, indicating a deviation between the current framing directions of the smart glasses 31 and the photographing device 11. In this case the framing picture of the photographing device 11 can be displayed on the lens 32 in the form of a partial finder frame, reminding the user to make a small adjustment of the viewing direction; once it is detected that the framing picture is substantially consistent with the direction of the user's field of view, the framing information is again displayed on the lens 32 in the form of the finder frame 30. In yet another exemplary scenario, as shown in FIG. 3E, while wearing the smart glasses 31 the user captures, from yet another angle, an image picture 313 through the camera of the smart glasses 31; judging from the contents of the framing picture 321 of the photographing device 11 and the image picture 313, the cup in the image picture 313 is essentially outside the user's current field of view, indicating a large deviation between the current framing directions of the smart glasses 31 and the photographing device 11. In this case, prompt information for guiding the user to adjust the direction of the field of view can be displayed on the lens 32, so as to guide the user to adjust the viewing direction; once it is detected that the framing information is substantially consistent with the direction of the user's field of view, the framing information is displayed on the lens 32 in the form of the finder frame 30.
For situations where the field of view of the photographing device cannot be viewed directly through its viewfinder (for example, the photographing device is an action camera that the user has fixed above the top of the head or is shooting along the ground), this embodiment displays the framing information of the photographing device on the display device, so that the user can directly view the framing range of the photographing device through the display device, which realizes the separation of the photographing device from the framing display and, for the user, simplifies the operation of adjusting the framing range of a digital camera. If the display device is smart glasses, the image currently captured by the photographing device is not rendered on the lenses of the smart glasses, so the user's actual field of view is not affected; the user can thus see the current shooting range of the photographing device through the smart glasses while still seeing the real scene in front of them, which improves the experience of using the smart glasses.
图4是根据一示例性实施例二示出的显示取景信息的方法的流程图;本实施例利用本公开实施例提供的上述方法,以根据取景信息对应的取景画面与智能眼镜的摄像装置所采集的图像画面之间的重复画面的百分比将取景信息显示在智能眼镜的镜片上为例并结合上述图3A至图3E进行示例性说明,如图4所示,包括如下步骤:
在步骤S401中,确定拍摄设备的取景信息。
步骤S401的描述可以参见上述步骤S101的描述,在此不再详述。
在步骤S402中,确定取景信息对应的取景画面。
在一实施例中,取景画面同样可以通过与上述步骤S401中类似的无线或有线传输方式传输至智能眼镜。
在步骤S403中,确定取景画面与智能眼镜的摄像装置所采集的图像画面之间的重复画面的百分比。
在一实施例中,可以通过对取景信息对应的取景画面与智能眼镜的摄像装置采集的图像画面中的灰度信息的对应关系进行相似性和/或一致性的分析,从而确定二者之间的重复画面的百分比,例如,可以将取景画面作为基准,确定智能眼镜的摄像装置采集的图像画面与拍摄设备采集的取景画面中画面重复的范围,例如,假设智能眼镜的摄像装置采集的图像画面的分辨率为640×480,且该图像画面与拍摄设备采集的取景画面中画面重复的范围的像素点的个数为320×150,由此可以确定二者之间的重复画面的百分比为(320×150)/(640×480)≈0.156。
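作为示意,下面的Python代码草图按像素灰度差估计重复画面的百分比;其中"两幅图像已配准且分辨率相同"的前提以及灰度差阈值diff_threshold的取值均为本示例的假设,实际的相似性/一致性分析方式不限于此。

```python
import numpy as np


def overlap_percentage(glasses_gray: np.ndarray, framing_gray: np.ndarray,
                       diff_threshold: int = 20) -> float:
    """按像素灰度差估计重复画面的百分比(示意算法)。

    假设两幅灰度图已配准并缩放到相同分辨率;灰度差小于 diff_threshold 的像素
    视为"重复画面"像素,百分比 = 重复像素个数 / 图像总像素个数。
    """
    assert glasses_gray.shape == framing_gray.shape
    diff = np.abs(glasses_gray.astype(np.int16) - framing_gray.astype(np.int16))
    repeated = np.count_nonzero(diff < diff_threshold)
    return repeated / glasses_gray.size


# 与正文示例对应:若图像分辨率为 640×480,重复区域约含 320×150 个像素,
# 则 overlap_percentage 的返回值约为 (320*150) / (640*480) ≈ 0.156。
```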
在步骤S404中,将重复画面的百分比与第一预设阈值和第二预设阈值进行比较,通过比较结果确定将取景信息显示在智能眼镜的镜片上的方式,例如,如果重复画面的百分比大于第一预设阈值,执行步骤S405,如果重复画面的百分比小于第一预设阈值并且大于第二预设阈值,执行步骤S406,如果重复画面的百分比小于第二预设阈值,执行步骤S407。
在一实施例中,第一预设阈值与第二预设阈值可以根据拍摄设备和智能眼镜的摄像装置的分辨率来确定,由此可以确保重复画面的百分比的计算精确度。
在步骤S405中,如果重复画面的百分比大于第一预设阈值,将取景信息以完整取景框的方式显示在智能眼镜的镜片上。
如图3C所示,智能眼镜31当前所拍摄的图像画面311中出现了杯子,由此在确定杯子所在场景在取景画面和图像画面中的重复画面的百分比大于第一预设阈值时,表明杯子在各自的画面中以大体相同的角度出现,在此情形下,可以确定取景画面与图像画面大体一致,此时可以在智能眼镜31的镜片32上以完整取景框30的方式显示取景信息,用户在佩戴该智能眼镜31后,可以通过完整取景框30获知拍摄设备11当前的取景范围。在一实施例中,可以将完整取景框30以彩色框的方式显示在镜片32上,例如,在取景画面与图像画面大体一致时,完整取景框30以绿色框的形式呈现在镜片32上。
在步骤S406中,如果重复画面的百分比小于第一预设阈值并且大于第二预设阈值,将取景信息以部分取景框的方式显示在镜片上。
如图3D所示,用户佩戴智能眼镜31时从另一个角度通过智能眼镜31的摄像装置采集到的图像画面312,从拍摄设备11的取景画面321和图像画面312的画面内容上来看,通过图像检测技术可以确定取景画面321和图像画面312中的重复画面的百分比将会变小,例如重复画面的百分比小于第一预设阈值并且大于第二预设阈值时,表明智能眼镜31与拍摄设备11当前的取景方向存在偏差,可以在镜片32上以部分取景框的方式显示取景信息,并以红色框的形式呈现在镜片32上,以此提示用户进行小幅度的可视方向调整,当检测到取景画面与图像画面大体一致后,再将取景信息以完整取景框30的方式显示在镜片32上。
在步骤S407中,如果重复画面的百分比小于所述第二预设阈值,在镜片上显示用于引导用户调整视野方向的提示信息。
如图3E所示,用户佩戴智能眼镜31时从再一个角度通过智能眼镜31的摄像装置采集到的图像画面313,从拍摄设备11的取景画面321和图像画面313的画面内容上来看,通过图像检测技术可以确定取景画面321和图像画面313中的重复画面的百分比将会进一步变小,例如小于第二预设阈值时,说明图像画面313中的杯子基本上超出了智能眼镜31的视场范围,智能眼镜31与拍摄设备11当前的取景方向存在较大偏差,此时可以在镜片32上对用户需要移动的方向进行提示,以此引导用户调整其视野方向,直至检测到取景画面与图像画面大体一致后,再将取景信息以完整取景框30的方式显示在镜片32上,例如,通过箭头提示用户调整其当前的视场范围,从而使用户能够获知拍摄设备11当前的视场范围。
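下面用一段示意性的Python代码概括步骤S405至S407的阈值判断逻辑;其中第一、第二预设阈值的具体取值仅为示例,实施例三中基于相似值与第三、第四预设阈值的判断逻辑与此类似。

```python
from enum import Enum, auto


class DisplayMode(Enum):
    FULL_FRAME = auto()     # 以完整取景框的方式显示取景信息
    PARTIAL_FRAME = auto()  # 以部分取景框的方式显示取景信息
    GUIDE_PROMPT = auto()   # 显示用于引导用户调整视野方向的提示信息


def choose_display_mode(ratio: float,
                        first_threshold: float = 0.6,
                        second_threshold: float = 0.2) -> DisplayMode:
    """根据重复画面的百分比选择显示方式;两个阈值的取值仅为示例。"""
    if ratio > first_threshold:
        return DisplayMode.FULL_FRAME
    if ratio > second_threshold:
        return DisplayMode.PARTIAL_FRAME
    return DisplayMode.GUIDE_PROMPT
```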
对于不能直接通过拍摄设备的取景器查看拍摄设备的视场范围的场合(例如,拍摄设备为运动相机,用户将运动相机固定在用户的头顶上方或沿着地面拍摄),本实施例根据重复画面的百分比确定拍摄设备的取景信息在智能眼镜的镜片上的显示方式,不会对用户的实际视野产生影响,从而能够使用户通过智能眼镜观看到拍摄设备当前的拍摄范围的同时,还能够观看到其面前的真实场景,提高了用户使用智能眼镜的体验。
图5是根据一示例性实施例三示出的显示取景信息的方法的流程图;本实施例利用本公开实施例提供的上述方法,以如何根据取景信息对应的取景画面中的主体物与智能眼镜的摄像装置所采集的图像画面中的主体物之间的相似值来将拍摄设备的取景信息显示在智能眼镜的镜片上为例并结合上述图3A至图3E进行示例性说明,如图5所示,包括如下步骤:
在步骤S501中,确定拍摄设备的取景信息。
步骤S501的描述可以参见上述步骤S101的描述,在此不再详述。
在步骤S502中,确定取景信息对应的取景画面。
步骤S502的描述可以参见上述步骤S402的描述,在此不再详述。
在步骤S503中,确定取景画面的主体物与智能眼镜的摄像装置所采集的图像画面中的主体物之间的相似值。
在一实施例中,可以通过对拍摄设备采集的取景画面与智能眼镜的摄像装置采集的图像画面中的画面内容(本公开实施例中的主体物)的对应关系进行相似性分析,从而确定二者之间的相似值。
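作为一种假设的相似值度量,下面的Python代码草图用两幅主体物灰度图的归一化灰度直方图相关系数来计算相似值;主体物的检测与分割过程在此省略,实际可采用的相似性分析方法并不限于该度量。

```python
import numpy as np


def subject_similarity(subject_a: np.ndarray, subject_b: np.ndarray) -> float:
    """以两幅主体物灰度图的归一化灰度直方图相关系数作为相似值(示意度量)。"""
    hist_a, _ = np.histogram(subject_a, bins=64, range=(0, 256), density=True)
    hist_b, _ = np.histogram(subject_b, bins=64, range=(0, 256), density=True)
    corr = np.corrcoef(hist_a, hist_b)[0, 1]
    if np.isnan(corr):            # 某一直方图为常数时相关系数无定义
        return 0.0
    return float(max(corr, 0.0))  # 截断到 [0, 1],便于与预设阈值比较
```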
在步骤S504中,将相似值与第三预设阈值和第四预设阈值进行比较,通过比较结果确定将取景信息显示在智能眼镜的镜片上的方式,例如,如果相似值大于第三预设阈值,执行步骤S505,如果相似值小于第三预设阈值并且大于第四预设阈值,执行步骤S506,如果相似值小于第四预设阈值,执行步骤S507。
在一实施例中,第三预设阈值与第四预设阈值可以根据拍摄设备和智能眼镜的摄像装置的分辨率来确定,由此可以确保相似值的计算精确度。
在步骤S505中,如果相似值大于第三预设阈值,将取景信息以完整取景框的方式显示在智能眼镜的镜片上。
如图3B和图3C所示,拍摄设备11当前的取景画面321和智能眼镜31当前所拍摄的图像画面311中均出现了杯子,如果通过图像检测技术确定二者之间的相似值大于第三预设阈值,表明杯子在各自的画面中以大体相同的角度出现,在此情形下,可以确定取景画面321与图像画面311的取景方向大体一致,此时可以在智能眼镜31的镜片32上以完整取景框30的方式显示拍摄设备11的取景信息,用户在佩戴该智能眼镜31后,可以通过完整取景框30获知拍摄设备11当前的取景范围。在一实施例中,可以将完整取景框30以彩色框的方式显示在镜片32上,例如,在取景画面321与图像画面311的取景方向大体一致时,完整取景框30以绿色框的形式呈现在镜片32上。
在步骤S506中,如果相似值小于第三预设阈值并且大于第四预设阈值,将取景信息以部分取景框的方式显示在镜片上。
如图3D所示,用户佩戴智能眼镜31时从另一个角度通过智能眼镜31的摄像装置采集到的图像画面312,从拍摄设备11的取景画面321和图像画面312的画面内容上来看,通过图像检测技术可以确定取景画面321和图像画面312中的杯子的相似值将会变小,例如,杯子的相似值小于第三预设阈值并且大于第四预设阈值时,表明智能眼镜31与拍摄设备11当前的取景方向存在偏差,可以在镜片32上以部分取景框的方式显示取景信息,并以红色框的形式呈现在镜片32上,以此提示用户进行小幅度的可视方向调整,当检测到取景画面与图像画面大体一致后,再将取景信息以完整取景框的方式显示在镜片32上。
在步骤S507中,如果相似值小于所述第四预设阈值,在镜片上显示用于引导用户调整视野方向的提示信息。
如图3E所示,用户佩戴智能眼镜31时从再一个角度通过智能眼镜31的摄像装置采集到的图像画面313,从拍摄设备11的取景画面321和图像画面313的画面内容上来看,通过图像检测技术可以确定取景画面321和图像画面313中的杯子的相似值将会进一步变小,例如小于第四预设阈值时,说明图像画面313中的杯子基本上超出了智能眼镜31的视场范围,智能眼镜31与拍摄设备11当前的取景方向存在较大偏差,此时可以在镜片32上对用户需要移动的方向进行提示,以此引导用户调整其视野方向,直至检测到取景画面与图像画面大体一致后,再将取景信息以完整取景框的方式显示在镜片32上,例如,通过箭头提示用户调整其当前的视场范围,从而使用户能够获知拍摄设备11当前的视场范围。
对于不能直接通过拍摄设备的取景器查看拍摄设备的视场范围的场合(例如,拍摄设备为运动相机,用户将运动相机固定在用户的头顶上方或沿着地面拍摄),本实施例根据相似值确定拍摄设备的取景信息在智能眼镜的镜片上的显示方式,不会对用户的实际视野产生影响,从而能够使用户通过智能眼镜观看到拍摄设备当前的拍摄范围的同时,还能够观看到其面前的真实场景,提高了用户使用智能眼镜的体验。
本领域技术人员可以理解的是,上述仅以重复画面的百分比和主体物进行示例性说明,本公开还可以通过取景画面和图像画面中的图像特征、图像画面中的纹理特征等信息来确定二者之间在视场范围的一致程度,上述重复画面的百分比和主体物的示例性说明并不能形成对本公开的限制。
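例如,下面的Python代码草图使用OpenCV的ORB特征点匹配来粗略衡量取景画面与图像画面在视场范围上的一致程度;特征类型、匹配策略与比值阈值ratio_test均为本示例的假设,仅用于说明基于图像特征的判断思路。

```python
import cv2


def feature_match_ratio(framing_gray, glasses_gray, ratio_test: float = 0.75) -> float:
    """用 ORB 特征点匹配的数量粗略衡量两画面视场范围的一致程度(示意实现)。"""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(framing_gray, None)
    kp2, des2 = orb.detectAndCompute(glasses_gray, None)
    if des1 is None or des2 is None or len(kp1) == 0:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des1, des2, k=2)
    # 比值检验:最近邻距离明显小于次近邻距离的匹配才视为可靠匹配
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < ratio_test * pair[1].distance]
    return len(good) / len(kp1)
```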
图6是根据一示例性实施例示出的一种显示取景信息的装置的框图,如图6所示,显示取景信息的装置包括:
确定模块61,被配置为确定拍摄设备的取景信息;
显示模块62,被配置为将确定模块61确定的取景信息显示在显示设备上。
图7是根据一示例性实施例示出的另一种显示取景信息的装置的框图,在上述图6所示实施例的基础上,上述显示模块62可包括:
光束转换子模块621,被配置为将确定模块61确定的取景信息转换为平行光束,平行光束的边界由取景信息来确定;
投射子模块622,被配置为将光束转换子模块621转换的平行光束投射到显示设备上。
图8是根据一示例性实施例示出的再一种显示取景信息的装置的框图,在上述图6或图7所示实施例的基础上,上述显示模块62可包括:
第一确定子模块623,被配置为确定确定模块61确定的取景信息对应的取景画面;
第二确定子模块624,被配置为确定第一确定子模块623确定的取景画面与智能眼镜的摄像装置所采集的图像画面之间的重复画面的百分比;
第一显示子模块625,被配置为根据第二确定子模块624确定的重复画面的百分比将取景信息显示在智能眼镜的镜片上。
在一实施例中,第一显示子模块625可包括:
第二显示子模块6251,被配置为如果第二确定子模块624确定的重复画面的百分比大于第一预设阈值,将取景信息以完整取景框的方式显示在智能眼镜的镜片上;
第三显示子模块6252,被配置为如果第二确定子模块624确定的重复画面的百分比小于第一预设阈值并且大于第二预设阈值,将取景信息以部分取景框的方式显示在智能眼镜的镜片上。
在一实施例中,装置还可包括:
第一提醒模块63,被配置为如果第二确定子模块624确定的重复画面的百分比小于第二预设阈值,在镜片上显示用于引导用户调整视野方向的提示信息。
在一实施例中,显示模块62可包括:
第三确定子模块626,被配置为确定确定模块61确定的取景信息对应的取景画面;
第四确定子模块627,被配置为确定第三确定子模块626确定的取景画面的主体物与智能眼镜的摄像装置所采集的图像画面中的主体物之间的相似值;
第四显示子模块628,被配置为根据第四确定子模块627确定的相似值将取景信息显示在智能眼镜的镜片上。
在一实施例中,第四显示子模块628可包括:
第五显示子模块6281,被配置为如果第四确定子模块627确定的相似值大于第三预设阈值,将取景信息以完整取景框的方式显示在智能眼镜的镜片上;
第六显示子模块6282,被配置为如果第四确定子模块627确定的相似值小于第三预设阈值并且大于第四预设阈值,将取景信息以部分取景框的方式显示在镜片上。
在一实施例中,装置还可包括:
第二提醒模块64,被配置为如果第四确定子模块627确定的相似值小于第四预设阈值,在镜片上显示用于引导用户调整视野方向的提示信息。
关于上述实施例中的装置,其中各个模块执行操作的具体方式已经在有关该方法的实施例中进行了详细描述,此处将不做详细阐述说明。
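为便于理解图6所示装置的模块组成,下面给出一个纯软件层面的结构示意(Python);其中确定模块返回的示例数据与显示模块的打印输出均为假设,并非装置的实际实现方式。

```python
class DeterminationModule:
    """确定模块(对应图6中的61):获取拍摄设备的取景信息,此处返回假设的示例数据。"""

    def determine_framing_info(self) -> dict:
        return {"corners": [[0, 0], [1919, 0], [1919, 1079], [0, 1079]]}


class DisplayModule:
    """显示模块(对应图6中的62):将取景信息显示在显示设备上,此处仅以打印代替。"""

    def display(self, framing_info: dict) -> None:
        print("display framing frame:", framing_info["corners"])


class FramingInfoDisplayApparatus:
    """与图6对应的装置组合:确定模块 + 显示模块。"""

    def __init__(self) -> None:
        self.determination_module = DeterminationModule()
        self.display_module = DisplayModule()

    def run(self) -> None:
        info = self.determination_module.determine_framing_info()
        self.display_module.display(info)
```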
图9是根据一示例性实施例示出的一种适用于显示取景信息的装置的框图。例如,装置900可以是移动电话,计算机,数字广播终端,消息收发设备,游戏控制台,平板设备,医疗设备,健身设备,个人数字助理等。
参照图9,装置900可以包括以下一个或多个组件:处理组件902,存储器904,电源组件906,多媒体组件908,音频组件910,输入/输出(I/O)的接口912,传感器组件914,以及通信组件916。
处理组件902通常控制装置900的整体操作,诸如与显示、电话呼叫、数据通信、相机操作和记录操作相关联的操作。处理组件902可以包括一个或多个处理器920来执行指令,以完成上述方法的全部或部分步骤。此外,处理组件902可以包括一个或多个模块,便于处理组件902和其他组件之间的交互。例如,处理组件902可以包括多媒体模块,以方便多媒体组件908和处理组件902之间的交互。
存储器904被配置为存储各种类型的数据以支持在设备900的操作。这些数据的示例包括用于在装置900上操作的任何应用程序或方法的指令,联系人数据,电话簿数据,消息,图片,视频等。存储器904可以由任何类型的易失性或非易失性存储设备或者它们的组合实现,如静态随机存取存储器(SRAM),电可擦除可编程只读存储器(EEPROM),可擦除可编程只读存储器(EPROM),可编程只读存储器(PROM),只读存储器(ROM),磁存储器,快闪存储器,磁盘或光盘。
电源组件906为装置900的各种组件提供电力。电源组件906可以包括电源管理系统,一个或多个电源,及其他与为装置900生成、管理和分配电力相关联的组件。
多媒体组件908包括在所述装置900和用户之间提供一个输出接口的屏幕。在一些实施例中,屏幕可以包括液晶显示器(LCD)和触摸面板(TP)。如果屏幕包括触摸面板,屏幕可以被实现为触摸屏,以接收来自用户的输入信号。触摸面板包括一个或多个触摸传感器以感测触摸、滑动和触摸面板上的手势。所述触摸传感器可以不仅感测触摸或滑动动作的边界,而且还检测与所述触摸或滑动操作相关的持续时间和压力。在一些实施例中,多媒体组件908包括一个前置摄像头和/或后置摄像头。当装置900处于操作模式,如拍摄模式或视频模式时,前置摄像头和/或后置摄像头可以接收外部的多媒体数据。每个前置摄像头和后置摄像头可以是一个固定的光学透镜系统或具有焦距和光学变焦能力的光学透镜系统。
音频组件910被配置为输出和/或输入音频信号。例如,音频组件910包括一个麦克风(MIC),当装置900处于操作模式,如呼叫模式、记录模式和语音识别模式时,麦克风被配置为接收外部音频信号。所接收的音频信号可以被进一步存储在存储器904或经由通信组件916发送。在一些实施例中,音频组件910还包括一个扬声器,用于输出音频信号。
I/O接口912为处理组件902和外围接口模块之间提供接口,上述外围接口模块可以是键盘,点击轮,按钮等。这些按钮可包括但不限于:主页按钮、音量按钮、启动按钮和锁定按钮。
传感器组件914包括一个或多个传感器,用于为装置900提供各个方面的状态评估。例如,传感器组件914可以检测到设备900的打开/关闭状态,组件的相对定位,例如所述组件为装置900的显示器和小键盘,传感器组件914还可以检测装置900或装置900一个组件的位置改变,用户与装置900接触的存在或不存在,装置900方位或加速/减速和装置900的温度变化。传感器组件914可以包括接近传感器,被配置用来在没有任何的物理接触时检测附近物体的存在。传感器组件914还可以包括光传感器,如CMOS或CCD图像传感器,用于在成像应用中使用。在一些实施例中,该传感器组件914还可以包括加速度传感器,陀螺仪传感器,磁传感器,压力传感器或温度传感器。
通信组件916被配置为便于装置900和其他设备之间有线或无线方式的通信。装置900可以接入基于通信标准的无线网络,如WiFi,2G或3G,或它们的组合。在一个示例性实施例中,通信组件916经由广播信道接收来自外部广播管理系统的广播信号或广播相关信息。在一个示例性实施例中,所述通信组件916还包括近场通信(NFC)模块,以促进短程通信。例如,NFC模块可基于射频识别(RFID)技术,红外数据协会(IrDA)技术,超宽带(UWB)技术,蓝牙(BT)技术和其他技术来实现。
在示例性实施例中,装置900可以被一个或多个应用专用集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理设备(DSPD)、可编程逻辑器件(PLD)、现场可编程门阵列(FPGA)、控制器、微控制器、微处理器或其他电子元件实现,用于执行上述方法。
在示例性实施例中,还提供了一种包括指令的非临时性计算机可读存储介质,例如包括指令的存储器904,上述指令可由装置900的处理器920执行以完成上述方法。例如,所述非临时性计算机可读存储介质可以是ROM、随机存取存储器(RAM)、CD-ROM、磁带、软盘和光数据存储设备等。
本领域技术人员在考虑说明书及实践这里公开的发明后,将容易想到本公开的其它实施方案。本申请旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本公开的真正范围和精神由下面的权利要求指出。
应当理解的是,本公开并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围的情况下进行各种修改和改变。本公开的范围仅由所附的权利要求来限制。

Claims (17)

  1. 一种显示取景信息的方法,应用在显示设备上,其特征在于,所述方法包括:
    确定拍摄设备的取景信息;
    将所述取景信息显示在所述显示设备上。
  2. 根据权利要求1所述的方法,其特征在于,所述将所述取景信息显示在所述显示设备上,包括:
    将所述取景信息转换为平行光束,所述平行光束的边界由所述取景信息来确定;
    将所述平行光束投射到所述显示设备上。
  3. 根据权利要求1所述的方法,其特征在于,所述显示设备为智能眼镜,所述将所述取景信息显示在所述显示设备上,包括:
    确定所述取景信息对应的取景画面;
    确定所述取景画面与所述智能眼镜的摄像装置所采集的图像画面之间的重复画面的百分比;
    根据所述重复画面的百分比将所述取景信息显示在所述智能眼镜的镜片上。
  4. 根据权利要求3所述的方法,其特征在于,所述根据所述重复画面的百分比将所述取景信息显示在所述智能眼镜的镜片上,包括:
    如果所述重复画面的百分比大于第一预设阈值,将所述取景信息以完整取景框的方式显示在所述智能眼镜的镜片上;
    如果所述重复画面的百分比小于所述第一预设阈值并且大于第二预设阈值,将所述取景信息以部分取景框的方式显示在所述镜片上。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    如果所述重复画面的百分比小于所述第二预设阈值,在所述镜片上显示用于引导用户调整视野方向的提示信息。
  6. 根据权利要求1所述的方法,其特征在于,所述显示设备为智能眼镜,所述将所述取景信息显示在所述显示设备上,包括:
    确定所述取景信息对应的取景画面;
    确定所述取景画面的主体物与所述智能眼镜的摄像装置所采集的图像画面中的主体物之间的相似值;
    根据所述相似值将所述取景信息显示在所述智能眼镜的镜片上。
  7. 根据权利要求6所述的方法,其特征在于,所述根据所述相似值将所述取景信息显示在所述智能眼镜的镜片上,包括:
    如果所述相似值大于第三预设阈值,将所述取景信息以完整取景框的方式显示在所述智能眼镜的镜片上;
    如果所述相似值小于所述第三预设阈值并且大于第四预设阈值,将所述取景信息以部分取景框的方式显示在所述镜片上。
  8. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    如果所述相似值小于所述第四预设阈值,在所述镜片上显示用于引导用户调整视野方向的提示信息。
  9. 一种显示取景信息的装置,其特征在于,所述装置包括:
    确定模块,被配置为确定拍摄设备的取景信息;
    显示模块,被配置为将所述确定模块确定的所述取景信息显示在所述显示设备上。
  10. 根据权利要求9所述的装置,其特征在于,所述显示模块包括:
    光束转换子模块,被配置为将所述确定模块确定的所述取景信息转换为平行光束,所述平行光束的边界由所述取景信息来确定;
    投射子模块,被配置为将所述光束转换子模块转换的所述平行光束投射到所述显示设备上。
  11. 根据权利要求9所述的装置,其特征在于,所述显示模块包括:
    第一确定子模块,被配置为确定所述确定模块确定的所述取景信息对应的取景画面;
    第二确定子模块,被配置为确定所述取景画面与智能眼镜的摄像装置所采集的图像画面之间的重复画面的百分比;
    第一显示子模块,被配置为根据所述第二确定子模块确定的所述重复画面的百分比将所述取景信息显示在所述智能眼镜的镜片上。
  12. 根据权利要求11所述的装置,其特征在于,所述第一显示子模块包括:
    第二显示子模块,被配置为如果所述第二确定子模块确定的所述重复画面的百分比大于第一预设阈值,将所述取景信息以完整取景框的方式显示在所述智能眼镜的镜片上;
    第三显示子模块,被配置为如果所述第二确定子模块确定的所述重复画面的百分比小于所述第一预设阈值并且大于第二预设阈值,将所述取景信息以部分取景框的方式显示在所述智能眼镜的镜片上。
  13. 根据权利要求12所述的装置,其特征在于,所述装置还包括:
    第一提醒模块,被配置为如果所述第二确定子模块确定的所述重复画面的百分比小于所述第二预设阈值,在所述镜片上显示用于引导用户调整视野方向的提示信息。
  14. 根据权利要求9所述的装置,其特征在于,所述显示模块包括:
    第三确定子模块,被配置为确定所述确定模块确定的所述取景信息对应的取景画面;
    第四确定子模块,被配置为确定所述第三确定子模块确定的所述取景画面的主体物与智能眼镜的摄像装置所采集的图像画面中的主体物之间的相似值;
    第四显示子模块,被配置为根据所述第四确定子模块确定的所述相似值将所述取景信息显示在所述智能眼镜的镜片上。
  15. 根据权利要求14所述的装置,其特征在于,所述第四显示子模块包括:
    第五显示子模块,被配置为如果所述第四确定子模块确定的所述相似值大于第三预设阈值,将所述取景信息以完整取景框的方式显示在所述智能眼镜的镜片上;
    第六显示子模块,被配置为如果所述第四确定子模块确定的所述相似值小于所述第三预设阈值并且大于第四预设阈值,将所述取景信息以部分取景框的方式显示在所述镜片上。
  16. 根据权利要求15所述的装置,其特征在于,所述装置还包括:
    第二提醒模块,被配置为如果所述第四确定子模块确定的所述相似值小于所述第四预设阈值,在所述镜片上显示用于引导用户调整视野方向的提示信息。
  17. 一种显示取景信息的装置,其特征在于,所述装置包括:
    处理器;
    用于存储处理器可执行指令的存储器;
    其中,所述处理器被配置为:
    确定拍摄设备的取景信息;
    将所述取景信息显示在所述显示设备上。
PCT/CN2015/088686 2015-03-31 2015-08-31 显示取景信息的方法及装置 WO2016155227A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
MX2015015743A MX357218B (es) 2015-03-31 2015-08-31 Metodo y aparato para mostrar informacion de encuadre.
JP2017508736A JP6259544B2 (ja) 2015-03-31 2015-08-31 フレーミング情報の表示方法及び装置
RU2015151619A RU2635873C2 (ru) 2015-03-31 2015-08-31 Способ и устройство для отображения информации кадрирования
BR112015030257A BR112015030257A2 (pt) 2015-03-31 2015-08-31 método e aparelho para exibir informação de enquadramento
KR1020157034228A KR101701814B1 (ko) 2015-03-31 2015-08-31 프레이밍 정보를 디스플레이하기 위한 방법 및 장치
US14/955,313 US20160295118A1 (en) 2015-03-31 2015-12-01 Method and apparatus for displaying framing information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510150277.7 2015-03-31
CN201510150277.7A CN104702848B (zh) 2015-03-31 2015-03-31 显示取景信息的方法及装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/955,313 Continuation US20160295118A1 (en) 2015-03-31 2015-12-01 Method and apparatus for displaying framing information

Publications (1)

Publication Number Publication Date
WO2016155227A1 true WO2016155227A1 (zh) 2016-10-06

Family

ID=53349586

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/088686 WO2016155227A1 (zh) 2015-03-31 2015-08-31 显示取景信息的方法及装置

Country Status (8)

Country Link
US (1) US20160295118A1 (zh)
JP (1) JP6259544B2 (zh)
KR (1) KR101701814B1 (zh)
CN (1) CN104702848B (zh)
BR (1) BR112015030257A2 (zh)
MX (1) MX357218B (zh)
RU (1) RU2635873C2 (zh)
WO (1) WO2016155227A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104702848B (zh) * 2015-03-31 2019-02-12 小米科技有限责任公司 显示取景信息的方法及装置
JP2017060078A (ja) * 2015-09-18 2017-03-23 カシオ計算機株式会社 画像録画システム、ユーザ装着装置、撮像装置、画像処理装置、画像録画方法、及びプログラム
US10499001B2 (en) * 2017-03-16 2019-12-03 Gvbb Holdings S.A.R.L. System and method for augmented video production workflow
CN107101633A (zh) * 2017-04-13 2017-08-29 清华大学 一种可呈现疏散指令的智能穿戴设备及疏散指令呈现方法
CN111324267B (zh) * 2020-02-18 2021-06-22 Oppo(重庆)智能科技有限公司 图像显示方法及相关装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1323487A (zh) * 1998-08-14 2001-11-21 英特尔公司 用于产生可投影的对象取景器的方法和设备
US20060072820A1 (en) * 2004-10-05 2006-04-06 Nokia Corporation System and method for checking framing and sharpness of a digital image
CN203800973U (zh) * 2014-04-10 2014-08-27 哈尔滨吐火罗软件有限公司 一种手机相机/手机摄像机附属的取景定位装置
CN104702848A (zh) * 2015-03-31 2015-06-10 小米科技有限责任公司 显示取景信息的方法及装置
CN104765163A (zh) * 2015-04-27 2015-07-08 小米科技有限责任公司 取景信息的显示方法、装置以及智能眼镜

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07140524A (ja) * 1993-11-15 1995-06-02 Canon Inc カメラのファインダー装置
US6977676B1 (en) * 1998-07-08 2005-12-20 Canon Kabushiki Kaisha Camera control system
US20030014212A1 (en) * 2001-07-12 2003-01-16 Ralston Stuart E. Augmented vision system using wireless communications
JP2004128587A (ja) * 2002-09-30 2004-04-22 Minolta Co Ltd デジタルカメラ
JP2005252732A (ja) * 2004-03-04 2005-09-15 Olympus Corp 撮像装置
JP2006211543A (ja) * 2005-01-31 2006-08-10 Konica Minolta Photo Imaging Inc 撮像画角選定システム及び撮像画角選定方法
RU2329535C2 (ru) * 2006-05-24 2008-07-20 Самсунг Электроникс Ко., Лтд. Способ автоматического кадрирования фотографий
JP2008083289A (ja) * 2006-09-27 2008-04-10 Sony Corp 撮像表示装置、撮像表示方法
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays
JP4946914B2 (ja) * 2008-02-26 2012-06-06 株式会社ニコン カメラシステム
JP5136209B2 (ja) * 2008-05-23 2013-02-06 セイコーエプソン株式会社 未現像画像データの現像処理装置、現像処理方法、および現像処理のためのコンピュータプログラム
JP5396098B2 (ja) * 2009-02-17 2014-01-22 オリンパス株式会社 撮像システム及び画像処理方法並びに画像処理プログラム
JP2010206643A (ja) * 2009-03-04 2010-09-16 Fujifilm Corp 撮像装置、方法およびプログラム
KR101487944B1 (ko) * 2010-02-24 2015-01-30 아이피플렉 홀딩스 코포레이션 시각 장애인들을 지원하는 증강 현실 파노라마
US20130278631A1 (en) * 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
JP2012114655A (ja) * 2010-11-24 2012-06-14 Canon Inc 被写体追尾カメラシステム
JP5738657B2 (ja) * 2011-04-08 2015-06-24 オリンパス株式会社 撮像装置
US8767083B2 (en) * 2011-05-17 2014-07-01 Fairchild Semiconductor Corporation Remote display glasses camera system and method
TW201331767A (zh) * 2012-07-04 2013-08-01 Sense Digital Co Ltd 快速設定圖樣取像範圍之方法
JP6235777B2 (ja) * 2012-12-19 2017-11-22 カシオ計算機株式会社 撮像装置、撮像方法及びプログラム、並びに、表示装置、表示方法及びプログラム
JP6337431B2 (ja) * 2013-08-28 2018-06-06 株式会社ニコン システム、サーバ、電子機器およびプログラム
KR102119659B1 (ko) * 2013-09-23 2020-06-08 엘지전자 주식회사 영상표시장치 및 그것의 제어 방법
KR102088020B1 (ko) * 2013-09-26 2020-03-11 엘지전자 주식회사 헤드 마운트 디스플레이 및 제어 방법
CN103533247A (zh) * 2013-10-22 2014-01-22 小米科技有限责任公司 一种自拍方法、装置和终端设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1323487A (zh) * 1998-08-14 2001-11-21 英特尔公司 用于产生可投影的对象取景器的方法和设备
US20060072820A1 (en) * 2004-10-05 2006-04-06 Nokia Corporation System and method for checking framing and sharpness of a digital image
CN101065959A (zh) * 2004-10-05 2007-10-31 诺基亚公司 用于检验数字图像的取景和锐度的系统和方法
CN203800973U (zh) * 2014-04-10 2014-08-27 哈尔滨吐火罗软件有限公司 一种手机相机/手机摄像机附属的取景定位装置
CN104702848A (zh) * 2015-03-31 2015-06-10 小米科技有限责任公司 显示取景信息的方法及装置
CN104765163A (zh) * 2015-04-27 2015-07-08 小米科技有限责任公司 取景信息的显示方法、装置以及智能眼镜

Also Published As

Publication number Publication date
MX357218B (es) 2018-06-29
MX2015015743A (es) 2017-03-20
RU2635873C2 (ru) 2017-11-16
JP2017519461A (ja) 2017-07-13
JP6259544B2 (ja) 2018-01-10
CN104702848A (zh) 2015-06-10
CN104702848B (zh) 2019-02-12
RU2015151619A (ru) 2017-06-06
US20160295118A1 (en) 2016-10-06
KR101701814B1 (ko) 2017-02-02
KR20160127631A (ko) 2016-11-04
BR112015030257A2 (pt) 2017-07-25

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: MX/A/2015/015743; Country of ref document: MX)
ENP Entry into the national phase (Ref document number: 2017508736; Country of ref document: JP; Kind code of ref document: A)
WWE Wipo information: entry into national phase (Ref document number: 2015151619; Country of ref document: RU; Ref document number: 1020157034228; Country of ref document: KR)
REG Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112015030257; Country of ref document: BR)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15887184; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 112015030257; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20151202)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15887184; Country of ref document: EP; Kind code of ref document: A1)