WO2019124681A1 - Camera for internet broadcasting and uploading - Google Patents

Camera for internet broadcasting and uploading Download PDF

Info

Publication number
WO2019124681A1
WO2019124681A1 (PCT/KR2018/010733)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
unit
image
user
server
Prior art date
Application number
PCT/KR2018/010733
Other languages
French (fr)
Korean (ko)
Inventor
이민구
Original Assignee
주식회사 더에스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 더에스
Publication of WO2019124681A1 publication Critical patent/WO2019124681A1/en

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 - Television systems
    • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 - Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/235 - Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/27 - Server based end-user applications
    • H04N 21/274 - Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/61 - Control of cameras or camera modules based on recognised objects
    • H04N 23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/65 - Control of camera operation in relation to power supply
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 - Television systems
    • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a camera for Internet broadcasting and uploading.
  • Due to the popularization of SNS, the number of users who upload pictures or videos taken with a camera to SNS increases every year. SNS users take pictures with a smartphone or camera, edit them on a computer, and upload them to the SNS.
  • the process of uploading photos or images taken by a smartphone is not complicated, but the image quality is relatively poor compared to a general camera.
  • a general camera provides relatively good image quality, but the Android OS is not installed on it, and its communication module or applications are not optimized for uploading pictures to SNS.
  • a problem to be solved by the present invention is to provide a camera for Internet broadcasting and uploading.
  • a problem to be solved by the present invention is to provide a camera in which the process leading up to shooting is not complicated, because a shooting environment optimized for the user is provided with a single button.
  • a camera comprising: a photographing unit attached to a left or right side of the camera to take a photograph or a video; a mount hole located at a lower portion of the camera and mated with a mount that supports and fixes the camera; a transmitting/receiving unit for transmitting the photographed image to a server through data streaming at every first period, which is a predetermined period; and a controller for controlling the devices included in the camera.
  • the camera may further include a battery detecting unit for measuring a remaining battery level and a temperature sensor unit for measuring a temperature inside the camera, and the controller changes the streaming period from a first period to a second period when at least one of the following applies: the remaining battery level is at or below a threshold value, the internal temperature of the camera is at or above a threshold value, or the communication speed is at or below a threshold value; the second period is longer than the first period.
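  • As an illustrative sketch only (not the patent's implementation), the threshold-based switch between the first and second streaming periods described above could be expressed as follows; all function names, period values, and thresholds are assumptions.

```python
# Minimal sketch of the claimed streaming-period switch.
# Period values and thresholds are illustrative assumptions.

FIRST_PERIOD_S = 1.0     # normal streaming period (assumed value)
SECOND_PERIOD_S = 5.0    # longer period used to save battery and reduce heat (assumed value)

BATTERY_THRESHOLD_PCT = 20.0
TEMP_THRESHOLD_C = 60.0
COMM_SPEED_THRESHOLD_MBPS = 1.0

def select_streaming_period(battery_pct: float,
                            internal_temp_c: float,
                            comm_speed_mbps: float) -> float:
    """Return the streaming period: switch to the longer second period when the
    battery is low, the camera is hot, or the link is slow (any one condition suffices)."""
    if (battery_pct <= BATTERY_THRESHOLD_PCT
            or internal_temp_c >= TEMP_THRESHOLD_C
            or comm_speed_mbps <= COMM_SPEED_THRESHOLD_MBPS):
        return SECOND_PERIOD_S
    return FIRST_PERIOD_S

# Example: a low battery alone forces the longer period.
assert select_streaming_period(15.0, 40.0, 10.0) == SECOND_PERIOD_S
```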
  • the camera may further include a face recognizing unit that recognizes a person's face in the photographed image; when the mosaic mode is selected and the face of a specific person is selected in the photographed image, the controller registers the specific person as a first person, applies mosaic processing to the faces of persons other than the first person in the photographed image, and then streams the data to the server through the transceiver.
  • the camera further includes a motion recognition unit that recognizes the motion of the user. When the user's first motion is recognized, the camera switches to the photographing mode; when a second motion is recognized in the photographing mode, it switches to the timer photographing mode; and when a third motion is recognized in the timer photographing mode, a timer is set according to the type of the third motion and a photograph is taken after the timer expires.
  • when the user's fourth motion is recognized, the control unit uploads the photograph taken using the third motion to the user's SNS account set in the control unit.
  • when the first shake correction mode is selected, the control unit recognizes the user's face in the current photographic image, extracts the user's eyes, nose, and lips, sets a first reference line from the brow to the center of the nose and lips, and corrects the angle of the image photographed through the photographing unit so that the angle of the first reference line is maintained.
  • when the second shake correction mode is selected, the control unit sets a second reference line connecting both eyes of the user in the current photographic image, and corrects the angle of the image photographed through the photographing unit so that the angles of the first reference line and the second reference line are maintained.
  • when the camera receives a broadcast mode input from the user, the image received from the camera is transmitted to the broadcast platform linked to the server and broadcast.
  • the server requests the broadcasting platform to transmit an advertisement when the communication state between the camera and the server, or between the server and the broadcasting platform, is poor.
  • when the user's fifth motion is recognized, the control unit requests the server to store the image captured until the user's sixth motion is recognized as a highlight image; when the communication state between the camera and the server, or between the server and the broadcasting platform, is poor, the highlight image is transmitted to the broadcasting platform and broadcast until the communication state is normalized.
  • according to the present invention, a captured photograph or video can be uploaded directly to the SNS without first storing it on a computer and uploading it from there.
  • according to the present invention, a multi-channel live streaming server capable of interworking with a plurality of cameras can be used to check the images captured by the cameras at a glance, which can be usefully applied in various fields such as CCTV.
  • the present invention can be used as a black box detachable from a vehicle.
  • FIG. 1 is a conceptual diagram showing an embodiment of a network for carrying out the present invention.
  • FIG. 2 is a block diagram showing an embodiment of apparatuses constituting the network of the present invention.
  • FIG. 3 is a block diagram illustrating an embodiment of a camera according to an embodiment of the present invention.
  • FIG. 4 is a perspective view of a camera according to a first embodiment of the present invention.
  • FIG. 5 is a perspective view of a camera according to a second embodiment of the present invention.
  • FIG. 6 is a first perspective view of a camera according to a third embodiment of the present invention.
  • FIG. 7 is a second perspective view of a camera according to a third embodiment of the present invention.
  • FIG. 8 is a conceptual diagram illustrating an operation process of a display screen according to an embodiment of the present invention.
  • FIG. 9 is a conceptual diagram showing an embodiment of devices that support the use of the camera of the present invention.
  • FIG. 10 is a conceptual diagram illustrating a screen storage method according to an embodiment of the present invention.
  • spatially relative terms can be used to easily describe a correlation between an element and other elements.
  • Spatially relative terms should be understood in terms of the directions shown in the drawings, including the different orientations of components during use or operation. For example, when inverting an element shown in the figures, an element described as "below" or "beneath" another element may be placed "above" the other element.
  • the exemplary term “below” can include both downward and upward directions.
  • the components can also be oriented in different directions, so that spatially relative terms can be interpreted according to orientation.
  • FIG. 1 is a conceptual diagram showing an embodiment of a network for carrying out the present invention.
  • a system for supporting operation of a camera of the present invention includes at least one of a camera 100, a server 200, and a user terminal 300.
  • the camera 100, the server 200, and the user terminal 300 are connected to the network 400.
  • The camera 100 of the present invention is described in detail with reference to FIGS. 3 to 7 and 9.
  • Examples of the server 200 of the present invention include a cloud server, an IP Multimedia Subsystem (IMS) server, a telephony application server, an instant messaging server, a media gateway control function (MGCF) server, a messaging gateway (MSG) server, and a call session control function (CSCF) server.
  • the server 200 may be implemented as a device capable of transmitting and receiving data, such as a personal computer (PC), a notebook computer, or a tablet PC.
  • Examples of the terminal 300 of the present invention include a desktop computer, a laptop computer, a tablet personal computer, a wireless phone, a mobile phone, a smartphone, a mobile station (MS), a machine-type communication (MTC) device, a machine-to-machine (M2M) device, a device-to-device (D2D) device, a user equipment (UE), a wireless terminal (WT), an access terminal (AT), a wireless transmit/receive unit (WTRU), a subscriber station (SS), a subscriber unit (SU), a personal digital assistant (PDA) with a wireless communication function, a portable game machine with a wireless communication function, a navigation device, a digital camera, a digital multimedia broadcasting (DMB) player, a digital audio recorder, a digital audio player, a digital picture recorder, a digital picture player, a digital video recorder, a digital video player, music storage and playback appliances with wireless communication capability, Internet appliances capable of wireless Internet access and browsing, and portable units or terminals incorporating combinations of such functions.
  • the network 400 of the present invention means a data communication network for transmitting and receiving data between the camera 100, the server 200 and the user terminal 300, and the type thereof is not particularly limited.
  • For example, the network 400 may be an all-IP (Internet Protocol) network that integrates different IP networks.
  • the communication between the camera 100, the server 200, and the user terminal 300 may be performed via wireless Internet such as WiFi (wireless fidelity) or 802.11x (for example, 802.11a, 802.11b, 802.11g, or 802.11n), mobile Internet such as WiBro (wireless broadband internet) or WiMax (world interoperability for microwave access), a 2G (two-generation) mobile communication network such as GSM (global system for mobile communication) or CDMA (code division multiple access), a 3G mobile communication network such as WCDMA (wideband code division multiple access) or CDMA2000, a 3.5G mobile communication network such as HSDPA (high speed downlink packet access) or HSUPA (high speed uplink packet access), a 4G mobile communication network such as LTE (long term evolution) or LTE-Advanced, a 5G (five-generation) mobile communication network, UWB (ultra wide band), Bluetooth, ZigBee, a satellite communication network, or a combination of one or more of these.
  • the camera 100 can perform data streaming through the device-to-device (D2D) communication with other cameras 100 in the vicinity.
  • the network 400 is connected between the cameras 100 for D2D communication.
  • FIG. 2 is a block diagram showing an embodiment of apparatuses constituting the network of the present invention.
  • the device 10 may be the server 200, the user terminal 300, etc., shown in FIG.
  • the device 10 may include at least one processor 11, a memory 12 and a transceiver 13 connected to the network 400 for performing communication. Further, the device 10 may further include an input interface device 14, an output interface device 15, a storage device 16, and the like. Each component included in the device 10 may be connected by a bus 17 and communicate with each other.
  • the output interface device 15 may be a display.
  • the display displays and outputs the information processed by the terminal 300.
  • the display may display, through a UI (User Interface) or GUI (Graphic User Interface), the connection information necessary for a wired/wireless connection, advertisement information, or a request to re-enter the connection information.
  • the display may be at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a 3D display, and there may be two or more displays depending on the implementation.
  • the terminal 300 may be provided with an external display and an internal display at the same time.
  • the processor 11 may execute a program command stored in at least one of the memory 12 and the storage device 16.
  • the processor 11 may refer to a central processing unit (CPU), a graphics processing unit (GPU), or a dedicated processor on which methods in accordance with embodiments of the present invention are performed.
  • Each of the memory 12 and the storage device 16 may be constituted of at least one of a volatile storage medium and a non-volatile storage medium.
  • the memory 12 may comprise at least one of a read only memory (ROM) and a random access memory (RAM).
  • Examples of the random access memory include FPM (fast page mode) DRAM, WRAM (window RAM), EDO (extended data out) RAM, BEDO (burst EDO) RAM, MDRAM (multibank DRAM), SGRAM (synchronous graphics RAM), SDRAM (synchronous dynamic RAM), DRDRAM (direct Rambus DRAM), DDR (double data rate) SDRAM, and PSRAM (pseudostatic RAM).
  • FIG. 3 is a block diagram illustrating an embodiment of a camera according to an embodiment of the present invention.
  • the camera 100 includes at least one of a control unit 105, a memory 110, a photographing unit 115, a flash unit 116, a display 120, an input unit 125, a quick input unit 126, a microphone 130, a speaker 135, an infrared ray receiving unit 140, a storage unit 145, a distance sensor unit 150, an illuminance sensor unit 155, a position sensor unit 160, a humidity sensor unit 165, a battery sensing unit 170, a cable connection unit 175, a mount hole 180, a transceiver unit 185, and a motion recognition unit 190.
  • the control unit 105 can control the memory 110, the photographing unit 115, the flash unit 116, the display 120, the input unit 125, the quick input unit 126, the microphone 130, the speaker 135, the infrared ray receiving unit 140, the storage unit 145, the distance sensor unit 150, the illuminance sensor unit 155, the position sensor unit 160, the humidity sensor unit 165, the battery sensing unit 170, the cable connection unit 175, the mount hole 180, the transceiver unit 185, and the motion recognition unit 190.
  • the control unit 105 may execute a program command stored in at least one of the memory 110 and the storage unit 145.
  • the control unit 105 may be a central processing unit (CPU), a graphics processing unit (GPU), or a dedicated processor on which methods according to embodiments of the present invention are performed.
  • the control unit 105 may terminate the camera 100 if an object is not detected within a predetermined distance for a predetermined period of time through the distance sensor unit 150.
  • the camera 100 may include a gyro sensor unit 152 or a direction sensor unit 153.
  • the direction sensor unit 153 can determine the direction in which the camera 100 looks through the compass function and the GPS function.
  • the control unit 105 can recognize whether the camera 100 has been tilted through the gyro sensor unit 152.
  • the storage unit 145 may store orientation information in the photographed photograph or image. That is, the gyro sensor unit 152 or the direction sensor unit 153 can recognize whether the photographing direction is the first direction or the second direction.
  • the control unit 105 can acquire a normal vector value perpendicular to the horizon through the gyro sensor unit 152, and store the acquired information in the storage unit 145.
  • the control unit 105 can acquire information related to the north-south (NS) poles through the direction sensor unit 153, and store the acquired information in the storage unit 145.
  • the controller 105 acquires information on the battery charge level through the battery detector 170 and may change the streaming cycle from the first cycle to the second cycle when the acquired degree of charge of the battery is lower than a predetermined criterion, where the second period may be longer than the first period.
  • the control unit 105 acquires information on the degree of battery charging through the battery sensing unit 170 and may change the streaming cycle to 0.1 seconds when the acquired degree of charge of the battery is lower than a predetermined standard.
  • the control unit 105 can adjust the magnification of the lens included in the photographing unit 115 to correspond to the amount of light measured by the illuminance sensor unit 155 and the relative distance measured by the distance sensor unit 150.
  • the control unit 105 may provide the outside or inside temperature of the camera 100 measured by the temperature sensor unit 151 through the display 120. If the internal temperature of the camera 100 measured through the temperature sensor unit 151 is higher than a predetermined reference, the control unit 105 may change the streaming cycle from the first cycle to the second cycle. Where the second period may be longer than the first period.
  • the control unit 105 may change the streaming period from the first period to the second period when the communication speed of the camera 100 is lower than a predetermined reference. Where the second period may be longer than the first period.
  • the control unit 105 may control the storage unit 145 to store the location information of the camera 100 when the photographing unit 115 photographs a photograph or an image.
  • the control unit 105 can block the air passages communicating with the outside so that moisture does not penetrate into the camera 100 when the humidity around the camera 100 is equal to or higher than a predetermined reference. Specifically, when the humidity around the camera 100 is equal to or higher than the predetermined reference, the control unit 105 shrinks the waterproof material previously mounted on the input unit 125, the quick input unit 126, the microphone 130, and the speaker 135 so that the air passages communicating with the outside are blocked.
  • the control unit 105 can control the camera 100 based on the command received by the infrared ray receiving unit 140.
  • Each of the memory 110 and the storage unit 145 may be constituted by at least one of a volatile storage medium and a non-volatile storage medium.
  • the memory 110 may comprise at least one of read-only memory (ROM) and random access memory (RAM).
  • Examples of the random access memory include FPM (fast page mode) DRAM, WRAM (window RAM), EDO (extended data out) RAM, BEDO (burst EDO) RAM, MDRAM (multibank DRAM), SGRAM (synchronous graphics RAM), SDRAM (synchronous dynamic RAM), DRDRAM (direct Rambus DRAM), DDR (double data rate) SDRAM, and PSRAM (pseudostatic RAM).
  • the storage unit 145 may include open-source software for storing the photographed image and customizing it for each user.
  • the storage unit 145 may include an extension module, an audio SDK, and the like so that the user can customize it with only a little knowledge. Also, the storage unit 145 can be controlled based on the Android OS.
  • Each button included in the camera 100 can be changed in function according to a user's command.
  • the functions of the 'power button' and the 'quick input unit 126' button of the camera 100 may be swapped. That is, the 'quick input unit 126 button' can be changed to function as a 'power button'.
  • the 'home button' may be changed to carry out 'view captured photo', which is an instruction related to the quick input unit 126.
  • the user can change the interface included in the camera 100 itself.
  • the user can change the function of the application included in the camera 100 itself.
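  • The remappable button behaviour described above can be pictured as a small lookup table; the sketch below is hypothetical, and the button and action names are assumptions rather than the camera's actual interface.

```python
# Illustrative sketch of user-remappable buttons; button and action names are assumptions.
DEFAULT_BUTTON_MAP = {
    "power": "toggle_power",
    "quick_input": "open_quick_shooting_mode",
    "home": "go_home",
}

def swap_buttons(button_map: dict, a: str, b: str) -> dict:
    """Swap the functions assigned to two buttons, e.g. 'power' and 'quick_input'."""
    remapped = dict(button_map)
    remapped[a], remapped[b] = button_map[b], button_map[a]
    return remapped

user_map = swap_buttons(DEFAULT_BUTTON_MAP, "power", "quick_input")
# The quick input button now behaves as the power button.
assert user_map["quick_input"] == "toggle_power"
```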
  • the photographing unit 115 may be attached to the left or right side of the camera 100 to take a photograph or an image.
  • the photographing unit 115 may include an image sensor.
  • the photographing unit 115 may also include an infrared image sensor.
  • the thermal image captured by the photographing unit 115 can be accessed from the Android OS in real time.
  • the camera 100 is equipped with a push-to-talk (PTT) app linked to a disaster safety network, and can interwork with wired/wireless audio headsets and Bluetooth communication.
  • Push to talk is a concept that is often used in radio communications. It can mean “hold and talk”.
  • the button on a transceiver can be a PTT button. When the PTT button is pressed, the voice is transmitted to the other radio terminals set to the same radio frequency. Voice signals cannot be received while the PTT button is pressed.
  • the PTT function is implemented as software (SW) by the radio communication terminal maker and application developer.
  • As the PTT service changed to a data-based concept rather than a frequency signal, various additional services appeared. Unlike conventional PTT, which only transmits voice, such services can transmit text as well as images.
  • the camera 100 of the present invention is equipped with a push-to-talk (PTT) app linked to a disaster safety network, and can transmit images as well as text using wired/wireless audio headsets and Bluetooth communication.
  • the photographing unit 115 may include a camera 100 lens.
  • the photographing unit 115 may include a camera 100 lens having a 140-degree angle of view.
  • the flash unit 116 may be a device that provides light for shooting according to the illuminance around the camera 100.
  • the flash unit 116 may support a function of emitting light immediately before shooting if a dark environment, i.e., an illuminance lower than a predetermined reference, is measured. Whether the flash unit 116 operates can be preset through the flash on/off option. When the camera 100 is set to flash on, the flash unit 116 may operate during shooting.
  • the display 120 displays information processed by the camera 100 and provides a user interface (UI) or a graphical user interface (GUI).
  • the display 120 may be at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display, and there may be two or more displays depending on the implementation.
  • the camera 100 may be provided with an external display and an internal display at the same time.
  • the input unit 125 may include a power button for turning the camera 100 on or off, a touch pad included in the display 120, and a multitasking button, a home button, and a cancel button arranged side by side below the display 120.
  • the multitasking button may include the ability to present all currently running applications at display 120 at one time.
  • the home button may include a function of moving to a page set as a home button page, and the like.
  • the cancel button may include a function of undoing or moving to an upper page, and the like.
  • By pressing the home button and the power button at the same time, a screen shot of the display screen can be obtained.
  • a screen shot of the display screen can be obtained by simultaneously pressing the multi-tasking button and the power button.
  • a screen shot of the display screen can be obtained by simultaneously pressing the cancel button and the power button.
  • a screen shot of the display screen can be obtained by simultaneously pressing the home button and the multi-tasking button.
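  • The screenshot button combinations listed above amount to a simple lookup against the currently pressed buttons; the sketch below is illustrative only, and the button identifiers are assumptions.

```python
# Illustrative sketch: any of the listed two-button combinations triggers a screenshot.
SCREENSHOT_COMBOS = [
    {"home", "power"},
    {"multitask", "power"},
    {"cancel", "power"},
    {"home", "multitask"},
]

def is_screenshot_combo(pressed_buttons: set) -> bool:
    """Return True when the currently pressed buttons match a screenshot combination."""
    return pressed_buttons in SCREENSHOT_COMBOS

assert is_screenshot_combo({"power", "home"})
assert not is_screenshot_combo({"home"})
```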
  • the quick input unit 126 may be a button that can be switched to an interface preset by the user so that the user can take an image immediately.
  • the user can directly go to the image shooting mode set by the user through the quick input unit 126 button.
  • For example, through the quick input unit 126 button, the user can go directly to a preset photographing mode in which the flash is turned on, the picture is taken 5 seconds later, and the screen magnification is enlarged.
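  • A minimal sketch of such a one-button preset is shown below, assuming a hypothetical camera control object with set_flash, set_zoom, and take_photo methods; none of these names come from the patent.

```python
# Illustrative one-button shooting preset; all names are hypothetical.
from dataclasses import dataclass
import time

@dataclass
class ShootingPreset:
    flash_on: bool = True      # flash turned on
    timer_seconds: int = 5     # shoot 5 seconds after the button press
    zoom_factor: float = 2.0   # enlarged screen magnification

class CameraStub:
    """Stand-in for the camera control API (hypothetical)."""
    def set_flash(self, on: bool): print("flash on" if on else "flash off")
    def set_zoom(self, factor: float): print(f"zoom x{factor}")
    def take_photo(self): print("photo taken")

def apply_quick_preset(camera, preset: ShootingPreset) -> None:
    """One press of the quick input button: apply the stored preset, wait, then shoot."""
    camera.set_flash(preset.flash_on)
    camera.set_zoom(preset.zoom_factor)
    time.sleep(preset.timer_seconds)
    camera.take_photo()

apply_quick_preset(CameraStub(), ShootingPreset())
```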
  • the microphone 130 can receive a user's voice or a sound generated in the surroundings and convert the received sound into an electronic signal.
  • the microphone 130 can be used mainly by a user who performs BJ broadcasting.
  • the speaker 135 may provide effect sounds, processed sounds, shot sounds, and the like, which may occur in the camera 100.
  • the infrared ray receiver 140 can recognize nearby objects or the like through infrared rays.
  • the infrared ray receiver 140 can recognize nearby objects or the like through infrared rays in a dark environment where visible light is blocked.
  • the infrared ray receiving unit 140 can receive infrared rays.
  • the infrared ray receiver 140 can receive a specific command.
  • the distance sensor unit 150 can measure the distance between the object near the camera 100 and the camera 100.
  • the distance sensor unit 150 may use a triangulation method (infrared ray type, natural light ray type), an ultrasonic wave method, or the like to measure the distance between points.
  • the ultrasonic method is a method of transmitting ultrasound with sharp directivity toward the object to be measured and measuring the time until the wave reflected from the object is received, in order to find the distance.
  • a piezoelectric element can be used as the receiving sensor.
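  • The time-of-flight relation behind the ultrasonic method is simple: the pulse travels to the object and back, so the one-way distance is half the round trip. A small worked example, assuming the speed of sound in air, is sketched below.

```python
# Time-of-flight distance estimate for the ultrasonic method described above.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C (assumed condition)

def ultrasonic_distance_m(echo_round_trip_s: float) -> float:
    """Distance to the reflecting object from the measured echo round-trip time."""
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0

print(ultrasonic_distance_m(0.010))  # a 10 ms round trip is about 1.7 m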
  • the distance sensor unit 150 may measure a relative distance, which indicates a distance between the camera 100 and an object positioned in the same direction as the photographing unit 115.
  • the illuminance sensor unit 155 can measure the amount of light reaching the photographing unit 115. If the spectral sensitivity of the sensor differs significantly from that of the human eye, an illuminance correction filter can be added to the sensor or the measured value can be corrected with a correction coefficient. As the illuminance sensor, various kinds of photovoltaic cells can be used, and a phototube can also be used for the measurement of very low illuminance.
  • the illuminance measured from the illuminance sensor unit 155 can be used to change the magnification of the lens included in the photographing unit.
  • the position sensor 160 can determine the position of the camera 100 through a GPS module or the like.
  • the humidity sensor unit 165 can measure the humidity in the vicinity of the camera 100.
  • the humidity sensor unit 165 can measure moisture using changes in electrical resistance or capacitance caused by absorption by porous ceramics or a polymer membrane. In addition, the humidity sensor unit 165 can measure moisture using a change in the resonance frequency of the vibrator due to a change in the weight of the absorbing material provided on the vibrator.
  • Examples of the humidity sensor unit 165 include a dry humidity sensor, a hair hygrometer, a lithium chloride humidity sensor, an electrolytic humidity sensor (P2O5 humidity sensor), a polymer membrane humidity sensor, a quartz oscillation type humidity sensor, an aluminum oxide humidity sensor, a thermistor humidity sensor, a microwave humidity sensor, a condensation sensor, a dew point sensor, and the like.
  • the camera 100 may further include a temperature sensor unit 151 for measuring the temperature inside or outside the camera 100.
  • the internal or external temperature of the camera 100 measured at the temperature sensor unit 151 can be used to change the streaming period.
  • the camera 100 includes a gyro sensor unit 152.
  • the gyro sensor unit 152 can measure the tilt of the camera based on the gravity of the camera 100.
  • the battery detection unit 170 may measure the degree of charge of the battery included in the camera 100. The battery detection unit 170 may provide information through the control unit 105 so that the remaining battery level is visually represented on the display 120.
  • the cable connector 175 may be female-mated with a wired connector connectable to an external device.
  • Examples of the cable connection unit 175 may include a micro USB input port, a micro HDMI port, and external expansion pins (U-boot port, SPI port, UART port, I2C port, debugging port, GPIO port, audio port), but are not limited thereto.
  • micro USB input port may include micro USB 2.0 (OTG).
  • Examples of the external expansion pins include USB host 1ch, SPI-1ch, I2C-1ch, GPIO-4pin, UART-1ch, a headset port, and the like.
  • the transmission/reception unit 185 may transmit the photographed image to the server through data streaming at every first period, which is a predetermined period.
  • the mount hole 180 is located under the camera 100 and can be coupled, in a male-and-female manner, with a mount that supports the camera 100 so that it is fixed.
  • the camera 100 includes a battery sensing unit for measuring the remaining battery power and a temperature sensor unit for measuring the internal temperature of the camera 100.
  • the streaming period may be changed from the first period to the second period when at least one of the following applies: the remaining battery level is at or below a threshold value, the internal temperature is at or above a threshold value, or the communication speed is at or below a threshold value. In this case, the second period is longer than the first period.
  • lengthening the streaming cycle reduces battery consumption, so that the speed at which the battery is drained can be decreased and the internal temperature of the camera 100 can be lowered.
  • the camera 100 may further include a face recognition unit for recognizing a face of a person in the photographed image.
  • when the mosaic mode is selected and the face of a specific person is selected in the photographed image, the control unit 105 registers the selected person as a first person, applies mosaic processing to the faces of persons other than the first person, and then streams the data to the server through the transmission/reception unit.
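  • A minimal sketch of this mosaic mode is shown below using OpenCV, which is an assumed library choice rather than the patent's implementation; identifying the registered first person by a bounding-box overlap is also a simplification of the face recognition the text describes.

```python
# Illustrative mosaic mode: detect faces, keep the registered person, pixelate the rest.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def pixelate(region, block=12):
    """Coarsely downsample and upsample a region to produce a mosaic effect."""
    h, w = region.shape[:2]
    small = cv2.resize(region, (max(1, w // block), max(1, h // block)),
                       interpolation=cv2.INTER_LINEAR)
    return cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)

def mosaic_except_first_person(frame, first_person_box):
    """Pixelate every detected face except the one overlapping the registered person."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    fx, fy, fw, fh = first_person_box
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        overlaps = not (x + w < fx or fx + fw < x or y + h < fy or fy + fh < y)
        if not overlaps:  # not the first person: apply the mosaic before streaming
            frame[y:y + h, x:x + w] = pixelate(frame[y:y + h, x:x + w])
    return frame
```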
  • the camera 100 further includes a motion recognition unit 190 for recognizing the motion of the user.
  • when the user's first motion is recognized, the motion recognition unit 190 switches to the photographing mode.
  • when the second motion is recognized in the photographing mode, the motion recognition unit 190 switches to the timer photographing mode.
  • a timer can be set according to the type of the third motion, and a picture can be taken after the timer.
  • the user can therefore set the timer and take a photograph using only motions, even after placing the camera 100 at a distance, for example on a tripod.
  • each motion is stored in advance, or the user can set each motion.
  • a motion indicating a number can be set.
  • when the user's fourth motion is recognized, the control unit 105 of the camera 100 uploads the photograph taken using the third motion to the SNS account of the user set in the control unit 105.
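  • The motion-driven flow above (first motion: photo mode, second motion: timer mode, third motion: timer length and shot, fourth motion: SNS upload) can be sketched as a small state machine; the motion names and timer values below are assumptions.

```python
# Illustrative state machine for the motion-driven shooting flow; names are assumptions.
TIMER_BY_MOTION = {"three_fingers": 3, "five_fingers": 5, "ten_fingers": 10}

class MotionShootingFlow:
    def __init__(self):
        self.mode = "idle"
        self.timer_seconds = None

    def on_motion(self, motion: str) -> str:
        if motion == "first_motion":                      # e.g. raising a hand
            self.mode = "photo"
        elif motion == "second_motion" and self.mode == "photo":
            self.mode = "timer_photo"
        elif self.mode == "timer_photo" and motion in TIMER_BY_MOTION:
            self.timer_seconds = TIMER_BY_MOTION[motion]  # third motion sets the timer
            return f"shoot in {self.timer_seconds}s"
        elif motion == "fourth_motion":
            return "upload last photo to SNS"
        return self.mode

flow = MotionShootingFlow()
flow.on_motion("first_motion")
flow.on_motion("second_motion")
print(flow.on_motion("five_fingers"))  # shoot in 5s
```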
  • using the shake correction mode, the camera 100 may correct the photographed image based on a reference line.
  • when the first shake correction mode is selected, the control unit 105 recognizes the user's face in the photographed image, extracts the user's eyes, nose, and lips, sets a first reference line from the brow to the center of the nose and lips, and adjusts the angle of the image photographed through the photographing unit 115 so that the angle of the first reference line is maintained.
  • For example, when the user shoots while riding a bicycle, the image shakes as the bicycle and the user's body shake.
  • In the shake correction mode, the first reference line from the brow to the center of the nose and lips is set on the user's face in the shot image, and the image is corrected so that the angle of the first reference line is maintained, which reduces the shaking of the image.
  • when the second shake correction mode is selected, a second reference line connecting both eyes of the user is set in the current photographic image, and the angle of the image photographed through the photographing unit 115 is corrected so that the angles of the first reference line and the second reference line are maintained.
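  • A minimal sketch of the second shake-correction mode is shown below: each frame is counter-rotated so that the line between the user's eyes returns to a fixed reference angle. OpenCV is an assumed choice, and the eye positions are taken as given (for example from a face landmark detector); this is not the patent's implementation.

```python
# Illustrative roll correction based on the eye-to-eye reference line.
import math
import cv2

def stabilize_by_eye_line(frame, left_eye, right_eye, reference_angle_deg=0.0):
    """Rotate the frame so the eye-to-eye line returns to its reference angle."""
    (lx, ly), (rx, ry) = left_eye, right_eye
    current_angle = math.degrees(math.atan2(ry - ly, rx - lx))
    correction = current_angle - reference_angle_deg
    center = ((lx + rx) / 2.0, (ly + ry) / 2.0)
    h, w = frame.shape[:2]
    rotation = cv2.getRotationMatrix2D(center, correction, 1.0)
    return cv2.warpAffine(frame, rotation, (w, h))
```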
  • when the camera 100 receives a broadcast mode input from the user, the image received from the camera 100 is transmitted to a broadcast platform linked to the server and broadcast.
  • when the server receives the broadcast mode from the user, the server can transmit the broadcast to a broadcast platform, such as YouTube, linked to the server.
  • when the communication state is poor, the server requests the broadcasting platform to transmit an advertisement.
  • the communication state between the camera 100 and the server may become poor due to specific circumstances, and the communication state between the server and the broadcasting platform may also become poor. In this case, even if the broadcast is transmitted, it cannot be delivered to viewers smoothly. Therefore, the server can send a prepared advertisement image instead of broadcasting the shot image.
  • the camera 100 further includes a motion recognition unit 190 for recognizing the motion of the user.
  • when the user's fifth motion is recognized, the camera requests the server to store the image captured until the user's sixth motion is recognized as a highlight image; when the communication state between the camera 100 and the server, or between the server and the broadcasting platform, is poor, the highlight image is transmitted to the broadcasting platform and broadcast until the communication state is normalized.
  • With the above configuration, a highlight image can easily be recorded through the user's fifth and sixth motions and stored on the server.
  • When the communication state is poor, an advertisement may be transmitted to the broadcasting platform, or a highlight image previously selected and stored by the user may be transmitted, so that viewers can watch the highlight image until the communication state is normalized.
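  • The fallback behaviour described above reduces to a small decision on the server side; the sketch below is illustrative only, and the source names are assumptions.

```python
# Illustrative server-side fallback: live feed when possible, otherwise a stored
# highlight clip (recorded between the fifth and sixth motions) or a prepared advertisement.
def choose_broadcast_source(camera_link_ok: bool, platform_link_ok: bool,
                            highlight_available: bool) -> str:
    """Pick what the server forwards to the broadcasting platform."""
    if camera_link_ok and platform_link_ok:
        return "live_stream"
    if highlight_available:
        return "highlight_clip"
    return "advertisement"

print(choose_broadcast_source(camera_link_ok=False, platform_link_ok=True,
                              highlight_available=True))  # highlight_clip
```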
  • FIG. 4 is a perspective view of a camera according to a first embodiment of the present invention.
  • In FIG. 4, a camera 100 can be seen in which the display 120 is located on the left side and the battery and the battery detecting unit 170 are located on the right side, based on the photographing direction of the first photographing unit 115-1.
  • the photographing unit 115, the flash unit 116, and the infrared ray receiving unit 140 may be positioned on one plane, but the present invention is not limited thereto.
  • In the first embodiment, the flash unit 116 and the infrared ray receiving unit 140 protrude together with the photographing unit 115, since the photographing unit 115 could easily be broken if only the photographing unit 115 protruded alone.
  • Alternatively, the photographing unit 115, the flash unit 116, and the infrared ray receiving unit 140 can be formed in one plane as a whole.
  • the display 120 may be positioned broadly on the side of the camera 100.
  • the touch pad of the input unit 125 may be positioned on the display 120, allowing the display to function as a touch screen.
  • the multitasking button, the home button, and the cancel button in the input unit 125 may be located on the lower side of the touch pad.
  • the cable connection portion 175 shown in FIG. 4 may be a micro USB among the cable connection portions 175.
  • the camera 100 may include a plurality of photographing portions.
  • the first photographing unit 115-1 included in the right side of the camera 100 includes a lens and can take a photograph or an image in the lateral direction of the display 120.
  • the second photographing unit 115-2 includes a lens and can take a photograph or an image in the same direction as the light emitted from the display 120. That is, the second photographing unit 115-2 can allow the user to photograph his or her face while looking at it.
  • the flash unit 116 may be positioned between the photographing unit 115 and the infrared ray receiving unit 140.
  • the flash portion 116 may be polygonal or circular.
  • the flash portion 116 may preferably be rectangular or circular. In FIG. 4, the flash unit 116 is shown in a circular shape, but is not limited to this shape.
  • the input unit 125 and the quick input unit 126 may be positioned on the upper side of the camera 100 in parallel.
  • the infrared ray receiving unit 140 may be located below the photographing unit 115.
  • the microphone 130 may be positioned below the infrared receiver 140.
  • FIG. 5 is a perspective view of a camera according to a second embodiment of the present invention.
  • In FIG. 5, a camera 100 can be seen in which the battery and the battery sensing unit 170 are located on the left side and the display 120 is located on the right side, based on the photographing direction of the photographing unit 115.
  • the photographing unit 115, the flash unit 116, and the infrared ray receiving unit 140 may be located on one plane, but the present invention is not limited thereto.
  • As in the first embodiment, the flash unit 116 and the infrared ray receiving unit 140 protrude together with the photographing unit 115, since the photographing unit 115 could easily be broken if only the photographing unit 115 protruded alone.
  • Alternatively, the photographing unit 115, the flash unit 116, and the infrared ray receiving unit 140 can be formed in one plane as a whole.
  • the speaker 135 may be located in a plane parallel to the battery and the battery sensing unit 170, but is not limited thereto.
  • the cable connection 175 shown in FIG. 5 may be a micro HDMI among the cable connections 175.
  • the input unit 125 and the quick input unit 126 may be positioned on the upper side of the camera 100 in parallel.
  • the flash unit 116 may be positioned between the photographing unit 115 and the infrared ray receiving unit 140.
  • the flash unit 116 is represented by a circular shape, but the present invention is not limited thereto.
  • the flash unit 116 can receive information about the dark environment from the illumination sensor unit 155 and determine whether to illuminate the image when shooting.
  • FIG. 6 is a first perspective view of a camera according to a third embodiment of the present invention.
  • In FIG. 6, a camera 100 can be seen in which the battery and the battery sensing unit 170 are located on the left side and the display 120 is located on the right side, based on the photographing direction of the photographing unit 115.
  • In the third embodiment, the flash unit 116 and the infrared ray receiving unit 140 are located on one plane, and only the photographing unit 115 protrudes.
  • the display 120 may be positioned broadly on the side of the camera 100.
  • the touch pad of the input unit 125 may be positioned on the display 120, allowing the display to function as a touch screen.
  • the multitasking button, the home button, and the cancel button in the input unit 125 may be located on the lower side of the touch pad.
  • the cable connection portion 175 shown in FIG. 6 may be a micro USB among the cable connection portions 175.
  • the photographing unit 115 included in the right side of the camera 100 includes a lens and can take a photograph or an image.
  • the flash unit 116 may be positioned between the photographing unit 115 and the infrared ray receiving unit 140.
  • the flash portion 116 may be polygonal or circular.
  • the flash portion 116 may preferably be rectangular or circular.
  • the flash unit 116 is represented by a quadrangle, but the present invention is not limited thereto.
  • the infrared ray receiving unit 140 may be located below the photographing unit 115.
  • the microphone 130 may be positioned below the infrared receiver 140.
  • FIG. 7 is a second perspective view of a camera according to a third embodiment of the present invention.
  • In FIG. 7, a camera 100 can be seen in which the battery and the battery detecting unit 170 are located on the left side, based on the photographing direction of the photographing unit 115.
  • As in FIG. 6, the flash unit 116 and the infrared ray receiving unit 140 are located on one plane, and only the photographing unit 115 protrudes.
  • the cable connection portion 175 illustrated in FIG. 7 may be an external expansion pin among the cable connection portions 175.
  • the mount hole 180 may be located below the camera 100.
  • FIG. 8 is a conceptual diagram illustrating an operation process of a display screen according to an embodiment of the present invention.
  • the screen of the display 120 includes, as touch-pad buttons, a settings button, a screen lock button, a quick menu button, a photograph button, a gallery button, and the like.
  • FIGS. 8B and 8C illustrate that, when a photo or an image stored in the storage unit 145 is loaded, an upload screen for the corresponding photo or image can be provided automatically when it is touched for a predetermined time or more.
  • FIG. 8B shows a screen in which the corresponding photo or image is clicked.
  • FIG. 8C shows a screen in which a button for uploading the corresponding photo or image to the SNS is displayed.
  • FIG. 9 is a conceptual diagram showing an embodiment of devices that support the use of the camera of the present invention.
  • the camera 100 has a function in which the humidity sensor unit 165 recognizes the humidity in water and the control unit 105 blocks the openings; nevertheless, a moisture barrier device that covers the camera 100 may be used as additional protection against a source of moisture.
  • FIG. 9 also shows a battery charger, which can be a device that allows the battery included in the camera 100 to be removed and charged separately.
  • FIG. 10 is a conceptual diagram illustrating a screen storage method according to an embodiment of the present invention.
  • the control unit 105 can recognize the arrangement of the screen to be photographed.
  • the control unit 105 can generate a virtual screen based on the arrangement of the screen to be photographed. If the screen recognized through the photographing unit 115 is similar to a virtual screen stored in advance, the control unit 105 can provide information about the virtual screen to the user.
  • For example, the control unit 105 can provide the user with information about when a picture similar to the current picture was photographed.
  • the camera 100 captures an image by photographing the real-world space 1000.
  • the plurality of real objects 1100, 1200, 1300, and 1400 in the real-world space may be any two-dimensional or three-dimensional objects.
  • the plurality of real objects 1100, 1200, 1300, and 1400 may have different or similar shapes.
  • the camera 100 can distinguish objects based on this morphological difference.
  • the camera 100 may identify a plurality of objects 2100, 2200, 2300, 2400 in the camera image 4000.
  • the camera 100 may extract contours of a plurality of objects 2100, 2200, 2300, and 2400.
  • the camera 100 may extract the contours of the plurality of objects 2100, 2200, 2300, 2400 through blocks 3100, 3200 that recognize a plurality of objects in the camera image 4000.
  • the camera 100 determines an object matched with a pre-stored image among the plurality of objects 2100, 2200, 2300, and 2400 using the vector value of the outline of the image stored in advance.
  • the camera 100 may provide the image corresponding to the plurality of objects 2100, 2200, 2300, and 2400 and the information of the corresponding image through the display.
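  • A minimal sketch of the contour extraction and matching described above is given below, using OpenCV as an assumed library choice; the threshold method and matching score cut-off are illustrative, not values from the patent.

```python
# Illustrative contour extraction and shape matching against a stored outline.
import cv2

def extract_contours(image_bgr):
    """Binarize the camera image and return the external contours of the objects."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, thresh = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return contours

def best_match(candidate_contours, stored_contour, max_score=0.2):
    """Return the candidate contour most similar to the stored outline, if any.
    Lower matchShapes scores mean a closer match."""
    best, best_score = None, max_score
    for contour in candidate_contours:
        score = cv2.matchShapes(contour, stored_contour, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best, best_score = contour, score
    return best
```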
  • the steps of a method or algorithm described in connection with the embodiments of the present invention may be embodied directly in hardware, in software modules executed in hardware, or in a combination of both.
  • The software module may reside in a random access memory (RAM), a read-only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a flash memory, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable recording medium known in the art to which the present invention pertains.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Theoretical Computer Science (AREA)

Abstract

A camera is provided. The camera comprises: a photographing unit attached to the left or right side of the camera to take a picture or a video; a mount hole located at a lower part of the camera and engaged, in a male-and-female manner, with a mount that supports and fixes the camera; a transmitting and receiving unit for transmitting the photographed video to a server through data streaming at every first interval, which is a predetermined interval; and a controller for controlling the devices comprised in the camera.

Description

Camera for Internet Broadcasting and Uploading
The present invention relates to a camera for Internet broadcasting and uploading.
Due to the popularization of SNS, the number of users who upload pictures or videos taken with a camera to SNS increases every year. SNS users take pictures with a smartphone or camera, edit them on a computer, and upload them to the SNS.
Generally, the process of uploading photos or videos taken with a smartphone is not complicated, but the image quality is relatively poor compared to a general camera. On the other hand, a general camera provides relatively good image quality, but the Android OS is not installed on it, and its communication module or applications are not optimized for uploading pictures to SNS.
A problem to be solved by the present invention is to provide a camera for Internet broadcasting and uploading.
Another problem to be solved by the present invention is to provide a camera that is easy to carry and from which captured images are easy to upload.
Another problem to be solved by the present invention is to provide a camera that can automatically set an optimal, sustainable shooting environment according to its surroundings.
Another problem to be solved by the present invention is to provide a camera in which the process leading up to shooting is not complicated, because a shooting environment optimized for the user is provided with a single button.
The problems to be solved by the present invention are not limited to those mentioned above, and other problems not mentioned will be clearly understood by those skilled in the art from the following description.
According to an embodiment of the present invention for solving the above problems, a camera comprises: a photographing unit attached to a left or right side of the camera to take a photograph or a video; a mount hole located at a lower portion of the camera and mated with a mount that supports and fixes the camera; a transceiver unit for transmitting the captured video to a server through data streaming at every first period, which is a predetermined period; and a control unit for controlling the devices included in the camera.
The camera further includes a battery sensing unit for measuring the remaining battery level and a temperature sensor unit for measuring the internal temperature of the camera, and the control unit changes the streaming period from the first period to a second period when at least one of the following applies: the remaining battery level is at or below a threshold value, the internal temperature of the camera is at or above a threshold value, or the communication speed is at or below a threshold value; the second period is longer than the first period.
The camera may further include a face recognition unit that recognizes a person's face in the captured video; when the mosaic mode is selected and the face of a specific person is selected in the captured video, the control unit registers that person as a first person, applies mosaic processing to the faces of persons other than the first person in the video, and then streams the data to the server through the transceiver unit.
The camera further includes a motion recognition unit that recognizes the user's motion. When the user's first motion is recognized, the camera switches to the photographing mode; when a second motion is recognized in the photographing mode, it switches to the timer photographing mode; and when a third motion is recognized in the timer photographing mode, a timer is set according to the type of the third motion and a photograph is taken after the timer expires.
When the user's fourth motion is recognized, the control unit uploads the photograph taken using the third motion to the user's SNS account configured in the control unit.
When the first shake correction mode is selected, the control unit recognizes the user's face in the current captured video, extracts the user's eyes, nose, and lips, sets a first reference line from the brow to the center of the nose and lips, and corrects the angle of the video captured through the photographing unit so that the angle of the first reference line is maintained.
When the second shake correction mode is selected, the control unit sets a second reference line connecting both eyes of the user in the current captured video, and corrects the angle of the video captured through the photographing unit so that the angles of the first reference line and the second reference line are maintained.
When the camera receives a broadcast mode input from the user, the server transmits the video received from the camera to a broadcast platform linked to the server so that it is broadcast.
The server requests the broadcast platform to transmit an advertisement when the communication state between the camera and the server, or between the server and the broadcast platform, is poor.
The camera further includes a motion recognition unit that recognizes the user's motion; when the user's fifth motion is recognized, the camera requests the server to store the video captured until the user's sixth motion is recognized as a highlight video, and when the communication state between the camera and the server, or between the server and the broadcast platform, is poor, the highlight video is transmitted to the broadcast platform and broadcast until the communication state is normalized.
According to the present invention, a captured photograph or video can be uploaded directly to the SNS without first storing it on a computer and uploading it from there.
In addition, according to the present invention, one-person media broadcasting can easily be performed using a streaming server.
In addition, according to the present invention, humidity recognition makes it possible to shoot video underwater without damaging the device, so that a user can easily capture the desired footage even during water sports.
In addition, according to the present invention, a multi-channel live streaming server capable of interworking with a plurality of cameras can be used to check the images captured by the cameras at a glance, which can be usefully applied in various fields such as CCTV.
In addition, according to the present invention, the camera can also be used as a black box detachable from a vehicle.
The effects of the present invention are not limited to those mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.
FIG. 1 is a conceptual diagram showing an embodiment of a network for carrying out the present invention.
FIG. 2 is a block diagram showing an embodiment of the devices constituting the network of the present invention.
FIG. 3 is a block diagram showing an embodiment of a camera according to an embodiment of the present invention.
FIG. 4 is a perspective view of a camera according to a first embodiment of the present invention.
FIG. 5 is a perspective view of a camera according to a second embodiment of the present invention.
FIG. 6 is a first perspective view of a camera according to a third embodiment of the present invention.
FIG. 7 is a second perspective view of a camera according to a third embodiment of the present invention.
FIG. 8 is a conceptual diagram illustrating the operation of a display screen according to an embodiment of the present invention.
FIG. 9 is a conceptual diagram showing embodiments of devices that support the use of the camera of the present invention.
FIG. 10 is a conceptual diagram illustrating a screen memorization method according to an embodiment of the present invention.
The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described in detail below with reference to the accompanying drawings. The present invention, however, is not limited to the embodiments disclosed below and may be embodied in many different forms; these embodiments are provided only so that this disclosure will be complete and will fully convey the scope of the invention to those of ordinary skill in the art to which the present invention pertains, and the present invention is defined only by the scope of the claims.
The terminology used herein is for the purpose of describing the embodiments and is not intended to limit the present invention. In this specification, the singular also includes the plural unless specifically stated otherwise. As used herein, "comprises" and/or "comprising" do not exclude the presence or addition of one or more elements other than those mentioned. Like reference numerals refer to like elements throughout the specification, and "and/or" includes each of the mentioned elements and every combination of one or more of them. Although the terms "first", "second", and so on are used to describe various elements, these elements are of course not limited by these terms; the terms are used only to distinguish one element from another. Accordingly, a first element mentioned below may also be a second element within the technical spirit of the present invention.
Unless otherwise defined, all terms used herein (including technical and scientific terms) have the meaning commonly understood by those of ordinary skill in the art to which the present invention belongs. In addition, terms defined in commonly used dictionaries are not to be interpreted ideally or excessively unless expressly so defined.
Spatially relative terms such as "below", "beneath", "lower", "above", and "upper" may be used to easily describe the relationship of one element to other elements as shown in the drawings. Spatially relative terms should be understood as encompassing different orientations of the elements in use or operation in addition to the orientation shown in the drawings. For example, if an element shown in the drawings is turned over, an element described as "below" or "beneath" another element would then be placed "above" that other element. Thus, the exemplary term "below" can encompass both a downward and an upward orientation. An element may also be oriented in other directions, in which case the spatially relative terms are interpreted according to the orientation.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a conceptual diagram showing an embodiment of a network for carrying out the present invention.
Referring to FIG. 1, a system for supporting the operation of the camera of the present invention includes at least one of a camera 100, a server 200, and a user terminal 300. The camera 100, the server 200, and the user terminal 300 are connected through a network 400.
The camera 100 of the present invention is described in detail with reference to FIGS. 3 to 7 and 9.
Examples of the server 200 of the present invention include a cloud server, an IMS (IP Multimedia Subsystem) server, a telephony application server, an IM (Instant Messaging) server, an MGCF (Media Gateway Control Function) server, an MSG (Messaging Gateway) server, and a CSCF (Call Session Control Function) server. The server 200 may also be implemented as a device capable of transmitting and receiving data, such as a PC (Personal Computer), a notebook computer, or a tablet PC (Tablet Personal Computer).
Examples of the terminal 300 of the present invention include a desktop computer, a laptop computer, a tablet PC, a wireless phone, a mobile phone, a smart phone, a mobile station (MS), a machine-type communication (MTC) device, an M2M (Machine-to-Machine) device, a D2D (Device-to-Device) device, a user equipment (UE), a wireless device, a wireless terminal (WT), an access terminal (AT), a wireless transmit/receive unit (WTRU), a subscriber station (SS), a subscriber unit (SU), a user terminal (UT), a PMP (Portable Multimedia Player), a personal digital assistant (PDA) with a wireless communication function, a portable game console with a wireless communication function, a navigation device, a digital camera, a DMB (Digital Multimedia Broadcasting) player, a digital audio recorder, a digital audio player, a digital picture recorder, a digital picture player, a digital video recorder, a digital video player, a music storage and playback appliance with a wireless communication function, and an Internet appliance capable of wireless Internet access and browsing, as well as portable units or terminals integrating combinations of such functions, but are not limited thereto.
The network 400 of the present invention refers to a data communication network for transmitting and receiving data among the camera 100, the server 200, and the user terminal 300, and its type is not particularly limited.
For example, it may be an IP (Internet Protocol) network that provides a service for transmitting and receiving large amounts of data through the Internet Protocol, or an All-IP network integrating different IP networks.
In the present invention, communication among the camera 100, the server 200, and the user terminal 300 may be carried out over one of, or a combination of at least one of, a wireless Internet such as WiFi (wireless fidelity) or 802.11x (for example, 802.11a, 802.11b, 802.11g, 802.11n, 802.11ac, etc.), a portable Internet such as WiBro (wireless broadband internet) or WiMax (world interoperability for microwave access), a 2G (second generation) mobile communication network such as GSM (global system for mobile communication) or CDMA (code division multiple access), a 3G (third generation) mobile communication network such as WCDMA (wideband code division multiple access) or CDMA2000, a 3.5G mobile communication network such as HSDPA (high speed downlink packet access) or HSUPA (high speed uplink packet access), a 4G (fourth generation) mobile communication network such as an LTE (long term evolution) network or an LTE-Advanced (LTE-A) network, a 5G (fifth generation) mobile communication network, UWB (Ultra Wide Band), Bluetooth, Zigbee, and a satellite communication network.
When the server 200 and the camera 100 are outside the available communication range, the camera 100 may perform data streaming through D2D (device-to-device) communication with another nearby camera 100. The connection of the network 400 between cameras 100 is for this D2D communication.
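As an illustration of this fallback path only, the following is a minimal Python sketch; the callbacks `send_to_server`, `find_peer`, and `send_to_peer` are assumed placeholders injected by the caller and are not part of the disclosed implementation.

```python
def stream_frame(frame, send_to_server, find_peer, send_to_peer):
    """Try the server path first; fall back to D2D relay through a nearby camera."""
    try:
        send_to_server(frame)          # normal path: camera -> server (200)
        return "server"
    except ConnectionError:
        peer = find_peer()             # e.g. another camera 100 within D2D range
        if peer is not None:
            send_to_peer(peer, frame)  # D2D relay path
            return "d2d"
        return "buffered"              # no peer found; the caller may buffer locally
```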
FIG. 2 is a block diagram showing an embodiment of the devices constituting the network of the present invention.
Referring to FIG. 2, a device 10 may be the server 200, the user terminal 300, or the like shown in FIG. 1. The device 10 may include at least one processor 11, a memory 12, and a transceiver 13 connected to the network 400 to perform communication. The device 10 may further include an input interface device 14, an output interface device 15, a storage device 16, and the like. The components included in the device 10 are connected by a bus 17 and can communicate with one another.
The output interface device 15 may be a display. The display shows information processed by the terminal 300; specifically, it may display connection information required for wired or wireless access, advertisement information, or a request to re-enter connection information through a UI (User Interface) or GUI (Graphic User Interface).
In addition, the display may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a 3D display, and depending on the implementation, two or more displays may be present. For example, the terminal 300 may be provided with both an external display and an internal display.
The processor 11 can execute program commands stored in at least one of the memory 12 and the storage device 16. The processor 11 may be a central processing unit (CPU), a graphics processing unit (GPU), or a dedicated processor on which the methods according to the embodiments of the present invention are performed. Each of the memory 12 and the storage device 16 may consist of at least one of a volatile storage medium and a non-volatile storage medium. For example, the memory 12 may consist of at least one of a read-only memory (ROM) and a random access memory (RAM). Examples of random access memory include FPM (Fast Page Mode) DRAM, WRAM (Window RAM), EDO (Extended Data Out) RAM, BEDO (Burst EDO) RAM, MDRAM (Multibank DRAM), SGRAM (Synchronous Graphics RAM), SDRAM (Synchronous Dynamic RAM), DRDRAM (Direct Rambus DRAM), DDR (Double Data Rate) SDRAM, and PSRAM (Pseudostatic RAM).
FIG. 3 is a block diagram showing an embodiment of a camera according to an embodiment of the present invention.
Referring to FIG. 3, the camera 100 includes at least one of a control unit 105, a memory 110, a photographing unit 115, a flash unit 116, a display 120, an input unit 125, a quick input unit 126, a microphone 130, a speaker 135, an infrared receiving unit 140, a storage unit 145, a distance sensor unit 150, an illuminance sensor unit 155, a position sensor unit 160, a humidity sensor unit 165, a battery detection unit 170, a cable connection unit 175, a mount hole 180, a transceiver unit 185, and a motion recognition unit 190.
The control unit 105 can control at least one of the memory 110, the photographing unit 115, the flash unit 116, the display 120, the input unit 125, the quick input unit 126, the microphone 130, the speaker 135, the infrared receiving unit 140, the storage unit 145, the distance sensor unit 150, the illuminance sensor unit 155, the position sensor unit 160, the humidity sensor unit 165, the battery detection unit 170, the cable connection unit 175, the mount hole 180, the transceiver unit 185, and the motion recognition unit 190.
The control unit 105 can execute program commands stored in at least one of the memory 110 and the storage unit 145. The control unit 105 may be a central processing unit (CPU), a graphics processing unit (GPU), or a dedicated processor on which the methods according to the embodiments of the present invention are performed.
The control unit 105 may power off the camera 100 when no object is detected within a predetermined distance for a predetermined period of time through the distance sensor unit 150.
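A minimal sketch of this idle shutdown logic, assuming a hypothetical `object_within(distance)` reading from the distance sensor and a `power_off()` callback; the default distance and timeout values are illustrative, not values specified in this disclosure.

```python
import time

def idle_shutdown_loop(object_within, power_off,
                       max_distance_m=2.0, timeout_s=300.0, poll_s=1.0):
    """Power the camera off when no object stays within max_distance_m for timeout_s."""
    last_seen = time.monotonic()
    while True:
        if object_within(max_distance_m):       # distance sensor reports a nearby object
            last_seen = time.monotonic()
        elif time.monotonic() - last_seen >= timeout_s:
            power_off()                          # predetermined idle time has elapsed
            return
        time.sleep(poll_s)
```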
The camera 100 may include a gyro sensor unit 152 or a direction sensor unit 153. The direction sensor unit 153 can determine the direction in which the camera 100 is facing through a compass function and a GPS function.
The control unit 105 can recognize how far the camera 100 is tilted through the gyro sensor unit 152. The storage unit 145 can store direction information together with the captured photograph or video. That is, the gyro sensor unit 152 or the direction sensor unit 153 can recognize whether the shooting direction is a first direction or a second direction.
For example, when the camera 100 photographs the sky, the control unit 105 can obtain a normal vector perpendicular to the horizon through the gyro sensor unit 152 and store the obtained information in the storage unit 145.
As another example, when the camera 100 is pointed north, the control unit 105 can obtain information related to the N-S poles through the direction sensor unit 153 and store the obtained information in the storage unit 145.
The control unit 105 obtains information on the battery charge level through the battery detection unit 170 and, when the obtained charge level is lower than a predetermined reference, can change the streaming period from a first period to a second period. Here, the second period may be longer than the first period.
For example, assume that the normal streaming period of the camera 100 is 0.01 seconds. The control unit 105 obtains information on the battery charge level through the battery detection unit 170 and, when the obtained charge level is lower than the predetermined reference, can change the streaming period to 0.1 seconds.
The control unit 105 can adjust the magnification of the lens included in the photographing unit 115 according to the amount of light measured by the illuminance sensor unit 155 and the relative distance measured by the distance sensor unit 150.
The control unit 105 can provide the external or internal temperature of the camera 100 measured by the temperature sensor unit 151 through the display 120. In addition, when the internal temperature of the camera 100 measured through the temperature sensor unit 151 is higher than a predetermined reference, the control unit 105 can change the streaming period from the first period to the second period, where the second period may be longer than the first period.
When the communication speed of the camera 100 is lower than a predetermined reference, the control unit 105 can change the streaming period from the first period to the second period, where the second period may be longer than the first period.
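The period selection described above can be summarized in a short sketch. The 0.01 s and 0.1 s periods follow the example given earlier, while the threshold values and the sensor readings passed in are assumptions for illustration only.

```python
FIRST_PERIOD_S = 0.01   # normal streaming period from the example above
SECOND_PERIOD_S = 0.1   # longer second period used to save battery or cool down

def select_streaming_period(battery_pct, internal_temp_c, link_speed_mbps,
                            min_battery_pct=20.0, max_temp_c=60.0, min_speed_mbps=1.0):
    """Return the longer second period when battery, temperature, or link speed is out of range."""
    if (battery_pct < min_battery_pct
            or internal_temp_c > max_temp_c
            or link_speed_mbps < min_speed_mbps):
        return SECOND_PERIOD_S
    return FIRST_PERIOD_S
```

For instance, `select_streaming_period(15.0, 45.0, 10.0)` returns 0.1 s because the battery level is below the assumed 20% threshold, even though temperature and link speed are acceptable.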
When the photographing unit 115 captures a photograph or video, the control unit 105 can control the location information of the camera 100 to be stored together with it in the storage unit 145.
When the humidity around the camera 100 is equal to or higher than a predetermined reference, the control unit 105 can block the air passages leading to the outside so that moisture does not penetrate into the camera 100. Specifically, when the humidity around the camera 100 is equal to or higher than the predetermined reference, the control unit 105 can contract the waterproof material pre-mounted on the input unit 125, the quick input unit 126, the microphone 130, and the speaker 135, thereby blocking the air passages leading to the outside.
The control unit 105 can control the camera 100 based on commands received by the infrared receiving unit 140.
Each of the memory 110 and the storage unit 145 may consist of at least one of a volatile storage medium and a non-volatile storage medium. For example, the memory 110 may consist of at least one of a read-only memory (ROM) and a random access memory (RAM). Examples of random access memory include FPM (Fast Page Mode) DRAM, WRAM (Window RAM), EDO (Extended Data Out) RAM, BEDO (Burst EDO) RAM, MDRAM (Multibank DRAM), SGRAM (Synchronous Graphics RAM), SDRAM (Synchronous Dynamic RAM), DRDRAM (Direct Rambus DRAM), DDR (Double Data Rate) SDRAM, and PSRAM (Pseudostatic RAM).
The storage unit 145 stores the captured video and may include open-source software that allows customization for each user. The storage unit 145 may include extension modules, an audio SDK, and the like so that a user can customize the camera with only modest technical knowledge. The storage unit 145 may also be controlled based on the Android OS.
The function of each button included in the camera 100 can be changed according to a user command. For example, the functions of the 'power button' and the 'quick input unit 126 button' of the camera 100 can be interchanged; that is, the 'quick input unit 126 button' can be changed to function as the 'power button'.
As another example, the 'home button' can be changed to perform the 'view captured photo immediately' function, which is a command associated with the quick input unit 126.
In addition, the user can change the interface of the camera 100 itself, and can also change the functions of the applications included in the camera 100.
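One way to picture this remapping is a simple button-to-action table; the button names and action strings below are illustrative placeholders, not identifiers defined by this disclosure.

```python
# Hedged sketch of user-configurable button mapping.
DEFAULT_MAPPING = {
    "power": "toggle_power",
    "quick": "quick_capture",
    "home": "go_home",
}

def swap_buttons(mapping, button_a, button_b):
    """Exchange the actions bound to two buttons, e.g. 'power' and 'quick'."""
    mapping = dict(mapping)
    mapping[button_a], mapping[button_b] = mapping[button_b], mapping[button_a]
    return mapping

remapped = swap_buttons(DEFAULT_MAPPING, "power", "quick")
# remapped["quick"] == "toggle_power": the quick input button now acts as the power button
```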
The photographing unit 115 may be attached to the left or right side of the camera 100 and can capture photographs or video.
The photographing unit 115 may include an image sensor, and may also include a thermal image sensor. A thermal image captured by the photographing unit 115 can be accessed from the Android OS in real time.
The camera 100 can be equipped with a push-to-talk (PTT) app linked to a disaster safety network and can interwork with wired or wireless audio headsets using Bluetooth communication and the like.
Push-to-talk (PTT) is a concept widely used in radio communication and means 'press and talk'. For example, the button on a walkie-talkie may be a PTT button. When the PTT button is pressed and the user speaks, the voice is transmitted to the other party's radio terminal tuned to the same frequency. While the PTT button is pressed, voice signals cannot be received.
There are many programs in which radio communication terminal manufacturers and application developers have implemented the PTT function in software (SW). In this case, the user can simply download the application and then press the PTT button on the smartphone screen to carry out radio communication.
As the PTT service has shifted from a frequency-signal concept to a data concept, various additional services have also appeared. Unlike conventional PTT, which delivered only voice, not only text but also video can now be transmitted.
The camera 100 of the present invention can be equipped with a push-to-talk (PTT) app linked to a disaster safety network and can transmit not only text but also video by interworking with wired or wireless audio headsets using Bluetooth communication and the like.
The photographing unit 115 may include a camera lens; in a specific embodiment, the photographing unit 115 may include a lens having a 140-degree angle of view.
The flash unit 116 may be a device that provides light before shooting according to the illuminance around the camera 100. For example, the flash unit 116 can support a function of providing light immediately before shooting in a dark environment, that is, when an illuminance lower than a predetermined reference is measured. Whether the flash unit 116 operates can be preset through a flash on/off option. When the flash is set to on in the camera 100, the flash unit 116 may operate during shooting.
The display 120 displays and outputs information processed by the camera 100; specifically, it may display connection information required for wired or wireless access, advertisement information, or a request to re-enter connection information through a UI (User Interface) or GUI (Graphic User Interface).
In addition, the display 120 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a 3D display, and depending on the implementation, two or more displays may be present. For example, the camera 100 may be provided with both an external display and an internal display.
The input unit 125 includes a power button that turns the camera 100 on or off, a touch pad included in the display 120, and a multitasking button, a home button, and a cancel button arranged side by side below the display 120.
The multitasking button may include a function of showing all currently running applications on the display 120 at once. The home button may include a function of moving to the page set as the home page. The cancel button may include a function of undoing an action or moving to the parent page.
A screenshot of the display screen can be obtained by pressing the home button and the power button simultaneously, by pressing the multitasking button and the power button simultaneously, by pressing the cancel button and the power button simultaneously, or by pressing the home button and the multitasking button simultaneously.
The quick input unit 126 may be a button that switches to an interface preset by the user so that an image can be captured immediately. For example, the user can move directly to a video capture mode set by the user through the quick input unit 126 button, or directly to a photo capture mode in which settings preset by the user, such as flash on, shooting after 5 seconds, and an enlarged zoom level, are applied.
The microphone 130 can receive the user's voice or a sound source generated in the surroundings and convert it into an electronic signal. The microphone 130 can mainly be used by users who run personal (BJ) broadcasts.
The speaker 135 can provide sound effects, processing sounds, shutter sounds, and the like that may be generated by the camera 100.
The infrared receiving unit 140 can recognize nearby objects and the like through infrared rays, even in a dark environment where visible light is blocked. The infrared receiving unit 140 can receive infrared rays and can receive specific commands.
The distance sensor unit 150 can measure the distance between the camera 100 and an object near the camera 100. The distance sensor unit 150 may use a triangulation method (using infrared rays or natural light), an ultrasonic method, or the like to measure the distance between points.
The ultrasonic method transmits highly directional ultrasonic waves toward the object to be measured and determines the distance by measuring the time until the reflected wave returns from the object; a piezoelectric element can be used as the receiving sensor.
In addition, the distance sensor unit 150 can measure a relative distance, which refers to the distance between the camera 100 and an object located in the same direction as the photographing unit 115.
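For the ultrasonic method, the distance follows directly from the round-trip time of the echo; a minimal sketch, assuming sound travels at roughly 343 m/s in air:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at room temperature

def ultrasonic_distance_m(round_trip_time_s):
    """Distance to the object: the pulse travels out and back, so divide by two."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

# Example: an echo received after 5.8 ms corresponds to roughly 1 m.
print(ultrasonic_distance_m(0.0058))  # ~0.99
```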
The illuminance sensor unit 155 can measure the amount of light incident on the photographing unit 115. Illuminance measurement requires a spectral sensitivity close to the luminosity curve; when the sensor's sensitivity differs significantly from the luminosity function, a luminosity correction filter can be attached to the sensor or the measured value can be corrected with a correction coefficient. Various photocells can be used as the illuminance sensor, and a phototube can also be used for measuring very low illuminance.
The illuminance measured by the illuminance sensor unit 155 can be used to change the magnification of the lens included in the photographing unit.
The position sensor unit 160 can determine the position of the camera 100 through a GPS module or the like.
The humidity sensor unit 165 can measure the humidity near the camera 100.
The humidity sensor unit 165 can measure moisture using changes in electrical resistance or capacitance caused by absorption into porous ceramics or a polymer membrane. The humidity sensor unit 165 can also measure moisture using the change in the resonance frequency of a vibrator caused by the change in weight of an absorbent material attached to the vibrator.
Examples of the humidity sensor unit 165 include a wet-and-dry-bulb hygrometer, a hair hygrometer, a lithium chloride humidity sensor, an electrolytic humidity sensor (P2O5 humidity sensor), a polymer membrane humidity sensor, a quartz crystal humidity sensor, an aluminum oxide humidity sensor, a ceramic humidity sensor, a thermistor humidity sensor, a microwave humidity sensor, a condensation sensor, and a dew point sensor, but are not limited thereto.
The camera 100 may further include a temperature sensor unit 151 for measuring the temperature inside or outside the camera 100. The internal or external temperature of the camera 100 measured by the temperature sensor unit 151 can be used to change the streaming period.
The camera 100 includes a gyro sensor unit 152. The gyro sensor unit 152 can measure the tilt of the camera and the like based on the gravity acting on the camera 100.
The battery detection unit 170 can measure the charge level of the battery included in the camera 100, and can provide information through the processor 105 so that the remaining battery level is visually displayed on the display 120.
The cable connection unit 175 can mate with a wired connector that can be connected to an external device. Examples of the cable connection unit 175 may include a micro USB input port, a micro HDMI port, and external expansion pins (Uboot port, SPI port, UART port, I2C port, debugging port, GPIO port, audio port), but are not limited thereto.
An example of the micro USB input port is micro USB 2.0 (OTG). Examples of the external expansion pins include USB host 1ch, SPI 1ch, I2C 1ch, GPIO 4-pin, UART 1ch, and a headset port.
The transceiver unit 185 can transmit the captured video to the server through data streaming at every first period, which is a predetermined period.
The mount hole 180 is located at the bottom of the camera 100 and can mate with a mount that supports and fixes the camera 100.
In addition, the camera 100 includes a battery detection unit for measuring the remaining battery level and a temperature sensor unit for measuring the internal temperature of the camera 100, and the control unit 105 can change the streaming period from the first period to the second period when at least one of the following applies: the remaining battery level is at or below a threshold, the internal temperature of the camera 100 is at or above a threshold, or the communication speed is at or below a threshold. Here, the second period is longer than the first period.
Accordingly, by lengthening the streaming period to reduce battery consumption, the rate at which the battery is drained can be slowed, or the internal temperature of the camera 100 can be lowered.
In addition, the camera 100 may further include a face recognition unit that recognizes a person's face in the captured video.
When the user selects a mosaic mode and selects the face of a specific person in the captured video, the control unit 105 registers the selected person as a first person, applies mosaic processing to the faces of persons other than the first person in the captured video, and then streams the video to the server through the transceiver unit.
In recent years, problems such as infringement of portrait rights and invasion of privacy have arisen when photographs and videos are uploaded to places where many third parties can view them, such as SNS and YouTube.
Accordingly, these problems can be addressed by streaming only the face selected by the user without a mosaic and streaming the faces of all other persons after mosaic processing.
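A minimal sketch of the per-frame mosaic step, using OpenCV-style calls: the face boxes and the matching callback `is_registered_person` are assumed to come from a separate face recognition stage, and the pixelation block size is an illustrative choice rather than a value given in this disclosure.

```python
import cv2  # OpenCV is assumed to be available on the device

def mosaic_frame(frame, face_boxes, is_registered_person, block=16):
    """Pixelate every detected face except the registered first person."""
    for (x, y, w, h) in face_boxes:
        face = frame[y:y + h, x:x + w]
        if is_registered_person(face):
            continue                              # keep the first person's face clear
        small = cv2.resize(face, (max(1, w // block), max(1, h // block)))
        frame[y:y + h, x:x + w] = cv2.resize(small, (w, h),
                                             interpolation=cv2.INTER_NEAREST)
    return frame
```

The mosaicked frame would then be handed to the transceiver unit for streaming, so that faces other than the first person never leave the camera unblurred.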
In one embodiment, the camera 100 further includes a motion recognition unit 190 that recognizes the user's motion.
When a first motion of the user is recognized, the motion recognition unit 190 switches to a photo capture mode; when a second motion is recognized in the photo capture mode, it switches to a timer capture mode; and when a third motion is recognized in the timer capture mode, a timer is set according to the type of the third motion and a photograph is taken after the timer expires.
By recognizing the motions agreed upon in sequence as above, the user can place the camera 100 at a distance, for example on a tripod, and then set the timer and take a photograph using motion recognition alone.
Here, each motion may be stored in advance or set individually by the user; for example, in the case of the third motion, a motion indicating a number may be set.
When a fourth motion of the user is recognized, the control unit 105 of the camera 100 uploads the image captured using the third motion to the user's SNS account set in the control unit 105.
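This sequence can be read as a small state machine; in the sketch below the motion labels ("motion1", "count_5", and so on) and the `actions` callback object are illustrative assumptions standing in for the stored or user-defined motions.

```python
def handle_motion(state, motion, actions):
    """Advance the capture state machine on a recognized motion (illustrative labels)."""
    if state == "idle" and motion == "motion1":
        return "photo_mode"                      # first motion: photo capture mode
    if state == "photo_mode" and motion == "motion2":
        return "timer_mode"                      # second motion: timer capture mode
    if state == "timer_mode" and motion.startswith("count_"):
        seconds = int(motion.split("_")[1])      # third motion encodes a number
        actions.take_photo_after(seconds)
        return "idle"
    if motion == "motion4":
        actions.upload_last_photo_to_sns()       # fourth motion: upload to the set SNS account
        return state
    return state
```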
In one embodiment, the camera 100 can correct the captured video based on a reference line using a shake correction mode.
More specifically, when the first shake correction mode is selected, the control unit 105 recognizes the user's face in the captured video, extracts the user's eyes, nose, and lips, sets a first reference line running from the point between the eyebrows to the nose and the center of the lips, and corrects the angle of the video captured through the photographing unit 115 so that the angle of the first reference line is maintained.
For example, even when the camera 100 is fixed by mating its mount hole with a mount provided on a bicycle, the video shakes as the bicycle and the user's body shake during recording.
Accordingly, by selecting the shake correction mode, setting the first reference line from the point between the eyebrows to the nose and the center of the lips on the user's face in the captured video, and correcting the video so that the angle of the first reference line is maintained, shaking of the video can be minimized.
More specifically, when the second shake correction mode is selected, a second reference line connecting both eyes of the user is set in the currently captured video, and the angle of the video captured through the photographing unit 115 is corrected so that the angles of the first reference line and the second reference line are maintained.
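A minimal sketch of the reference-line correction for the first mode, assuming the face landmarks (the point between the eyebrows and the lip center) are already supplied by a separate detector; the frame is rotated so that the reference line keeps the angle it had when the mode was selected.

```python
import math
import cv2

def line_angle_deg(p1, p2):
    """Angle of the line through p1 and p2 in image coordinates, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def stabilize_frame(frame, glabella, lip_center, reference_angle_deg):
    """Rotate the frame so the glabella-to-lip reference line keeps its original angle."""
    current = line_angle_deg(glabella, lip_center)
    h, w = frame.shape[:2]
    rot = cv2.getRotationMatrix2D((w / 2, h / 2), current - reference_angle_deg, 1.0)
    return cv2.warpAffine(frame, rot, (w, h))
```

The second mode would add the eye-to-eye line as a second constraint in the same way, giving a more robust angle estimate when the lower half of the face is occluded.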
In addition, when a broadcast mode is input by the user, the camera 100 causes the video received from the camera 100 to be transmitted to a broadcasting platform with which the server is linked and broadcast thereon.
For example, when the server receives the broadcast mode input from the user, it can transmit the broadcast to a broadcasting platform linked to the server, such as YouTube.
In addition, when the communication state between the camera 100 and the server, or between the server and the broadcasting platform, is poor, the server requests the broadcasting platform to transmit an advertisement.
For example, due to particular circumstances, the communication state between the camera 100 and the server, or between the server and the broadcasting platform, may be poor. In such a case, even if the broadcast is transmitted, there is a high probability that it will not be delivered properly. Accordingly, the server can cause a prepared advertisement video to be transmitted instead of broadcasting the captured video.
In addition, the camera 100 further includes a motion recognition unit 190 that recognizes the user's motion; when a fifth motion of the user is recognized, the camera requests the server to store the captured video, up to the point at which a sixth motion of the user is recognized, as a highlight video.
When the communication state between the camera 100 and the server, or between the server and the broadcasting platform, is poor, the highlight video is transmitted to the broadcasting platform and broadcast until the communication state is restored.
When a user goes outdoors to shoot and broadcasts the footage on a broadcasting platform, situations in which the communication state is poor frequently occur for various reasons. If the broadcast is not delivered properly, viewers may leave to watch other broadcasts or stop watching altogether.
The above configuration is intended to solve this problem: the user can easily record a highlight video through the fifth and sixth motions and have it stored on the server.
Then, when the communication state becomes poor, an advertisement can be transmitted to the broadcasting platform, or the highlight video selected and stored by the user can be transmitted, so that viewers can watch the highlight video until the communication state is restored.
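The two fallback behaviors can be pictured as a single selection routine on the server side; a hedged sketch, where `link_ok`, `highlight_clip`, and `ad_clip` are placeholders for the state the server would actually keep, and the order of preference shown is an assumption.

```python
def choose_stream(live_source, link_ok, highlight_clip=None, ad_clip=None):
    """Pick what the broadcasting platform should play for the next segment."""
    if link_ok:
        return ("live", live_source)          # normal live broadcast
    if highlight_clip is not None:
        return ("highlight", highlight_clip)  # replay the stored highlight until the link recovers
    return ("ad", ad_clip)                    # otherwise fall back to a prepared advertisement
```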
FIG. 4 is a perspective view of a camera according to a first embodiment of the present invention.
Referring to FIG. 4, the camera 100 can be seen with the display 120 located on the left with respect to the shooting direction of the first photographing unit 115-1 and the battery and battery detection unit 170 located on the right with respect to the shooting direction.
In the camera 100 according to the first embodiment, the photographing unit 115, the flash unit 116, and the infrared receiving unit 140 may be located on one plane, but the present invention is not limited thereto. Since the photographing unit 115 could easily be damaged if it alone protruded, in the first embodiment the flash unit 116 and the infrared receiving unit 140 protrude while the photographing unit 115 is included in a non-protruding portion, so that the photographing unit 115, the flash unit 116, and the infrared receiving unit 140 form one plane as a whole.
As for the position of each component of the camera 100, the display 120 may occupy a wide area on the side of the camera 100. The touch pad of the input unit 125 may be located on the display 120 and serve as a touch pad. The multitasking button, home button, and cancel button of the input unit 125 may be arranged side by side below the touch pad.
The cable connection unit 175 shown in FIG. 4 may be the micro USB port among the cable connection units 175. The camera 100 may include a plurality of photographing units. The first photographing unit 115-1 included on the right side of the camera 100 includes a lens and can capture photographs or video in the lateral direction of the display 120. The second photographing unit 115-2 includes a lens and can capture photographs or video in the same direction as the light emitted from the display 120; that is, the second photographing unit 115-2 allows the user to photograph his or her own face while looking at it.
The flash unit 116 may be located between the photographing unit 115 and the infrared receiving unit 140. The flash unit 116 may be polygonal or circular, preferably rectangular or circular. In FIG. 4 the flash unit 116 is shown as circular, but it is not limited to this shape.
The input unit 125 and the quick input unit 126 may be located side by side on the top of the camera 100. The infrared receiving unit 140 may be located below the photographing unit 115, and the microphone 130 may be located below the infrared receiving unit 140.
FIG. 5 is a perspective view of a camera according to a second embodiment of the present invention.
Referring to FIG. 5, the camera 100 can be seen with the battery and battery detection unit 170 located on the left with respect to the shooting direction of the photographing unit 115 and the display 120 located on the right with respect to the shooting direction.
In the camera 100 according to the second embodiment, the photographing unit 115, the flash unit 116, and the infrared receiving unit 140 may be located on one plane, but the present invention is not limited thereto. Since the photographing unit 115 could easily be damaged if it alone protruded, in this embodiment as well the flash unit 116 and the infrared receiving unit 140 protrude while the photographing unit 115 is included in a non-protruding portion, so that the photographing unit 115, the flash unit 116, and the infrared receiving unit 140 form one plane as a whole.
The speaker 135 may be located in a plane parallel to the battery and battery detection unit 170, but is not limited thereto. The cable connection unit 175 shown in FIG. 5 may be the micro HDMI port among the cable connection units 175.
The input unit 125 and the quick input unit 126 may be located side by side on the top of the camera 100. The flash unit 116 may be located between the photographing unit 115 and the infrared receiving unit 140. In FIG. 5 the flash unit 116 is shown as circular, but it is not limited to this shape.
The flash unit 116 can receive information on whether the environment is dark from the illuminance sensor unit 155 and determine whether to provide illumination when shooting.
FIG. 6 is a first perspective view of a camera according to a third embodiment of the present invention.
Referring to FIG. 6, the camera 100 can be seen with the battery and battery detection unit 170 located on the left with respect to the shooting direction of the photographing unit 115 and the display 120 located on the right with respect to the shooting direction.
In the camera 100 according to the third embodiment, the flash unit 116 and the infrared receiving unit 140 are located on one plane, and only the photographing unit 115 may protrude.
As for the position of each component of the camera 100, the display 120 may occupy a wide area on the side of the camera 100. The touch pad of the input unit 125 may be located on the display 120 and serve as a touch pad. The multitasking button, home button, and cancel button of the input unit 125 may be arranged side by side below the touch pad.
The cable connection unit 175 shown in FIG. 6 may be the micro USB port among the cable connection units 175. The photographing unit 115 included on the right side of the camera 100 includes a lens and can capture photographs or video.
The flash unit 116 may be located between the photographing unit 115 and the infrared receiving unit 140. The flash unit 116 may be polygonal or circular, preferably rectangular or circular. In FIG. 6 the flash unit 116 is shown as rectangular, but it is not limited to this shape.
The infrared receiving unit 140 may be located below the photographing unit 115, and the microphone 130 may be located below the infrared receiving unit 140.
FIG. 7 is a second perspective view of a camera according to the third embodiment of the present invention.
Referring to FIG. 7, the camera 100 can be seen with the battery and battery sensing unit 170 located on the left, relative to the shooting direction of the photographing unit 115, and the display 120 located on the right.
In the camera 100 according to the third embodiment, the flash unit 116 and the infrared receiving unit 140 may lie on a single plane, with only the photographing unit 115 protruding.
The cable connection unit 175 shown in FIG. 7 may be, among the cable connection units 175, an external expansion pin. The mount hole 180 may be located on the bottom of the camera 100.
FIG. 8 is a conceptual diagram illustrating how a display screen operates according to an embodiment of the present invention.
Referring to FIG. 8, FIG. 8(a) shows a typical screen of the camera 100. The screen of the display 120 may include, as touch-pad buttons, a settings button, a screen lock button, a quick menu button, a video recording button, a photo capture button, a gallery button, and the like, but is not limited thereto.
FIGS. 8(b) and 8(c) illustrate that, when a photograph or video stored in the storage unit 145 is opened and is touched for longer than a predetermined time, an upload screen can be provided automatically.
FIG. 8(b) may show the screen on which the photograph or video is tapped, and FIG. 8(c) may show the screen on which a button for uploading that photograph or video directly to an SNS is displayed.
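A minimal sketch of this long-press behaviour is given below; the press duration, the handler class, and the callbacks are assumptions made for the example rather than details taken from the disclosure.

```python
import time

LONG_PRESS_SECONDS = 1.0  # assumed "predetermined time" for the sustained touch

class GalleryItemHandler:
    """Sustained touch on a stored photo or video brings up the SNS upload screen,
    as in FIGS. 8(b) and 8(c); a short tap simply opens the item."""

    def __init__(self, show_upload_screen, open_item):
        self._show_upload_screen = show_upload_screen  # renders the upload screen
        self._open_item = open_item                    # normal tap behaviour
        self._pressed_at = None

    def on_touch_down(self):
        self._pressed_at = time.monotonic()

    def on_touch_up(self, item):
        held = 0.0 if self._pressed_at is None else time.monotonic() - self._pressed_at
        self._pressed_at = None
        if held >= LONG_PRESS_SECONDS:
            self._show_upload_screen(item)   # long press: offer direct SNS upload
        else:
            self._open_item(item)            # short tap: open the photo or video
```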
FIG. 9 is a conceptual diagram illustrating an embodiment of devices that support use of the camera of the present invention.
Referring to FIG. 9, FIG. 9(a) shows a moisture-blocking case. The camera 100 already includes a function whereby the humidity sensor 165 detects humidity, even under water, and the control unit 105 is set to close any open openings; nevertheless, the moisture-blocking case can be fitted over the camera 100 to block moisture at the source.
FIG. 9(b) shows a battery charger, which may be a device that charges the battery of the camera 100 separately after the battery has been removed.
FIG. 10 is a conceptual diagram illustrating a scene memory method according to an embodiment of the present invention.
Referring to FIG. 10, the control unit 105 can recognize the arrangement of the scenes being captured. The control unit 105 can generate a virtual scene based on that arrangement. If the scene recognized by the photographing unit 115 is similar to a previously stored virtual scene, the control unit 105 can provide the user with information about that virtual scene.
For example, when the user wants to shoot a particular background with the camera 100, the control unit 105 can inform the user of when a similar photograph was previously taken.
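One way to picture such a lookup is sketched below; the descriptor, the similarity measure, and the threshold are all illustrative assumptions and not the comparison actually used in the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional, Sequence

@dataclass
class StoredScene:
    taken_at: datetime            # when the earlier, similar photo was shot
    descriptor: Sequence[float]   # any fixed-length scene descriptor, e.g. a histogram

def _similarity(a: Sequence[float], b: Sequence[float]) -> float:
    """Histogram intersection in [0, 1]; a stand-in for the scene comparison."""
    return sum(min(x, y) for x, y in zip(a, b)) / (sum(a) or 1.0)

def find_similar_shot(live_descriptor: Sequence[float],
                      stored: List[StoredScene],
                      threshold: float = 0.8) -> Optional[StoredScene]:
    """Return the most similar stored scene if it passes the threshold, so the
    camera can tell the user when a matching photograph was taken."""
    best = max(stored, key=lambda s: _similarity(live_descriptor, s.descriptor),
               default=None)
    if best is not None and _similarity(live_descriptor, best.descriptor) >= threshold:
        return best
    return None
```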
The camera 100 captures a real-world space 1000 and obtains a real-world image. As an example, assume that a plurality of real objects 1100, 1200, 1300, and 1400 exist in the real-world space 1000. The real objects 1100, 1200, 1300, and 1400 may be any two-dimensional or three-dimensional things and may have different or similar shapes. The camera 100 can distinguish the objects based on these differences in shape.
The camera 100 can identify a plurality of objects 2100, 2200, 2300, and 2400 in the camera image 4000. The camera 100 can extract the contours of the objects 2100, 2200, 2300, and 2400 through blocks 3100 and 3200 that recognize the objects in the camera image 4000.
The camera 100 then determines, among the objects 2100, 2200, 2300, and 2400, the object that matches a previously stored image, using the vector values of the contour of the stored image.
The camera 100 can provide, through the display, the image corresponding to each of the objects 2100, 2200, 2300, and 2400 together with information about that image.
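The contour-based matching described above can be sketched with standard image-processing calls; the Otsu thresholding step, the area filter, and the matchShapes distance threshold below are assumptions chosen for the example, not the specific "vector value" comparison of the disclosure.

```python
import cv2
import numpy as np

def extract_contours(image_bgr: np.ndarray, min_area: float = 500.0):
    """Find object outlines in a camera frame (roughly what blocks 3100/3200 do)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) >= min_area]

def best_match(frame_contours, stored_contour, max_distance: float = 0.15):
    """Pick the detected contour closest to a pre-stored outline; cv2.matchShapes
    stands in here for the contour-vector comparison."""
    scored = [(cv2.matchShapes(c, stored_contour, cv2.CONTOURS_MATCH_I1, 0.0), c)
              for c in frame_contours]
    if not scored:
        return None
    distance, contour = min(scored, key=lambda t: t[0])
    return contour if distance <= max_distance else None
```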
The steps of the method or algorithm described in connection with the embodiments of the present invention may be implemented directly in hardware, in a software module executed by hardware, or in a combination of the two. The software module may reside in RAM (Random Access Memory), ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), flash memory, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable recording medium well known in the technical field to which the present invention pertains.
Although embodiments of the present invention have been described above with reference to the accompanying drawings, a person of ordinary skill in the art will understand that the present invention may be carried out in other specific forms without changing its technical idea or essential features. The embodiments described above should therefore be understood as illustrative in all respects and not restrictive.

Claims (10)

  1. A camera comprising:
    a photographing unit attached to a left or right side of the camera and configured to take photographs or videos;
    a mount hole located in a bottom portion of the camera and configured to mate with a mount that supports the camera so that the camera can be fixed;
    a transceiver unit configured to transmit the captured video to a server by data streaming every first period, the first period being a predetermined period; and
    a control unit configured to control the devices included in the camera.
  2. The camera of claim 1, wherein
    the camera includes a battery sensing unit that measures a remaining battery level and a temperature sensor unit that measures an internal temperature of the camera, and
    the control unit
    changes the streaming period from the first period to a second period when at least one of the following applies: the remaining battery level is at or below a threshold, the internal temperature of the camera is at or above a threshold, or the communication speed is at or below a threshold, the second period being longer than the first period.
  3. The camera of claim 1,
    further comprising a face recognition unit that recognizes a person's face in the captured video, wherein
    the control unit,
    when a mosaic mode is selected and the face of a specific person is selected in the captured video, registers the specific person as a first person, applies mosaic processing to the faces of persons other than the first person in the captured video, and then streams the data to the server through the transceiver unit.
  4. The camera of claim 1,
    further comprising a motion recognition unit that recognizes a user's motion, wherein
    the camera switches to a photo shooting mode when a first motion of the user is recognized, switches to a timer shooting mode when a second motion is recognized in the photo shooting mode, and, when a third motion is recognized in the timer shooting mode, sets a timer according to the type of the third motion and takes a photograph after the timer expires.
  5. The camera of claim 4, wherein
    the control unit,
    when a fourth motion of the user is recognized, uploads the photograph taken using the third motion to the user's SNS account set in the control unit.
  6. The camera of claim 1, wherein
    the control unit,
    when a first shake correction mode is selected, recognizes the user's face in the currently captured video, extracts the user's eyes, nose, and lips, and sets a first reference line running from between the eyebrows to the nose and the middle of the lips, and
    corrects the angle of the video captured through the photographing unit so that the angle of the first reference line is maintained.
  7. The camera of claim 6, wherein
    the control unit,
    when a second shake correction mode is selected, sets a second reference line connecting both of the user's eyes in the currently captured video, and
    corrects the angle of the video captured through the photographing unit so that the angles of the first reference line and the second reference line are maintained.
  8. The camera of claim 1, wherein
    the camera,
    when a broadcast mode is input from a user, causes the server to transmit the video received from the camera to a broadcast platform with which the server is linked, so that the video is broadcast.
  9. The camera of claim 1, wherein
    the server
    requests the broadcast platform to transmit an advertisement when the communication state between the camera and the server, or between the server and the broadcast platform, is poor.
  10. The camera of claim 8 or claim 9,
    further comprising a motion recognition unit that recognizes a user's motion, wherein
    when a fifth motion of the user is recognized, the camera requests the server to store, as a highlight video, the video captured until a sixth motion of the user is recognized, and
    when the communication state between the camera and the server, or between the server and the broadcast platform, is poor, the highlight video is transmitted to the broadcast platform and is broadcast until the communication state returns to normal.
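For instance, the streaming-period adjustment recited in claim 2 can be sketched as follows; the two period lengths and the three thresholds are assumed values chosen only to make the example concrete, since the claim merely requires that the second period be longer than the first.

```python
FIRST_PERIOD_S = 1.0    # assumed normal streaming interval
SECOND_PERIOD_S = 5.0   # assumed degraded interval (must be longer than the first)

BATTERY_THRESHOLD_PCT = 20.0
TEMPERATURE_THRESHOLD_C = 60.0
SPEED_THRESHOLD_MBPS = 2.0

def streaming_period(battery_pct: float, temp_c: float, speed_mbps: float) -> float:
    """Fall back to the longer period when battery, temperature, or link quality
    crosses its threshold, as claim 2 describes."""
    degraded = (battery_pct <= BATTERY_THRESHOLD_PCT
                or temp_c >= TEMPERATURE_THRESHOLD_C
                or speed_mbps <= SPEED_THRESHOLD_MBPS)
    return SECOND_PERIOD_S if degraded else FIRST_PERIOD_S
```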
PCT/KR2018/010733 2017-12-22 2018-09-13 Camera for internet broadcasting and uploading WO2019124681A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020170178426A KR102000058B1 (en) 2017-12-22 2017-12-22 Camera for internet broadcasting and uploading
KR10-2017-0178426 2017-12-22

Publications (1)

Publication Number Publication Date
WO2019124681A1 true WO2019124681A1 (en) 2019-06-27

Family

ID=66994889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/010733 WO2019124681A1 (en) 2017-12-22 2018-09-13 Camera for internet broadcasting and uploading

Country Status (2)

Country Link
KR (1) KR102000058B1 (en)
WO (1) WO2019124681A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20000009799U (en) * 1998-11-12 2000-06-05 정은철 Ball CCT Camera
JP2009188836A (en) * 2008-02-07 2009-08-20 Sony Ericsson Mobilecommunications Japan Inc Portable communication terminal and controlling method of the same, and control program
KR100968816B1 (en) 2009-10-23 2010-07-08 (주)에스엠에이시스템 The monitor and broadcast device choosing the least noises

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004062560A (en) * 2002-07-30 2004-02-26 Omron Corp Face collating device and face collating method
KR101434533B1 (en) * 2013-06-25 2014-08-27 엔그램테크놀로지(주) System for filming camera using appreciate gesture of finger and method therefor
KR20160057751A (en) * 2014-11-14 2016-05-24 주식회사 라이브존 Supervisory system by using near field communication ip camera module
US20170076156A1 (en) * 2015-09-14 2017-03-16 Logitech Europe S.A. Automatically determining camera location and determining type of scene

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CD-MAN: "Winter Travel Supplies, Sony Action Cam FDR-X3000 and Accessories (feat.Action Cam Boss)", CDMANII.COM, 20 December 2017 (2017-12-20), XP055620967, Retrieved from the Internet <URL:https://cdmanii.com/6300> *

Also Published As

Publication number Publication date
KR102000058B1 (en) 2019-07-15
KR20190076547A (en) 2019-07-02

Similar Documents

Publication Publication Date Title
WO2016068588A1 (en) Method for scanning neighboring devices and electronic device thereof
CN109891874B (en) Panoramic shooting method and device
WO2018043884A1 (en) Method for controlling camera and electronic device therefor
WO2016208802A1 (en) Watch type mobile terminal and operation method thereof
CN111061445A (en) Screen projection method and computing equipment
WO2014021692A1 (en) Image processing method and apparatus
JP6538079B2 (en) Imaging parameter setting method, apparatus, program, and recording medium
WO2019128592A1 (en) Method and apparatus for live broadcasting
WO2018004238A1 (en) Apparatus and method for processing image
WO2015005722A1 (en) Mobile device, display apparatus and method for sharing contents thereof
CN111132137A (en) Wi-Fi connection method and device
WO2014073847A1 (en) User terminal, external apparatus, data transceiving system, and data transceiving method
WO2022116930A1 (en) Content sharing method, electronic device, and storage medium
WO2019156480A1 (en) Method of detecting region of interest on basis of gaze direction and electronic device therefor
WO2015167236A1 (en) Electronic device and method for providing emergency video call service
WO2017026644A1 (en) Method and device for generating video content
WO2018048130A1 (en) Content playback method and electronic device supporting same
CN114489533A (en) Screen projection method and device, electronic equipment and computer readable storage medium
CN110012130A (en) A kind of control method and electronic equipment of the electronic equipment with Folding screen
WO2016208992A1 (en) Electronic device and method for controlling display of panorama image
WO2017209409A1 (en) Spherical content editing method and electronic device supporting same
WO2021157767A1 (en) Mobile terminal and control method for same
WO2017018591A1 (en) Mobile terminal having camera and control method therefor
WO2019039861A1 (en) Electronic device and method for providing content associated with camera function from electronic device
WO2018124774A1 (en) Electronic device and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18892123

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18892123

Country of ref document: EP

Kind code of ref document: A1