WO2017176021A1 - Observation scope for remote medical examination, and image processing device and system comprising same - Google Patents

Observation scope for remote medical examination, and image processing device and system comprising same Download PDF

Info

Publication number
WO2017176021A1
WO2017176021A1 (PCT/KR2017/003648)
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
observation mirror
image data
terminal
Prior art date
Application number
PCT/KR2017/003648
Other languages
French (fr)
Korean (ko)
Inventor
고철웅
장인훈
Original Assignee
한국생산기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국생산기술연구원
Publication of WO2017176021A1 publication Critical patent/WO2017176021A1/en

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163 Optical arrangements
    • A61B1/00188 Optical arrangements with focusing or zooming features
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/042 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by a proximal camera, e.g. a CCD camera
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0004 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
    • A61B5/0013 Medical image data

Definitions

  • The present invention relates to an observation mirror for telemedicine and to an image processing apparatus and system including the same, and more particularly, to an observation mirror having a detachable head portion whose focal length is adjusted by the structure of the head portion, and to an image data processing system, including the same, that enables remote medical examination.
  • The observation mirror is a device that allows a user to observe the inside of an object by photographing the inside of the object through a tube of the observation mirror inserted into the object.
  • Observation mirrors may be used by inserting a rigid or flexible tube into the inner wall of a building or the inside of a machine, or by inserting a needle-type, rubber-tube-type or plastic-tip-type tube into a human or animal body.
  • In particular, when the observation mirror is used on the human body, the inside of the body (bronchi, esophagus, chest cavity, heart, stomach, intestine, abdominal cavity, bladder, anus, nasal cavity, eardrum, etc.) can be examined without open or incisional surgery, so it can be useful for the diagnosis of various diseases.
  • Because the tube of an observation mirror is inserted into the inside of the object, contamination of the tube and its vicinity is a problem.
  • In particular, when a tube is inserted into the human body in an operating room or the like, a high level of cleanliness is required.
  • However, conventional observation mirrors are very difficult to sterilize or disinfect because of their complicated structure and design.
  • Therefore, the observation mirror needs to have a structure that is easy to clean, sterilize or disinfect.
  • In this regard, Korean Patent No. 0449349 (title of the invention: Endoscope Having an Improved Flexible Insertion Tube, hereinafter referred to as Prior Art 1) discloses a flexible endoscope having a flexible insertion tube formed of a tubular, biocompatible elastic sheath surrounding an interior space, wherein the insertion tube further comprises a vapor barrier between the sheath and the interior space, so that the vapor barrier prevents vapor from passing from the ambient atmosphere through the sheath into the interior space and also prevents vapor from reacting with material in the interior space to produce substances harmful to the elastic sheath.
  • In relation to telemedicine, Korean Patent No. 1512068 (title of the invention: Telemedicine Service System and Method, hereinafter referred to as Prior Art 2) discloses a telemedicine service method comprising: acquiring an image of a telemedicine service target through the camera of a portable terminal; collecting a patient state including a plurality of pieces of questionnaire information having a hierarchical relationship to the acquired image; recognizing a telemedicine service type from the acquired image and performing an analysis corresponding to the recognized service type to generate analysis result data; matching the collected patient state to the generated analysis result data and broadcasting it through at least one service server linked through a network; and retrieving and collecting results for the data broadcast through the linked service servers and displaying them on the portable terminal.
  • The questionnaire information includes answers entered in response to a plurality of preset questions for each medical service type selected by the user operating the portable terminal or for each recognized telemedicine service type. When any one item of data is selected from the result data retrieved through the service servers, a telemedicine service server, which communicates with the portable terminal and interworks with a plurality of service servers specialized for a plurality of medical service types, simultaneously displays on the portable terminal an evaluation history of the corresponding data collected through each of the linked service servers.
  • The evaluation history is a record of data quality in which the telemedicine service server collects, from the user after completion of the telemedicine service or during the service, the satisfaction with the data provided by each service server, and maps, stores and manages it in correspondence with that service server. The data collected for each service server is medical service data specialized according to the symptoms corresponding to the recognized service type.
  • The specialized medical service data includes medical service data recommending a pouch product for treating a stoma, nursing techniques for wound care, and data related to nutrition, exercise and lifestyle recommendations and medical counseling related to cancer.
  • The technical problems to be solved by the present invention are: a first problem, that Prior Art 1 is limited in the degree of sterilization and disinfection that can be achieved; a second problem, that Prior Art 1 does not allow the user to adjust the focal length; a third problem, that Prior Art 1 uses only a flexible tube and is therefore difficult to apply to various fields;
  • a fourth problem, that although Prior Art 1 is an invention for improving sterilization and disinfection, contamination of the endoscope remains an issue when the image obtained by the endoscope is checked in a place requiring cleanliness, such as an operating room;
  • a fifth problem, that the image obtained by the endoscope of Prior Art 1 cannot be shared by several people;
  • a sixth problem, that the user cannot receive any feedback through Prior Art 1;
  • and a seventh problem, that Prior Art 2, unlike Prior Art 1, has the advantage of allowing the user to receive telemedicine, but is limited to remote examination of wounds and the like using the camera of a portable terminal.
  • To solve the above problems, the present invention provides an observation mirror for photographing the inside of an object, the observation mirror comprising: a head portion including a light acquisition unit that acquires light reflected from the inside of the object and generates an optical signal; and a main body which is coupled to the head portion and has a function of receiving the optical signal and generating image data, wherein the head portion can be separated from the main body.
  • the head portion may be characterized by having a function of adjusting the focal length.
  • the head portion may have a structure for converting a rotational motion into a linear motion, and the focal length may be adjusted by the structure.
  • the main body may include an image sensor for receiving the optical signal to generate an image signal.
  • the main body may further include an image processor for processing the image signal to generate the image data.
  • In addition, the head portion may further comprise a tube which comprises a lens and a plurality of optical fibers and which is inserted into the inside of the object to transfer the reflected light to the light acquisition unit.
  • In addition, the lens may be a fisheye lens or a wide-angle lens.
  • In addition, the image processor may comprise an image correction unit that receives the image signal and corrects the distorted image.
  • the main body may include a display unit for displaying the image data.
  • the display unit may be a fixed or removable display unit.
  • the display unit may be a foldable display unit.
  • the main body may include a communication unit for transmitting the image data to an external device.
  • the main body may further include a memory unit in which the image data is stored as a recording file according to a recording request of a user.
  • the user may download the recorded file to a USB flash drive or a memory card connected to the communication unit.
  • In addition, the main body may comprise a light source unit for generating the light to be irradiated into the inside of the object.
  • The present invention also provides an observation mirror system comprising the observation mirror and a terminal which receives the image data from the communication unit through a wired or wireless communication means.
  • The wireless communication means may be one or more of infrared (IR) communication, radio frequency (RF) communication, wireless LAN, Wi-Fi, WiBro, ultra-wideband (UWB) communication, Bluetooth, direct wireless communication and Near Field Communication (NFC).
  • The observation mirror and the terminal may share the image data through mirroring over the wired or wireless communication means.
  • the main body may further include a mounting portion on which the terminal is mounted.
  • The present invention also provides an observation mirror system comprising the observation mirror, a first terminal which shares the image data with the observation mirror through mirroring over a first communication means, and a second terminal which shares the image data with the first terminal through mirroring over a second communication means.
  • The present invention also provides an observation mirror system comprising the observation mirror and a server which receives and stores the image data from the communication unit.
  • it may be characterized in that it further comprises a terminal for receiving the image data from the server through a wired or wireless communication means.
  • the remote doctor may be characterized by analyzing the image data transmitted to the server or the terminal to provide an analysis result to the user of the observation mirror.
  • the virtual doctor may analyze the image data transmitted to the server or the terminal and provide the analysis result to the user of the observation mirror.
  • The server or the terminal may authenticate the ID of the user.
  • The server or the terminal may authenticate the ID of the observation mirror.
  • The present invention also provides a method for observing the inside of an object using the observation mirror, comprising: generating light to be irradiated into the inside of the object; irradiating the generated light into the inside of the object; acquiring, by the light acquisition unit, the light reflected from the inside of the object to generate an optical signal; receiving, by the image sensor, the generated optical signal to generate an image signal; processing, by the image processor, the generated image signal to generate image data; and displaying, by the display unit, the generated image data.
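  • As a minimal illustration only: the observation steps listed above can be pictured as a software pipeline, as in the following Python sketch. The class and method names (ObservationPipeline, generate, irradiate, acquire, capture, process, show) are hypothetical placeholders and are not part of the disclosed apparatus, which describes hardware units; the sketch merely mirrors the order of the steps.

```python
# Hypothetical, schematic pipeline mirroring the observation steps described above.
# None of these classes are defined by the patent; they only illustrate the data flow.

class ObservationPipeline:
    def __init__(self, light_source, light_acquisition, image_sensor,
                 image_processor, display):
        self.light_source = light_source            # generates light (light source unit 260)
        self.light_acquisition = light_acquisition  # acquires reflected light (unit 110)
        self.image_sensor = image_sensor            # optical signal -> image signal (210)
        self.image_processor = image_processor      # image signal -> image data (220)
        self.display = display                      # shows image data (display unit 240)

    def observe_once(self):
        light = self.light_source.generate()                      # generate light
        self.light_source.irradiate(light)                        # irradiate the object
        optical_signal = self.light_acquisition.acquire()         # reflected light -> optical signal
        image_signal = self.image_sensor.capture(optical_signal)  # optical -> image signal
        image_data = self.image_processor.process(image_signal)   # image signal -> image data
        self.display.show(image_data)                             # display the image data
        return image_data
```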
  • The generating of the image data by the image processor processing the generated image signal may comprise: receiving, by the image correction unit, the generated image signal; correcting, by the image correction unit, the distorted image to generate a corrected image; and generating, by the image data generation unit, the image data for the corrected image.
  • The correcting of the distorted image by the image correction unit to generate the corrected image may comprise performing geometric correction of the distorted image using a distortion correction algorithm.
  • The distortion correction algorithm may include one or more of a focal length, a principal point, a radial distortion coefficient and a tangential distortion coefficient.
  • After the geometric correction of the distorted image is performed using the distortion correction algorithm, the correcting of the distorted image by the image correction unit to generate the corrected image may further comprise: extracting, by the image correction unit, vanishing point coordinates from the distorted image; calculating correction coefficients constituting a projection algorithm from the vanishing point coordinates; and generating, by the image correction unit, the corrected image using the projection algorithm.
  • the projection algorithm may be represented by Equation 1.
  • the correction coefficient may be represented by Equation 2.
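  • Equations 1 and 2 themselves are not reproduced in this excerpt. A plausible reconstruction, based only on the derivation given later (the forward mapping X = x/(ax + by + 1), Y = y/(ax + by + 1) and the substitution of 1/X_P1 and 1/Y_P2 for the correction coefficients a and b), is the following; it is an assumption, not the literal claim language.

```latex
% Assumed reconstruction (not quoted from the patent text).
% Equation 1: projection algorithm as an inverse mapping from corrected (X, Y)
%             coordinates back to distorted (x, y) coordinates.
% Equation 2: correction coefficients obtained from the vanishing points.
\[
x = \frac{X}{1 - aX - bY}, \qquad y = \frac{Y}{1 - aX - bY}
\]
\[
a = \frac{1}{X_{P1}}, \qquad b = \frac{1}{Y_{P2}}
\]
```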
  • The present invention has a first effect that the detachable head portion makes it easy to clean, sterilize or disinfect the head portion or the main body; a second effect that the user can adjust the focal length to obtain the image quality desired by the user; a third effect that the observation mirror can be applied to various fields by using tubes of various forms; a fourth effect that, because the image can be checked on a terminal connected to the observation mirror through a wireless communication means, the problem of contamination of the display part is solved and convenience is improved;
  • a seventh effect that telemedicine is possible based on images of the inside of the patient's body; an eighth effect that the image obtained by the observation mirror can be checked remotely; a ninth effect that the image obtained by the observation mirror is stored on the server, making access to the acquired image easy; and a tenth effect that, as images obtained by the observation mirror accumulate, the accumulated image data can be used, for example, for big data analysis or treatment prognosis comparison.
  • the effects of the present invention are not limited to the above-described effects, but should be understood to include all the effects deduced from the configuration of the invention described in the detailed description or claims of the present invention.
  • FIG. 1 is a block diagram showing an embodiment of the observation mirror of the present invention.
  • FIG. 2 is a schematic diagram showing a distorted image before the projection algorithm is applied.
  • FIG. 3 is a structural diagram showing an embodiment of the observation mirror system of the present invention.
  • FIG. 4 is a structural diagram showing an embodiment of the observation mirror system of the present invention.
  • FIG. 5 is a structural diagram showing an embodiment of the observation mirror system of the present invention.
  • FIG. 6 is a perspective view showing an embodiment of the observation mirror of the present invention.
  • FIG. 7 is a perspective view showing an embodiment of the observation mirror of the present invention.
  • FIG. 8 is a perspective cross-sectional view showing an embodiment of the observation mirror of the present invention.
  • FIG. 9 is a perspective view showing an example of the head portion of the observation mirror according to the present invention.
  • FIG. 10 is a perspective view showing an example of the locker of the observation mirror according to the present invention.
  • FIG. 11 is a perspective view showing an example of the head portion of the observation mirror according to the present invention.
  • FIG. 12 is a perspective view showing an example of the head portion of the observation mirror according to the present invention.
  • In the best mode for carrying out the invention, the observation mirror for photographing the inside of an object comprises a head portion including a light acquisition unit that acquires light reflected from the inside of the object and generates an optical signal, and a main body which is coupled to the head portion and has a function of receiving the optical signal and generating image data, wherein the head portion can be separated from the main body.
  • the observation mirror 1 of the present invention is an apparatus for photographing the inside of an object, and irradiates light into the inside of the object, and then acquires light reflected from the inside of the object to generate image data.
  • the interior of the subject may be a body part of a patient, a body part of an animal, an inner wall of a building, an inside of a pipe, or an inside of a vehicle, but the present invention is not limited thereto.
  • The observation mirror 1 comprises the head portion 10 and the main body 20; the head portion 10 includes the light acquisition unit 110 and may further include the light emitting unit 120 or the tube 180.
  • The main body 20 coupled to the head portion 10 has a function of receiving the optical signal transmitted from the head portion 10 and generating image data, and may include one or more of the image sensor 210, the image processor 220, the memory unit 230, the display unit 240, the communication unit 250 and the light source unit 260.
  • FIG. 1 shows an embodiment of the observation mirror 1 in a block diagram, so that not only the configuration of the observation mirror 1 but also a method of observing the inside of the object using the observation mirror 1 can be grasped at a glance.
  • each component of the observation mirror 1 will be described in detail, and through this, a method of observing the inside of the object using the observation mirror 1 will also be understood.
  • The tube 180 is inserted into the inside of the object; when observing the inside of the object with the observation mirror 1, the user inserts the tube 180 into the object and observes an image of its interior.
  • The tube 180 may be of a needle type that can be inserted into the skin of the target, a rigid type (for example, a straight metal tube or the insertion portion of an otoscope), a flexible type (for example, made of rubber and bending freely) or an intermediate type between the rigid type and the flexible type, but is not limited thereto.
  • The tube 180 includes a lens and a plurality of optical fibers. When the tube 180 is inserted into the inside of the object, light generated by the light source unit 260, described later, passes through some of the optical fibers of the tube 180 (hereinafter referred to as the light emitting line) and the lens and reaches the inside of the object, and light reflected from the inside of the object passes back through the lens and other optical fibers of the tube 180 (hereinafter referred to as the photographing line) and is processed in a predetermined manner inside the observation mirror 1.
  • the lens may be one or more of a standard lens, a telephoto lens, a micro lens, a wide angle lens, and a fisheye lens, but is not limited thereto.
  • the light source unit 260 generates light to be irradiated inside the object.
  • The light source unit 260 may include a lamp using a light emitting diode (LED) or a laser diode (LD) as a light source, but the light source is not limited thereto.
  • When the lamp emits laser light in the UV-blue band, the light source unit 260 may further include a color conversion plate made of a glass or ceramic material to convert the UV-blue band laser light into white light, but the configuration of the light source unit 260 is not limited thereto.
  • the light emitting unit 120 is coupled to the light source unit 260 to irradiate the light generated by the light source unit 260 to the inside of the object.
  • The light generated by the light source unit 260 is irradiated into the inside of the object through the light emitting line connected to the light emitting unit 120. That is, although FIG. 1 simply shows an arrow indicating that the light generated by the light source unit 260 is irradiated into the inside of the object, specifically the light generated by the light source unit 260 may be delivered sequentially to the light emitting unit 120 coupled to the light source unit 260, the light emitting line connected to the light emitting unit 120, the lens, and then the inside of the object.
  • the light acquisition unit 110 acquires the light reflected from the inside of the object.
  • FIG. 1 shows an arrow indicating that the light reflected from the inside of the object reaches the head portion 10.
  • Specifically, the light reflected from the inside of the object may be delivered sequentially through the lens and the photographing line to the light acquisition unit 110 connected to the photographing line.
  • the light acquisition unit 110 that acquires light reflected from the inside of the object generates an optical signal and transmits the light signal to the image sensor 210 which will be described later.
  • So far, the tube 180, the light emitting unit 120 and the light acquisition unit 110, which are components of the head portion 10, have been described, and of the components of the main body 20 only the light source unit 260 has been described. Before describing the image sensor 210 of the main body 20, the detachment of the head portion 10 will be described.
  • When observing the inside of the object using the observation mirror 1, the user uses the observation mirror 1 with the head portion 10 and the main body 20 coupled, but the head portion 10 can be separated from the main body 20.
  • Prior arts typically did not facilitate separating the head portion 10 from the body portion 20.
  • If the optical fiber (photographing line) of the tube 180 extended all the way to the image processor 220 of the main body 20, the head portion 10 could not easily be separated.
  • In the present invention, however, the light emitting unit 120 connected to the light emitting line may be coupled to or separated from the light source unit 260, and the light acquisition unit 110 connected to the photographing line is spaced apart from the image sensor 210 and the image processor 220.
  • Therefore, the head portion 10, which includes the light emitting line, the light emitting unit 120, the photographing line and the light acquisition unit 110, can be separated from the main body 20, which includes the light source unit 260.
  • In addition to the head portion 10 and the main body 20, the observation mirror 1 includes a connection part 30 for connecting the head portion 10 and the main body 20, and the structure of the connection part 30 enables the detachment and coupling of the head portion 10.
  • The connection part 30 may have a simple structure in which the head portion 10 is separated by pulling it away from the main body 20 and coupled by pushing it toward the main body 20.
  • However, with such a structure the head portion 10 could be separated unintentionally while the user is using the observation mirror 1, so the connection part 30 preferably has a structure in which the head portion 10 is fixed once it is coupled to the main body 20.
  • Such a structure may take various forms, one example of which will be described later in the embodiments.
  • The head portion 10 has a function of adjusting the focal length. Here, the focal length means the distance from the lens of the tube 180 to the image sensor 210 of the main body 20.
  • The user may couple various types of head portions 10 to the main body 20, couple various types of tubes 180 to the other components of the head portion 10, and use various types of lenses as components of the tube 180. Since the focal length may therefore vary depending on one or more of the shape of the head portion 10, the shape of the tube 180 and the type of lens, the observation mirror 1 needs to have a function of adjusting the focal length.
  • For example, when the head portion 10 is pulled forward away from the main body 20 the focal length becomes longer, and when the head portion 10 is pushed back the focal length becomes shorter.
  • This movement may be provided by a structure that converts rotational motion into linear motion, such as a rack and pinion gear, but is not limited thereto.
  • One example of a structure that converts rotational motion into linear motion is described in the embodiments below.
  • the image sensor 210 receives an optical signal from the light acquisition unit 110 and generates an image signal which is an electrical signal.
  • the image sensor 210 may be a complementary metal oxide semiconductor (CMOS) or a charge coupled device (CCD), but is not limited thereto.
  • the image sensor 210 may have pixels corresponding to VGA, SVGA, or SXGA as necessary, but is not limited thereto.
  • the image processor 220 generates image data by processing the image signal generated by the image sensor 210 as shown in FIG. 1.
  • the image processor 220 may include at least one of a signal converter 221, an image compensator 222, and an image data generator 223.
  • The signal converter 221 converts an analog image signal into a digital image signal when the image sensor 210 generates an analog image signal, and may also convert between color coordinate systems such as YUV and RGB.
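  • As a concrete illustration of such a color-coordinate conversion, the sketch below converts an RGB frame to YUV and back using the BT.601 coefficients. The choice of BT.601 and of NumPy is an assumption made for the example; the patent does not specify which YUV variant or implementation the signal converter 221 would use.

```python
import numpy as np

# BT.601 RGB -> YUV matrix (an assumed choice; the patent does not name a standard).
_RGB_TO_YUV = np.array([
    [ 0.299,     0.587,     0.114   ],   # Y
    [-0.14713,  -0.28886,   0.436   ],   # U
    [ 0.615,    -0.51499,  -0.10001 ],   # V
])

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an (H, W, 3) RGB image with values in [0, 1] to YUV."""
    return rgb @ _RGB_TO_YUV.T

def yuv_to_rgb(yuv: np.ndarray) -> np.ndarray:
    """Inverse conversion, obtained by inverting the same matrix."""
    return yuv @ np.linalg.inv(_RGB_TO_YUV).T
```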
  • The image correction unit 222 receives the image signal from the image sensor 210 or the signal converter 221 and then corrects the distorted image to generate a corrected image.
  • The image correction unit 222 may not only perform ordinary color correction or gamma correction, but may also perform geometric correction of the distorted image using a distortion correction algorithm and remove the perspective effect of the distorted image using a projection algorithm.
  • As for the order of correction, it is preferable to remove the perspective effect after performing the geometric correction.
  • Geometric correction and perspective removal are particularly effective correction methods when using fisheye or wide-angle lenses.
  • Fish-eye and wide-angle lenses have shorter focal lengths than standard lenses.
  • A fisheye lens has an angle of view of at least 120 degrees, enabling wide-area images.
  • A wide-angle lens has an angle of view of 60 to 120 degrees; although narrower than that of a fisheye lens, it also enables acquisition of relatively wide images.
  • However, because fisheye lenses and wide-angle lenses have wide angles of view, they distort objects and exaggerate perspective. Therefore, when observing the inside of the object using a fisheye lens or a wide-angle lens, the inside of the object appears distorted, perspective arises around the region of interest, and it is difficult to determine the exact state of the inside of the object.
  • Radial distortion is distortion caused by the shape of the lens and generally refers to a phenomenon in which pixel positions are convexly distorted near the edges of the image sensor 210. This convex phenomenon is also called barrel distortion and can produce a fisheye effect.
  • Tangential distortion is distortion that occurs because the image sensor 210 and the lens are not parallel to each other, which arises in the manufacturing process of the observation mirror 1.
  • The distortion correction algorithm can be expressed as a function including one or more of the focal length, the principal point, the radial distortion coefficient and the tangential distortion coefficient, and the image correction unit 222 may correct geometric distortions such as radial distortion or tangential distortion using the distortion correction algorithm.
  • Geometric correction is a well known theory and will be understood by those skilled in the art of image processing without further detailed description.
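  • For reference, the sketch below shows one common way such geometric correction is carried out in practice, using OpenCV's standard camera model: the focal lengths and principal point go into the camera matrix, and radial coefficients k1, k2 and tangential coefficients p1, p2 go into the distortion vector. The patent does not name OpenCV or this exact parameterization, so the API and parameters here are illustrative assumptions; in practice the values would come from a prior calibration of the lens and image sensor 210.

```python
import numpy as np
import cv2

def undistort_frame(frame: np.ndarray,
                    fx: float, fy: float, cx: float, cy: float,
                    k1: float, k2: float, p1: float, p2: float) -> np.ndarray:
    """Remove radial (k1, k2) and tangential (p1, p2) distortion from a frame."""
    camera_matrix = np.array([[fx, 0.0, cx],
                              [0.0, fy, cy],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.array([k1, k2, p1, p2])
    # cv2.undistort resamples the frame so that straight lines in the scene
    # map to straight lines in the output.
    return cv2.undistort(frame, camera_matrix, dist_coeffs)
```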
  • FIG. 2 shows an example of a distorted image for which the geometric correction has been completed but whose sides appear curved due to the perspective effect. When a rectangle is photographed with a camera, the part of the rectangle near the camera appears large and the part farther from the camera appears small, creating perspective, so an image of a rectangle with curved sides is produced.
  • To remove this perspective effect, the image correction unit 222 extracts vanishing point coordinates from the distorted image.
  • In FIG. 2, a first vanishing point P1(X_P1, Y_P1), at which the extension of straight line AB meets the extension of the opposite side, and a second vanishing point P2(X_P2, Y_P2), at which the extension of straight line AD meets the extension of straight line BC, are extracted.
  • The image correction unit 222 then calculates the correction coefficients constituting the projection algorithm from the vanishing point coordinates.
  • First, a projection matrix [P] is generated as in Equation 3.
  • The projection matrix [P] of Equation 3 is generated by multiplying the original matrix [A] by the matrix [B], and converts a spatial image in three dimensions into a planar image in two dimensions.
  • The correction coefficients a and b are the parameters required for extracting the distorted image coordinates (x, y) from the corrected image coordinates (X, Y).
  • If an arbitrary point in the three-dimensional image is converted through the projection matrix [P], it can be expressed as in Equation 4.
  • Equation 4 may be expressed as Equation 5 by dividing it by (ax + by + 1).
  • Here, x / (ax + by + 1) and y / (ax + by + 1) may be replaced by X and Y, respectively. That is, [X, Y, 0, 1] is the two-dimensional point into which an arbitrary point [x, y, z, 1] is transformed through the projection matrix [P], and the fourth column of each matrix is the basis vector used for calculating the correction coefficients a and b.
  • To express columns 3 and 4 of Equation 6 in basis-vector form, row 1 of Equation 6 is divided by a and row 2 is divided by b, which may be expressed as in Equation 7.
  • Here, X_P1 is the x coordinate of the first vanishing point when the first vanishing point is positioned on the x axis, and Y_P2 is the y coordinate of the second vanishing point when the second vanishing point is positioned on the y axis.
  • Accordingly, Equation 5 may be summarized as shown in Equation 8.
  • Finally, the projection algorithm expressed by Equation 1 can be used by substituting 1/X_P1 and 1/Y_P2, calculated by Equation 2, for the correction coefficients a and b.
  • By Equation 1, the distorted image coordinates (x, y) can be extracted from the corrected image coordinates (X, Y), and the pixels of the distorted image can be placed at the corresponding pixels of the corrected image.
  • This method assumes a corrected image in advance by an inverse mapping method and finds which pixel of the distorted image matches the corrected image.
  • the image correction unit 222 generates a correction image by an inverse mapping method using Equation 1. If the perspective effect is not removed as expected in the generated correction image, the user may adjust the correction coefficients a and b to cause the image correction unit 222 to generate a new correction image.
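  • The inverse-mapping step described above can be sketched as follows: the corrected image is assumed first, and for every corrected pixel (X, Y) the matching distorted pixel (x, y) is looked up with x = X/(1 - aX - bY) and y = Y/(1 - aX - bY), where a = 1/X_P1 and b = 1/Y_P2 come from the vanishing points. The use of NumPy, pixel-unit coordinates and nearest-neighbour sampling are implementation assumptions for this example only.

```python
import numpy as np

def remove_perspective(distorted: np.ndarray, x_p1: float, y_p2: float) -> np.ndarray:
    """Generate a corrected image by inverse mapping using the two vanishing points.

    distorted : (H, W) or (H, W, C) image whose perspective effect is to be removed.
    x_p1      : x coordinate of the first vanishing point (assumed on the x axis).
    y_p2      : y coordinate of the second vanishing point (assumed on the y axis).
    """
    a, b = 1.0 / x_p1, 1.0 / y_p2            # correction coefficients (cf. Equation 2)
    h, w = distorted.shape[:2]
    corrected = np.zeros_like(distorted)

    ys, xs = np.mgrid[0:h, 0:w]
    X, Y = xs.astype(np.float64), ys.astype(np.float64)
    denom = 1.0 - a * X - b * Y
    denom[np.abs(denom) < 1e-9] = np.nan     # avoid division by zero near the horizon

    # Inverse mapping (cf. Equation 1): corrected (X, Y) -> distorted (x, y)
    x_src = X / denom
    y_src = Y / denom

    valid = (~np.isnan(x_src)) & (x_src >= 0) & (x_src < w) & (y_src >= 0) & (y_src < h)
    corrected[ys[valid], xs[valid]] = distorted[y_src[valid].astype(int),
                                                x_src[valid].astype(int)]
    return corrected
```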
  • the image data generator 223 generates image data with respect to the corrected image generated by the image compensator 222.
  • The image data generator 223 may generate the image data by scaling the corrected image according to the size of the display unit 240, which will be described later, and, as necessary, may also generate compressed image data by encoding the scaled corrected image with an encoder.
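  • As an illustration of the scaling and compression step, the sketch below resizes a corrected image to a target display size and JPEG-encodes it with OpenCV. The display resolution, the choice of JPEG and the quality setting are assumptions for the example only; the patent does not prescribe a codec.

```python
import numpy as np
import cv2

def make_image_data(corrected: np.ndarray,
                    display_size: tuple = (640, 480),
                    jpeg_quality: int = 90) -> bytes:
    """Scale the corrected image to the display size and compress it.

    Returns encoded bytes that could be stored in the memory unit or handed
    to the communication unit; the concrete codec is an assumption.
    """
    scaled = cv2.resize(corrected, display_size, interpolation=cv2.INTER_AREA)
    ok, encoded = cv2.imencode(".jpg", scaled,
                               [int(cv2.IMWRITE_JPEG_QUALITY), jpeg_quality])
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    return encoded.tobytes()
```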
  • the memory unit 230 is an area in which image data is stored by the image processor 220 as shown in FIG. 1.
  • The memory unit 230 may include a flash memory type, hard disk type, multimedia card micro type or card type memory (for example, SD or XD memory), but is not limited thereto.
  • the image data may be stored in the memory unit 230 as a recording file according to the user's recording request.
  • The above-described distortion correction algorithm or projection algorithm may be stored in the memory unit 230 so that the image correction unit 222 can refer to the stored algorithm when correcting the distorted image.
  • In addition, the memory unit 230 may store the projection algorithm into which the correction coefficients, calculated when removing the perspective effect of the distorted image, have been substituted.
  • the display unit 240 displays image data generated by the image processor 220 as shown in FIG. 1.
  • The display unit 240 may include one display device selected from a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an electroluminescent display (ELD), a plasma display panel (PDP), a field emission display (FED) and an electrophoretic display (EPD), but is not limited thereto.
  • the display unit 240 may be a fixed display unit 240 in which the display device is fixed to the main body unit 20, or may be a removable display unit 240 in which the display device can be detached. In the case of the removable display unit 240, a mounting unit (not shown) on which the display device is mounted may be provided on the main body unit 20.
  • The display unit 240 may also be a foldable display unit 240 having a cover that hides the screen of the display device when the cover is folded and reveals the screen when the cover is opened.
  • the communication unit 250 transfers the image data generated by the image processor 220 to an external device as shown in FIG. 1, and thus, remote medical treatment described below is possible.
  • the communication unit 250 may transfer image data to an external device by wire, and may include a USB port or a memory card slot for this purpose, but is not limited thereto.
  • The communication unit 250 may also transmit the image data to an external device wirelessly, in which case one or more wireless communication means among infrared (IR) communication, radio frequency (RF) communication, wireless LAN, Wi-Fi, WiBro, ultra-wideband (UWB) communication, Bluetooth, direct wireless communication and near field communication (NFC) may be used, but the wireless communication means is not limited thereto.
  • When the image data is stored as a recording file according to the user's recording request, the user may download the recorded file to a USB flash drive connected to the USB port of the communication unit 250 or to a memory card connected to the memory card slot.
  • The image processor 220 may include a direct memory access (DMA) controller (not shown) for fast storage and reading of the image data, and the image data may be transferred between the image processor 220 and the memory unit 230, the display unit 240 and the communication unit 250 in a direct memory access (DMA) manner.
  • The observation mirror system of the present invention comprises the observation mirror 1 and a terminal, and FIGS. 3 to 5 each illustrate an embodiment of the observation mirror system.
  • Hereinafter, each component constituting the observation mirror system will be described in detail with reference to FIGS. 3 to 5.
  • As described above, the observation mirror 1 is an apparatus that generates image data by irradiating light into the inside of the object and then acquiring the light reflected from the inside of the object, and it comprises the head portion 10 and the main body 20.
  • the terminal is a device connected to the observation mirror 1 through a wired or wireless communication means.
  • the wired communication means may be a predetermined cable
  • The wireless communication means may be one or more of infrared (IR) communication, radio frequency (RF) communication, wireless LAN, Wi-Fi, WiBro, ultra-wideband (UWB) communication, Bluetooth, direct wireless communication and near field communication (NFC), but other wired or wireless communication means are not excluded.
  • the terminal may be a wired terminal or a wireless terminal, and FIG. 3 shows a wired terminal and a wireless terminal connected through the observation mirror 1 and a wireless communication means.
  • The wired terminal may be a personal computer (PC) and/or a notebook computer, but is not limited thereto.
  • The wireless terminal may be one or more of a Personal Communication System (PCS) terminal, a Global System for Mobile communications (GSM) terminal, a Personal Digital Cellular (PDC) terminal, a Personal Handyphone System (PHS) terminal, a personal digital assistant (PDA), a smart phone, a telematics terminal, a wireless data communication terminal and a portable internet terminal. Since glasses worn by a doctor can also be equipped with a wireless communication module so that image data can be transmitted to them, the wireless terminal is not limited to any particular form. When the main body 20 of the observation mirror 1 includes the communication unit 250, the terminal receives the image data from the communication unit 250 through the wired or wireless communication means.
  • When the main body 20 of the observation mirror 1 includes a mounting portion (not shown) on which the terminal is mounted, the image data may be transferred from the observation mirror 1 to the terminal through the wired or wireless communication means while the terminal is mounted on the mounting portion.
  • the observation mirror 1 and the terminal may share a plurality of functions through mirroring by the wired or wireless communication means.
  • When the main body 20 of the observation mirror 1 includes the display unit 240, the screen displayed on the display unit 240 may be output to the terminal through streaming.
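  • A minimal sketch of how the displayed screen could be streamed to a terminal over a TCP socket is shown below. The patent does not prescribe a transport or framing format, so the length-prefixed frames, the port number and the socket usage here are assumptions made purely for illustration (for instance, the frames could be the JPEG bytes produced by a routine like make_image_data above).

```python
import socket
import struct

def stream_frames(frame_source, host: str = "0.0.0.0", port: int = 9000) -> None:
    """Send length-prefixed encoded frames (e.g. JPEG bytes) to one connected terminal.

    frame_source: any iterable yielding encoded frames as bytes.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((host, port))
        server.listen(1)
        conn, _addr = server.accept()
        with conn:
            for frame in frame_source:
                # 4-byte big-endian length header followed by the frame payload
                conn.sendall(struct.pack(">I", len(frame)) + frame)
```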
  • the observation mirror system may include the observation mirror 1, the first terminal, and the second terminal.
  • the observation mirror 1 and the first terminal can share a plurality of functions through mirroring by the first communication means, and the first terminal and the second terminal share the mirroring by the second communication means. Multiple functions can be shared through.
  • When the main body 20 of the observation mirror 1 includes the display unit 240, the screen displayed on the display unit 240 may be output to the first terminal through streaming, and the screen output on the first terminal may be identically output to the second terminal through streaming.
  • The first communication means and the second communication means may each be the wired or wireless communication means described above, and the first terminal and the second terminal may each be the wired terminal or the wireless terminal described above.
  • FIG. 4 shows a first wireless terminal connected to the observation mirror 1 through a wireless communication means, and a second wireless terminal connected to the first wireless terminal through a wireless communication means. Mirroring between the first terminal and the second terminal may be performed by the second terminal sending a request for function sharing to the first terminal and the first terminal responding to the received request, but is not limited thereto.
  • The first terminal and the second terminal may share more functions, such as chatting, than the observation mirror 1 and the first terminal share.
  • The terminals may also be connected via the Internet using a wireless communication means such as WCDMA, HSDPA, CDMA2000, WiBro, WiMax, LTE, LTE-Advanced or Wi-Fi, and other wireless communication means are of course not excluded.
  • FIG. 5 illustrates a state in which the observation mirror 1 and a terminal located at a distance from the observation mirror 1 are connected.
  • In this case, there may be one or more gateways (for example, a gateway configured in a wired/wireless router, a gateway of an Internet provider, etc.) and one or more access points (for example, a wireless repeater, a base station, etc.) between the observation mirror 1 and the terminal.
  • the observation mirror system of the present invention may comprise an observation mirror 1 and a server, and FIG. 5 shows one embodiment of such an observation mirror system.
  • the observation mirror 1 is connected to a server on a wired or wireless network, and image data transmitted from the communication unit 250 of the observation mirror 1 may be stored in the server.
  • the image data stored in the server can be used by a patient, doctor, or other person through a predetermined means.
  • In addition to the observation mirror 1 and the server, the observation mirror system may further comprise a terminal that receives the image data from the server through a wired or wireless communication means.
  • The terminal may be a wired terminal or a wireless terminal. As shown in FIG. 5, there may be one or more gateways (for example, a gateway set in a wired/wireless router, a gateway of an Internet provider, etc.) and one or more access points (for example, a wireless repeater, a base station, etc.) on the path between the observation mirror 1, the server and the terminal.
  • Since the observation mirror 1 transmits the image data as a digital signal, there may also be one or more repeaters for regenerating attenuated digital signals.
  • A remote doctor, that is, a doctor located at a distance from the observation mirror 1, may analyze the image data transmitted to the server or the terminal and provide the analysis result and/or a related prescription to the user of the observation mirror 1.
  • Alternatively, a virtual doctor, that is, an image data analysis program installed on the server or the terminal, may analyze the image data transmitted to the server or the terminal and provide the analysis result and/or a related prescription to the user of the observation mirror 1.
  • The server or the terminal may authenticate the ID of the user so that the analysis result and/or the related prescription can be delivered to the terminal owned by the user of the observation mirror 1, and may authenticate the ID of the observation mirror 1 so that the analysis result and/or the related prescription can be delivered to the terminal owned by the user of the observation mirror 1.
  • Likewise, the server or the terminal may authenticate the ID of the user so that a big data analysis result or a treatment prognosis comparison result can be delivered to the terminal owned by the user of the observation mirror 1, and may authenticate the ID of the observation mirror 1 so that the big data analysis result or the treatment prognosis comparison result can be delivered to the terminal owned by the user.
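  • The sketch below illustrates one simple way a server could authenticate the ID of a user or of an observation mirror before releasing analysis results. The patent does not specify any authentication scheme, so the salted-hash credential table and the function names here are purely illustrative assumptions.

```python
import hashlib
import hmac
import secrets

# Hypothetical in-memory credential store: ID -> (salt, salted SHA-256 digest).
# A real deployment would keep this in a database, not in process memory.
_CREDENTIALS: dict[str, tuple[bytes, bytes]] = {}

def register(entity_id: str, secret: str) -> None:
    """Register a user ID or observation-mirror ID together with its shared secret."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + secret.encode()).digest()
    _CREDENTIALS[entity_id] = (salt, digest)

def authenticate(entity_id: str, secret: str) -> bool:
    """Return True only if the presented secret matches the registered one."""
    if entity_id not in _CREDENTIALS:
        return False
    salt, expected = _CREDENTIALS[entity_id]
    candidate = hashlib.sha256(salt + secret.encode()).digest()
    return hmac.compare_digest(candidate, expected)
```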
  • The manufactured observation mirror 1 has an overall pistol shape, so that the user can use it while gripping it with the tube 180 connected.
  • The manufactured observation mirror 1 may be divided into the head portion 10, the main body 20 and the connection part 30.
  • The appearance of the manufactured observation mirror 1 is illustrated in FIGS. 6 and 7, and the inside of the manufactured observation mirror 1 is shown in FIG. 8.
  • The observation mirror 1 shown in FIG. 6, in which the head portion 10 and the main body 20 are coupled, is configured so that the head portion 10 and the main body 20 can be separated.
  • The head portion 10 includes the light acquisition unit 110 and the light emitting unit 120 shown in FIG. 8, the tube 180, and the tube holder 130, turn slider 150, fixing part 160 and fixing part groove 170 shown in FIG. 7, and further includes a turn guide 140, not shown in FIGS. 6 to 8, positioned between the tube holder 130 and the turn slider 150.
  • The tube holder 130 includes a tube holder protrusion 131, the turn guide 140 includes a guide surface 141, and the turn slider 150 includes a turn slider groove 151.
  • the tube holder 130 includes a hole as shown in FIG. 7, and the user inserts the tube 180 into the hole as shown in FIG. 8 to use the observation mirror 1.
  • the fixing part 160 is a part which is fixed without being rotated, and a part which is not described above among the remaining parts will be described later.
  • The main body 20 includes the image sensor 210, the image processor 220, the light source unit 260 and the display unit 240, as well as the housing unit 270 and the handle unit 280 shown in FIG. 7, and further includes the memory unit 230 and the communication unit 250, which are not shown in FIGS. 6 to 8.
  • the housing unit 270 accommodates the image sensor 210, the image processor unit 220, the memory unit 230, the communication unit 250, the light source unit 260, and the like.
  • the handle part 280 is a part grasped by the user, and the remaining part has been described above.
  • As shown in FIG. 7, the connection part 30 includes a pin 310 formed at one side of the head portion 10 and a locker 320 formed at one side of the main body 20.
  • The locker 320 includes a locker groove 321, a locker outer wall 322 and a locker inner wall 323.
  • As shown in FIGS. 6 and 7, the head portion 10 can be attached and detached by a structure in which the pin 310 and the locker 320 of the connection part 30 are separated or coupled.
  • FIGS. 9 and 10 show the structure of the connection part 30 in detail; parts unnecessary for explaining the attachment and detachment of the head portion 10 are not shown.
  • the user inserts the pin 310 into the locker groove 321 and rotates the locker 320 clockwise or counterclockwise to couple the head portion 10 to the main body portion 20.
  • In this state, the fixing part groove 170 is engaged with the locker outer wall 322, so even if the user moves the head portion 10 back and forth, the head portion 10 cannot be separated from the main body 20.
  • Likewise, the pin 310 inserted into the locker groove 321 is blocked by the locker inner wall 323 shown in FIG. 10, so the head portion 10 cannot be separated from the main body 20.
  • To separate the head portion 10 from the main body 20, the user rotates the locker 320 clockwise or counterclockwise to align the locker groove 321 with the pin 310 and then pulls the head portion 10 forward.
  • The focal length can be adjusted by a structure that converts the rotational motion of the head portion 10 into linear motion. The structure of such a head portion 10 is shown in detail in FIGS. 11 and 12, in which parts unnecessary for explaining the focal length adjustment are not shown.
  • The head portion 10 has a structure including the tube holder 130, which has the tube holder protrusion 131 and a narrow front end and a wide rear end; the turn guide 140, which has the guide surface 141 and surrounds a portion of the tube holder 130; and the turn slider 150, which has the turn slider groove 151 engaged with the tube holder protrusion 131 and surrounds a portion of the turn guide 140.
  • When the turn slider 150 is rotated, the tube holder 130 rotates counterclockwise and moves backwards (at this time the tube holder protrusion 131 moves along the guide surface 141), and the focal length is shortened.
  • Meanwhile, the turn guide 140 and the fixing part 160, which is not shown in FIGS. 11 and 12, do not move. That is, the focal length is adjusted while the rotational motion of the turn slider 150 is converted into the linear motion of the tube holder 130.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • Physiology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Endoscopes (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to: an observation scope which has a detachable head part and whose focal length is adjusted by the structure of the head part; and an image data processing system, comprising the same, that enables remote medical examination. The invention provides the observation scope for photographing the inside of an object, the observation scope comprising: the head part including a light acquisition unit for generating an optical signal by acquiring light reflected from the inside of the object; and a main body part coupled to the head part and having a function of receiving the optical signal so as to generate image data, wherein the head part can be separated from the main body part.

Description

Observation mirror for telemedicine and image processing device and system including same
The present invention relates to an observation mirror for telemedicine and to an image processing apparatus and system including the same, and more particularly, to an observation mirror having a detachable head part whose focal length is adjusted by the structure of the head part, and to an image data processing system, including the same, that enables telemedicine.
The observation mirror is a device that allows a user to observe the inside of an object by photographing the inside of the object through a tube of the observation mirror inserted into the object. Observation mirrors may be used by inserting a rigid or flexible tube into the inner wall of a building or the inside of a machine, or by inserting a needle-type, rubber-tube-type or plastic-tip-type tube into a human or animal body. In particular, when the observation mirror is used on the human body, the inside of the body (bronchi, esophagus, chest cavity, heart, stomach, intestine, abdominal cavity, bladder, anus, nasal cavity, eardrum, etc.) can be examined without open or incisional surgery, so it can be usefully employed in the diagnosis of various diseases.
Because the tube of an observation mirror is inserted into the inside of the object, contamination of the tube and its vicinity is a problem. In particular, when a tube is inserted into the human body in an operating room or the like, a high level of cleanliness is required. However, conventional observation mirrors are very difficult to sterilize or disinfect because of their complicated structure and design.
Therefore, the observation mirror needs to have a structure that is easy to clean, sterilize or disinfect. In this regard, Korean Patent No. 0449349 (title of the invention: Endoscope Having an Improved Flexible Insertion Tube, hereinafter referred to as Prior Art 1) discloses a flexible endoscope having a flexible insertion tube formed of a tubular, biocompatible elastic sheath surrounding an interior space, wherein the insertion tube further comprises a vapor barrier between the sheath and the interior space, so that the vapor barrier prevents vapor from passing from the ambient atmosphere through the sheath into the interior space and also prevents vapor from reacting with material in the interior space to produce substances harmful to the elastic sheath.
Meanwhile, the IT market has been changing very rapidly in recent years, with news of new smart devices and smart technologies appearing every day. In particular, with advances in sensor technology and wireless communication technology, the healthcare market is changing rapidly: users manage their own health using smart devices, and remote monitoring systems and telemedicine systems are being developed. In relation to telemedicine, Korean Patent No. 1512068 (title of the invention: Telemedicine Service System and Method, hereinafter referred to as Prior Art 2) discloses a telemedicine service method comprising: acquiring an image of a telemedicine service target through the camera of a portable terminal; collecting a patient state including a plurality of pieces of questionnaire information having a hierarchical relationship to the acquired image; recognizing a telemedicine service type from the acquired image and performing an analysis corresponding to the recognized service type to generate analysis result data; matching the collected patient state to the generated analysis result data and broadcasting it through at least one service server linked through a network; and retrieving and collecting results for the data broadcast through the linked service servers and displaying them on the portable terminal. The questionnaire information includes answers entered in response to a plurality of preset questions for each medical service type selected by the user operating the portable terminal or for each recognized telemedicine service type. When any one item of data is selected from the result data retrieved through the service servers, a telemedicine service server, which communicates with the portable terminal and interworks with a plurality of service servers specialized for a plurality of medical service types, simultaneously displays on the portable terminal an evaluation history of the corresponding data collected through each of the linked service servers. The evaluation history is a record of data quality in which the telemedicine service server collects, from the user after completion of the telemedicine service or during the service, the satisfaction with the data provided by each service server, and maps, stores and manages it in correspondence with that service server. The data collected for each service server is medical service data specialized according to the symptoms corresponding to the recognized service type, and the specialized medical service data includes medical service data recommending a pouch product for treating a stoma, nursing techniques for wound care, and data related to nutrition, exercise and lifestyle recommendations and medical counseling related to cancer.
The technical problems to be solved by the present invention are as follows: a first problem in that Prior Art 1 is limited in the degree of sterilization and disinfection that can be achieved; a second problem in that Prior Art 1 does not allow the user to adjust the focal length; a third problem in that Prior Art 1 uses only a flexible tube and is therefore difficult to apply to various fields; a fourth problem in that, although Prior Art 1 is an invention intended to improve sterilization and disinfection, contamination of the endoscope remains an issue when the image acquired by the endoscope is checked in a place requiring cleanliness, such as an operating room; a fifth problem in that the image acquired by the endoscope of Prior Art 1 cannot be shared by several people; a sixth problem in that Prior Art 1 provides the user with no feedback; and a seventh problem in that Prior Art 2, although it has the advantage over Prior Art 1 that the user can receive telemedicine, is limited to telemedicine for wound sites and the like using the camera of a portable terminal.
The technical problems to be achieved by the present invention are not limited to those mentioned above, and other technical problems not mentioned will be clearly understood by those of ordinary skill in the art to which the present invention pertains from the description below.
In order to solve the above problems, the present invention provides an observation scope for photographing the interior of an object, comprising: a head unit including a light acquisition unit that acquires light reflected from the interior of the object and generates an optical signal; and a main body unit coupled to the head unit and having a function of receiving the optical signal and generating image data, wherein the head unit can be separated from the main body unit.
In addition, according to an embodiment of the present invention, the head unit may have a function of adjusting the focal length.
In addition, according to an embodiment of the present invention, the head unit may have a structure that converts rotational motion into linear motion, and the focal length may be adjusted by this structure.
In addition, according to an embodiment of the present invention, the main body unit may include an image sensor that receives the optical signal and generates an image signal.
In addition, according to an embodiment of the present invention, the main body unit may further include an image processor unit that processes the image signal to generate the image data.
In addition, according to an embodiment of the present invention, the head unit may further include a tube that includes a lens and a plurality of optical fibers and that is inserted into the interior of the object to transmit the reflected light to the light acquisition unit.
In addition, according to an embodiment of the present invention, the lens may be a fisheye lens or a wide-angle lens.
In addition, according to an embodiment of the present invention, the image processor unit may include an image correction unit that receives the image signal and corrects a distorted image.
In addition, according to an embodiment of the present invention, the main body unit may include a display unit that displays the image data.
In addition, according to an embodiment of the present invention, the display unit may be a fixed or detachable display unit.
In addition, according to an embodiment of the present invention, the display unit may be a foldable display unit.
In addition, according to an embodiment of the present invention, the main body unit may include a communication unit that transmits the image data to an external device.
In addition, according to an embodiment of the present invention, the main body unit may further include a memory unit in which the image data is stored as a recording file upon a recording request from the user.
In addition, according to an embodiment of the present invention, the user may download the recording file to a USB flash drive or a memory card connected to the communication unit.
In addition, according to an embodiment of the present invention, the main body unit may include a light source unit that generates the light to be irradiated into the interior of the object.
The present invention also provides an observation scope system comprising the observation scope and a terminal that receives the image data from the communication unit through wired or wireless communication means.
In addition, according to an embodiment of the present invention, the wireless communication means may be one or more of infrared (IR) communication, radio frequency (RF) communication, wireless LAN, Wi-Fi, WiBro, ultra-wideband (UWB) communication, Bluetooth, ZigBee, and near field communication (NFC).
In addition, according to an embodiment of the present invention, the observation scope and the terminal may share the image data through mirroring via the wired or wireless communication means.
In addition, according to an embodiment of the present invention, the main body unit may further include a cradle on which the terminal is mounted.
The present invention also provides an observation scope system comprising the observation scope, a first terminal that shares the image data with the observation scope through mirroring via a first communication means, and a second terminal that shares the image data with the first terminal through mirroring via a second communication means.
The present invention also provides an observation scope system comprising the observation scope and a server that receives and stores the image data from the communication unit.
In addition, according to an embodiment of the present invention, the system may further include a terminal that receives the image data from the server through wired or wireless communication means.
In addition, according to an embodiment of the present invention, a remote doctor may analyze the image data transmitted to the server or the terminal and provide the analysis result to the user of the observation scope.
In addition, according to an embodiment of the present invention, a virtual doctor may analyze the image data transmitted to the server or the terminal and provide the analysis result to the user of the observation scope.
In addition, according to an embodiment of the present invention, the server or the terminal may authenticate the ID of the user.
In addition, according to an embodiment of the present invention, the server or the terminal may authenticate the ID of the observation scope.
The present invention also provides a method of observing the interior of an object using the observation scope, comprising: a light source unit generating light to be irradiated into the interior of the object; irradiating the generated light into the interior of the object; a light acquisition unit acquiring the light reflected from the interior of the object and generating an optical signal; an image sensor receiving the generated optical signal and generating an image signal; an image processor unit processing the generated image signal to generate image data; and a display unit displaying the generated image data.
In addition, according to an embodiment of the present invention, the step of the image processor unit processing the generated image signal to generate image data may comprise: an image correction unit receiving the generated image signal; the image correction unit correcting a distorted image to generate a corrected image; and an image data generation unit generating image data for the corrected image.
In addition, according to an embodiment of the present invention, the step of the image correction unit correcting the distorted image to generate the corrected image may include the image correction unit performing geometric correction of the distorted image using a distortion correction algorithm.
In addition, according to an embodiment of the present invention, the distortion correction algorithm may include one or more of a focal length, a principal point, a radial distortion coefficient, and a tangential distortion coefficient.
In addition, according to an embodiment of the present invention, the step of the image correction unit correcting the distorted image to generate the corrected image may further comprise, after the step of performing geometric correction of the distorted image using the distortion correction algorithm: the image correction unit extracting vanishing point coordinates from the distorted image; the image correction unit calculating, from the vanishing point coordinates, the correction coefficients that constitute a projection algorithm; and the image correction unit generating the corrected image using the projection algorithm, the projection algorithm being expressed by Equation 1.
In addition, according to an embodiment of the present invention, the correction coefficients may be expressed by Equation 2.
The present invention has a first effect in that the head unit is detachable, so that the head unit or the main body unit can be easily cleaned, sterilized, or disinfected; a second effect in that the user can adjust the focal length and thus obtain an image of the desired quality; a third effect in that tubes of various forms can be used, allowing application to various fields; a fourth effect in that the image can be checked on a terminal connected to the observation scope by wireless communication means, solving the problem of display contamination and improving convenience; a fifth effect in that the image acquired by the observation scope can be shared in real time or recorded and obtained afterwards; a sixth effect in that the user can receive feedback based on the image acquired by the observation scope; a seventh effect, related to the sixth effect, in that telemedicine based in particular on images of the inside of the body of the user and/or the target patient becomes possible; an eighth effect in that the image acquired by the observation scope can be checked at a remote location; a ninth effect in that the image acquired by the observation scope is stored on a server, making the acquired image easily accessible; and a tenth effect in that, when the images acquired by the observation scope accumulate on the server, the accumulated image data can be applied to big data analysis or to comparison of treatment prognoses.
The effects of the present invention are not limited to the effects described above, and should be understood to include all effects that can be inferred from the configuration of the invention described in the detailed description or the claims.
FIG. 1 is a block diagram showing an embodiment of the observation scope of the present invention.
FIG. 2 is a schematic diagram showing a distorted image before the projection algorithm is applied.
FIG. 3 is a structural diagram showing an embodiment of the observation scope system of the present invention.
FIG. 4 is a structural diagram showing an embodiment of the observation scope system of the present invention.
FIG. 5 is a structural diagram showing an embodiment of the observation scope system of the present invention.
FIG. 6 is a perspective view showing an embodiment of the observation scope of the present invention.
FIG. 7 is a perspective view showing an embodiment of the observation scope of the present invention.
FIG. 8 is a perspective cross-sectional view showing an embodiment of the observation scope of the present invention.
FIG. 9 is a perspective view showing, as an embodiment, the head unit of the observation scope of the present invention.
FIG. 10 is a perspective view showing, as an embodiment, the locker of the observation scope of the present invention.
FIG. 11 is a perspective view showing, as an embodiment, the head unit of the observation scope of the present invention.
FIG. 12 is a perspective view showing, as an embodiment, the head unit of the observation scope of the present invention.
In the best mode among the embodiments of the observation scope of the present invention, the observation scope for photographing the interior of an object comprises a head unit including a light acquisition unit that acquires light reflected from the interior of the object and generates an optical signal, and a main body unit coupled to the head unit and having a function of receiving the optical signal and generating image data, wherein the head unit can be separated from the main body unit.
Hereinafter, the present invention will be described with reference to the accompanying drawings. However, the present invention may be embodied in many different forms and is therefore not limited to the embodiments described herein. In the drawings, parts irrelevant to the description are omitted in order to describe the present invention clearly, and like reference numerals designate like parts throughout the specification.
Throughout the specification, when a part is said to be "connected (joined, in contact with, coupled)" to another part, this includes not only the case where it is "directly connected" but also the case where it is "indirectly connected" with another member interposed between them. In addition, when a part is said to "include" a certain component, this means that it may further include other components, rather than excluding them, unless specifically stated otherwise.
The terminology used herein is for describing particular embodiments only and is not intended to limit the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this specification, terms such as "comprise" or "have" are intended to indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and should be understood not to preclude in advance the presence or possible addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
The observation scope 1 of the present invention is a device for photographing the interior of an object: it irradiates light into the interior of the object and then acquires the light reflected from the interior to generate image data. Here, the interior of the object may be a body part of a patient, a body part of an animal, the inner wall of a building, the inside of a pipe, or the inside of a vehicle (the same applies hereinafter), but is not limited thereto. The observation scope 1 comprises a head unit 10 and a main body unit 20. The head unit 10 includes a light acquisition unit 110 and may further include a light emission unit 120 or a tube 180. The main body unit 20 coupled to the head unit 10 has, among others, a function of receiving the optical signal transmitted from the head unit 10 and generating image data, and may include one or more of an image sensor 210, an image processor unit 220, a memory unit 230, a display unit 240, a communication unit 250, and a light source unit 260. FIG. 1 shows an embodiment of the observation scope 1 in a block diagram, so that not only the configuration of the observation scope 1 but also the method of observing the interior of an object using it can be grasped at a glance. Hereinafter, each component of the observation scope 1 will be described in detail with reference to FIG. 1, through which the method of observing the interior of an object using the observation scope 1 will also become clear.
The tube 180 is inserted into the interior of the object; when observing the interior of the object with the observation scope 1, the user inserts the tube 180 into the object and observes an image of its interior. The tube 180 may be a needle type capable of piercing the skin of the object, a rigid type (for example, a straight metal tube or the insertion portion of an otoscope), a flexible type (for example, made of rubber and freely bendable), or an intermediate type between the rigid and flexible types (with a rigidity or flexibility between the two), but is not limited thereto. The tube 180 includes a lens and a plurality of optical fibers. When the tube 180 is inserted into the interior of the object, the light generated by the light source unit 260, described later, passes through some of the optical fibers of the tube 180 (hereinafter referred to as the light emission line) and through the lens to reach the interior of the object, and the light reflected from the interior of the object passes back through the lens of the tube 180 and the remaining optical fibers (hereinafter referred to as the imaging line) and is processed in a predetermined manner inside the observation scope 1. The lens may be one or more of a standard lens, a telephoto lens, a micro lens, a wide-angle lens, and a fisheye lens, but is not limited thereto.
The light source unit 260 generates the light to be irradiated into the interior of the object. The light source unit 260 may include a lamp using a laser emitting diode (LED) or a laser diode (LD) as its light source, but the light source is not limited thereto. In addition, when the LED lamp emits light in the UV-blue band, the light source unit 260 may further include a color conversion plate made of glass or a ceramic material to convert the UV-blue light into white light; however, the configuration of the light source unit 260 is not limited to these.
The light emission unit 120 is coupled to the light source unit 260 and irradiates the light generated by the light source unit 260 into the interior of the object. The light generated by the light source unit 260 is irradiated into the interior of the object through the light emission line connected to the light emission unit 120. That is, although FIG. 1 simply shows with an arrow that the light generated by the light source unit 260 is irradiated into the interior of the object, more specifically the light generated by the light source unit 260 may be transmitted sequentially through the light emission unit 120 coupled to the light source unit 260, the light emission line connected to the light emission unit 120, and the lens, and into the interior of the object.
The light acquisition unit 110 acquires the light reflected from the interior of the object. FIG. 1 simply shows with an arrow that the light reflected from the interior of the object reaches the head unit 10, but more specifically the reflected light may be transmitted sequentially through the lens, the imaging line, and the light acquisition unit 110 connected to the imaging line. The light acquisition unit 110, having acquired the light reflected from the interior of the object, generates an optical signal and transmits it to the image sensor 210 described later.
So far, the tube 180, the light emission unit 120, and the light acquisition unit 110, which are components of the head unit 10, have been described, and among the components of the main body unit 20 only the light source unit 260 has been described. Before describing the image sensor 210 of the main body unit 20, the attachment and detachment of the head unit 10 will be described.
When a user observes the interior of an object with the observation scope 1, the observation scope 1 is used with the head unit 10 and the main body unit 20 coupled, but the head unit 10 can be separated from the main body unit 20. In conventional devices it was generally not easy to separate the head unit 10 from the main body unit 20, because the optical fiber (imaging line) of the tube 180 usually extended all the way to the image processor unit 220 of the main body unit 20. In the present invention, however, the light emission unit 120 connected to the light emission line can be coupled to or separated from the light source unit 260, and the light acquisition unit 110 connected to the imaging line is spaced apart from the image sensor 210 or the image processor unit 220; therefore the head unit 10, which includes the light emission line, the light emission unit 120, the imaging line, and the light acquisition unit 110, can be separated from the main body unit 20, which includes the light source unit 260. Second, the observation scope 1 includes, in addition to the head unit 10 and the main body unit 20, a connection unit 30 that connects the head unit 10 and the main body unit 20, and the structure of the connection unit 30 enables the head unit 10 to be separated and coupled. The connection unit 30 may have a simple structure in which the head unit 10 is separated by pulling it away from the main body unit 20 and coupled by pushing it toward the main body unit 20; with such a structure, however, the head unit 10 could come off the main body unit 20 while the user is using the observation scope 1. It is therefore preferable that the connection unit 30 have a structure that allows the head unit 10 to be separated and coupled while holding the head unit 10 fixed when it is coupled to the main body unit 20. Various such structures are possible, and one example will be described in an embodiment below.
The head unit 10 has a function of adjusting the focal length, where the focal length means the distance from the lens of the tube 180 to the image sensor 210 of the main body unit 20. The user can couple head units 10 of various shapes to the main body unit 20, couple tubes 180 of various shapes to the other components of the head unit 10, and use various kinds of lenses as components of the tube 180. Since the focal length can therefore vary depending on one or more of the shape of the head unit 10, the shape of the tube 180, and the type of lens, the observation scope 1 needs a function for adjusting this focal length. As described above, when the connection unit 30 has the simple structure in which the head unit 10 is separated by pulling it away from the main body unit 20 and coupled by pushing it toward the main body unit 20, the head unit 10 can be moved back and forth in the process of separating or coupling it, so pulling the head unit 10 forward lengthens the focal length and pushing it back shortens it. Also, regardless of the structure of the connection unit 30, the focal length can be adjusted by a structure provided in the head unit 10; such a structure may be one that converts rotational motion into linear motion, such as a rack-and-pinion gear, but is not limited thereto. One example of a structure that converts rotational motion into linear motion will be described in an embodiment below.
As shown in FIG. 1, the image sensor 210 receives the optical signal from the light acquisition unit 110 and generates an image signal, which is an electrical signal. The image sensor 210 may be a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD), but is not limited thereto. The image sensor 210 may have a pixel count corresponding to VGA, SVGA, or SXGA as required, but is not limited thereto.
As shown in FIG. 1, the image processor unit 220 processes the image signal generated by the image sensor 210 to generate image data. Specifically, the image processor unit 220 may include one or more of a signal conversion unit 221, an image correction unit 222, and an image data generation unit 223.
When the image sensor 210 generates an analog image signal, the signal conversion unit 221 converts it into a digital image signal. If necessary, it may perform noise removal before converting to a digital image signal, and it may also convert between color coordinate systems such as YUV and RGB.
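As an illustration of the kind of color coordinate conversion mentioned above, the following is a minimal sketch in Python/NumPy of an RGB-to-YUV conversion; the use of the BT.601 weighting coefficients and the function name are assumptions made only for illustration and are not specified by the present invention.

    import numpy as np

    def rgb_to_yuv(rgb):
        # rgb: H x W x 3 array of floats in [0, 1] (assumed layout).
        # Returns an H x W x 3 array holding the Y, U, V planes (BT.601 weights).
        m = np.array([[ 0.299,    0.587,    0.114  ],   # Y
                      [-0.14713, -0.28886,  0.436  ],   # U
                      [ 0.615,   -0.51499, -0.10001]])  # V
        return rgb @ m.T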
The image correction unit 222 receives the image signal from the image sensor 210 or the signal conversion unit 221 and then corrects the distorted image to generate a corrected image. The image correction unit 222 can not only perform ordinary color correction or gamma correction, but can also perform geometric correction of the distorted image using a distortion correction algorithm and remove the perspective effect of the distorted image using a projection algorithm. As for the order of correction, it is preferable to perform the geometric correction first and then remove the perspective effect.
Geometric correction and perspective-effect removal are particularly effective when a fisheye lens or a wide-angle lens is used. Fisheye lenses and wide-angle lenses have shorter focal lengths than a standard lens: a fisheye lens has an angle of view of at least 120 degrees, enabling image acquisition over a very wide area, and a wide-angle lens has an angle of view of roughly 60 to 120 degrees, which, although not as wide as a fisheye lens, also enables acquisition of wide images. However, because their angles of view are wide, fisheye and wide-angle lenses distort objects and exaggerate perspective. Therefore, when the interior of an object is observed using a fisheye or wide-angle lens, the interior is distorted and a sense of perspective arises around the periphery of the region of interest, making it difficult to judge the exact state of the interior.
First, geometric correction will be described. A real lens cannot be perfect and distortion exists, because it is much easier to manufacture a spherical lens than a mathematically ideal parabolic lens, and it is also not easy to align the lens perfectly with the image sensor 210. Lens distortion is largely divided into radial distortion and tangential distortion. Radial distortion is caused by the shape of the lens and generally refers to the phenomenon in which pixel positions are convexly distorted near the edges of the image sensor 210. This convex phenomenon is also called "barrel distortion" and can cause the fisheye effect. Tangential distortion is caused by the image sensor 210 and the lens not being parallel, and arises during the manufacture of the observation scope 1. The distortion correction algorithm can be expressed as a function including one or more of the focal length, the principal point, the radial distortion coefficients, and the tangential distortion coefficients, and the image correction unit 222 can correct geometric distortions such as radial distortion and tangential distortion using the distortion correction algorithm. Geometric correction is a well-known theory, and those skilled in the art of image processing will understand it without further detailed description.
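To make the roles of these parameters concrete, the following is a minimal sketch of geometric correction in Python using OpenCV, in which a camera matrix is built from a focal length and a principal point and combined with radial and tangential distortion coefficients; the numeric values are placeholders, and the present invention does not prescribe OpenCV or any particular parameter values.

    import numpy as np
    import cv2

    def geometric_correction(distorted):
        # Camera matrix from focal length (fx, fy) and principal point (cx, cy); placeholder values.
        fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
        camera_matrix = np.array([[fx, 0.0, cx],
                                  [0.0, fy, cy],
                                  [0.0, 0.0, 1.0]])
        # Distortion coefficients: k1, k2 (radial), p1, p2 (tangential), k3 (radial); placeholder values.
        dist_coeffs = np.array([-0.30, 0.08, 0.001, 0.001, 0.0])
        # Remove radial and tangential distortion from the frame produced by the image sensor.
        return cv2.undistort(distorted, camera_matrix, dist_coeffs)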
Next, the removal of the perspective effect will be described. FIG. 2 shows an example of a distorted image: a quadrilateral whose geometric correction has been completed but whose sides are curved because of the perspective effect. Such an image with curved sides is produced because, when a rectangle is photographed with a camera, the part of the rectangle close to the camera appears large and the part far from the camera appears small, creating a sense of perspective. Likewise, when the observation scope 1 photographs the interior of an object, if an image with perspective applied to the interior is obtained, it is difficult to grasp the exact shape of the interior. The perspective effect of the distorted image therefore needs to be removed. As the first step, the image correction unit 222 extracts vanishing point coordinates from the distorted image. In FIG. 2, the first vanishing point P1(X_P1, Y_P1), where the extensions of line AB and line DC meet, and the second vanishing point P2(X_P2, Y_P2), where the extensions of line AD and line BC meet, are extracted. In FIG. 2, the first vanishing point P1 is taken to lie on the x axis and the second vanishing point P2 on the y axis. As the second step, the image correction unit 222 calculates, from the vanishing point coordinates, the correction coefficients that constitute the projection algorithm. To this end, a projection matrix [P] is first generated as in Equation 3 below.
[Equation 3]
[P] = [A][B] = [ [1, 0, 0, a],
                 [0, 1, 0, b],
                 [0, 0, 0, 0],
                 [0, 0, 0, 1] ]
(a and b are the correction coefficients; the same applies hereinafter.)
The projection matrix [P] of Equation 3 is the matrix generated by multiplying the original matrix [A] by the matrix [B], and it converts a three-dimensional spatial image into a two-dimensional planar image. The correction coefficients a and b are the parameters required to extract the distorted-image coordinates (x, y) from the corrected-image coordinates (X, Y).
Transforming an arbitrary point in three-dimensional space through the projection matrix [P] can be expressed as Equation 4 below.
[Equation 4]
[x, y, z, 1][P] = [x, y, 0, ax + by + 1]
Then, in order to express the third and fourth columns of Equation 4 in basis-vector form, dividing Equation 4 by (ax + by + 1) gives Equation 5 below.
[Equation 5]
[x, y, z, 1][P] / (ax + by + 1) = [x/(ax + by + 1), y/(ax + by + 1), 0, 1] = [X, Y, 0, 1]
Here, as shown in the last expression of Equation 5, x/(ax + by + 1) and y/(ax + by + 1) can be replaced by X and Y, respectively. That is, [X, Y, 0, 1] is the two-dimensional point obtained by transforming an arbitrary three-dimensional point [x, y, z, 1] through the projection matrix [P], where the fourth column of each matrix is the basis vector used to calculate the correction coefficients a and b. Meanwhile, since in FIG. 2 the first vanishing point P1(X_P1, Y_P1) and the second vanishing point P2(X_P2, Y_P2) are represented as points on the x axis and the y axis, respectively, transforming the unit vector [1, 0, 0, 0] on the x axis and the unit vector [0, 1, 0, 0] on the y axis through the projection matrix [P] can be expressed as Equation 6 below.
[Equation 6]
[1, 0, 0, 0][P] = [1, 0, 0, a]
[0, 1, 0, 0][P] = [0, 1, 0, b]
Then, in order to express the third and fourth columns of Equation 6 in basis-vector form, dividing the first row of Equation 6 by a and the second row by b gives Equation 7 below.
[Equation 7]
[1/a, 0, 0, 1] = [X_P1, 0, 0, 1]
[0, 1/b, 0, 1] = [0, Y_P2, 0, 1]
(X_P1 is the x coordinate of the first vanishing point when the first vanishing point is placed on the x axis, and Y_P2 is the y coordinate of the second vanishing point when the second vanishing point is placed on the y axis; the same applies hereinafter.)
From Equation 7, the relationship of Equation 2 below is obtained.
[Equation 2]
a = 1/X_P1, b = 1/Y_P2
Meanwhile, rearranging Equation 5 gives Equation 8 below.
[Equation 8]
x = X(ax + by + 1)
y = Y(ax + by + 1)
Solving the simultaneous equations of Equation 8 yields the projection algorithm of Equation 1 below.
[Equation 1]
x = X / (1 - aX - bY)
y = Y / (1 - aX - bY)
(x, y: distorted-image coordinates; X, Y: corrected-image coordinates; a, b: correction coefficients)
The projection algorithm expressed by Equation 1 can be used by substituting the values 1/X_P1 and 1/Y_P2 calculated in Equation 2 for the correction coefficients a and b. By using Equation 1, the distorted-image coordinates (x, y) are extracted from the corrected-image coordinates (X, Y), and the pixel of the distorted image is placed at the corresponding pixel of the corrected image. This is an inverse mapping approach: the corrected image is assumed in advance, and for each pixel of the corrected image the matching pixel of the distorted image is found. In this way, the image correction unit 222 generates the corrected image by inverse mapping using Equation 1. If the perspective effect has not been removed from the generated corrected image as much as expected, the user can adjust the correction coefficients a and b so that the image correction unit 222 generates a new corrected image.
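The inverse mapping described above can be sketched as follows in Python/NumPy: the correction coefficients are obtained from the vanishing point coordinates as in Equation 2, each corrected-image pixel (X, Y) is mapped back to distorted-image coordinates (x, y) with Equation 1, and the nearest distorted pixel is copied. Treating raw pixel indices directly as the coordinates of Equations 1 and 2 and using nearest-neighbour sampling are simplifying assumptions made only for illustration.

    import numpy as np

    def remove_perspective(distorted, x_p1, y_p2):
        # distorted: H x W (x C) distorted image; x_p1, y_p2: vanishing point coordinates.
        a, b = 1.0 / x_p1, 1.0 / y_p2                    # Equation 2
        h, w = distorted.shape[:2]
        Y, X = np.mgrid[0:h, 0:w].astype(np.float64)     # corrected-image coordinates (X, Y)
        denom = 1.0 - a * X - b * Y                      # denominator of Equation 1
        denom[np.abs(denom) < 1e-9] = 1e-9               # guard against division by zero
        x = X / denom                                    # Equation 1: x = X / (1 - aX - bY)
        y = Y / denom                                    # Equation 1: y = Y / (1 - aX - bY)
        xi = np.clip(np.rint(x), 0, w - 1).astype(int)   # nearest-neighbour sampling
        yi = np.clip(np.rint(y), 0, h - 1).astype(int)
        return distorted[yi, xi]                         # corrected image

If the perspective effect is not sufficiently removed, a and b can simply be overridden with adjusted values instead of the ones derived from the vanishing points, which corresponds to the user adjustment described above.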
The image data generation unit 223 generates image data for the corrected image generated by the image correction unit 222. The image data generation unit 223 may generate the image data by scaling the corrected image to fit the size of the display unit 240, described later, and may, if necessary, include an encoder that compression-encodes the scaled corrected image so that the image data generated by scaling can also be compressed.
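A minimal sketch of such scaling and optional compression encoding, written in Python with OpenCV, is shown below; the display resolution and the choice of JPEG encoding are assumptions for illustration only and are not required by the present invention.

    import cv2

    DISPLAY_W, DISPLAY_H = 800, 480   # assumed resolution of the display unit

    def make_image_data(corrected, compress=True):
        # Scale the corrected image to the size of the display unit.
        scaled = cv2.resize(corrected, (DISPLAY_W, DISPLAY_H), interpolation=cv2.INTER_LINEAR)
        if not compress:
            return scaled
        # Compression-encode the scaled image (here as JPEG at quality 90).
        ok, buf = cv2.imencode(".jpg", scaled, [int(cv2.IMWRITE_JPEG_QUALITY), 90])
        return buf if ok else scaled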
As shown in FIG. 1, the memory unit 230 is the area in which image data is stored by the image processor unit 220. The memory unit 230 may be one or more of a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk, but is not limited thereto. In addition, when the user wants to record the images captured by the observation scope 1, the image data may be stored in the memory unit 230 as a recording file in response to the user's recording request. The distortion correction algorithm and the projection algorithm described above may also be stored in the memory unit 230 so that the image correction unit 222 can refer to them when correcting a distorted image; in particular, when removing the perspective effect from a distorted image, the image correction unit 222 may calculate the correction coefficients and store the projection algorithm with the calculated correction coefficients applied in the memory unit 230.
As shown in FIG. 1, the display unit 240 displays the image data generated by the image processor unit 220. The display unit 240 may include a display device selected from among a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an electroluminescent display (ELD), a plasma display panel (PDP), a field emission display (FED), and an electrophoretic display (EPD), but is not limited thereto. The display unit 240 may be a fixed display unit 240 in which the display device is fixed to the main body unit 20, or a detachable display unit 240 from which the display device can be removed. In the case of the detachable display unit 240, the main body unit 20 may be provided with a cradle (not shown) on which the display device is mounted. The display unit 240 may also be a foldable display unit 240 provided with a cover that covers the screen of the display device when folded and reveals the screen when unfolded.
As shown in FIG. 1, the communication unit 250 transmits the image data generated by the image processor unit 220 to an external device, which makes possible the telemedicine described later. The communication unit 250 may transmit the image data to an external device by wire, and for this purpose may be provided with a USB port or a memory card slot, but is not limited thereto. The communication unit 250 may also transmit the image data to an external device wirelessly, in which case it may use one or more wireless communication means among infrared (IR) communication, radio frequency (RF) communication, wireless LAN, Wi-Fi, WiBro, ultra-wideband (UWB) communication, Bluetooth, ZigBee, and near field communication (NFC), but is not limited thereto. When the image data has been stored as a recording file in response to the user's recording request, the user can download the recording file by connecting a USB flash drive to the USB port of the communication unit 250 or a memory card to the memory card slot of the communication unit 250.
The image processor unit 220 may include a direct memory access (DMA) controller (not shown) for fast storage and reading of image data, so that image data can be exchanged among the memory unit 230, the display unit 240, the communication unit 250, and the image processor unit 220 in a DMA manner.
The observation scope system of the present invention comprises the observation scope 1 and a terminal, and FIGS. 3 to 5 show embodiments of the observation scope system. Hereinafter, each component of the observation scope system will be described in detail with reference to FIGS. 3 to 5.
As described above, the observation scope 1 is a device that irradiates light into the interior of an object and then acquires the light reflected from the interior to generate image data, and comprises the head unit 10 and the main body unit 20.
The terminal is a device connected to the observation scope 1 through wired or wireless communication means. The wired communication means may be a suitable cable, and the wireless communication means may be one or more of infrared (IR) communication, radio frequency (RF) communication, wireless LAN, Wi-Fi, WiBro, ultra-wideband (UWB) communication, Bluetooth, ZigBee, and near field communication (NFC), although other wired or wireless communication means are not excluded. The terminal may be a wired terminal or a wireless terminal, and FIG. 3 shows a wired terminal and a wireless terminal connected to the observation scope 1 through wireless communication means. The wired terminal may be a personal computer (PC) and/or a notebook computer (the same applies hereinafter), but is not limited thereto. The wireless terminal may be one or more of a personal communication system (PCS) terminal, a Global System for Mobile communications (GSM) terminal, a personal digital cellular (PDC) terminal, a personal handyphone system (PHS) terminal, a personal digital assistant (PDA), a smartphone, a telematics or wireless data communication terminal, and a portable Internet terminal; moreover, glasses worn by a doctor may also be equipped with a wireless communication module and receive the image data, so the wireless terminal is not limited to any particular form (the same applies hereinafter). When the main body unit 20 of the observation scope 1 includes the communication unit 250, the terminal receives the image data from the communication unit 250 through the wired or wireless communication means. The main body unit 20 of the observation scope 1 may also include a cradle (not shown) on which the terminal is mounted, so that image data can be transmitted from the observation scope 1 to the terminal through the wired or wireless communication means while the terminal is mounted on the cradle.
The observation scope 1 and the terminal may share a plurality of functions through mirroring by the wired or wireless communication means. For example, when the main body 20 of the observation scope 1 includes the display unit 240, the screen shown on the display unit 240 may be output identically to the terminal through streaming.
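To make the screen-mirroring idea concrete, the following Python sketch streams JPEG-encoded display frames from the scope side to a terminal over a plain TCP socket. It is only a minimal sketch under assumed conditions: the patent does not specify a transport or protocol, and the frame source capture_display_frame() as well as the host and port values are hypothetical placeholders.

    import socket
    import struct
    import time

    def stream_display_frames(capture_display_frame, host="192.168.0.10", port=5000, fps=15):
        # capture_display_frame: callable returning the current display contents as JPEG bytes
        # (hypothetical; the real frame source depends on the device's display pipeline).
        period = 1.0 / fps
        with socket.create_connection((host, port)) as sock:
            while True:
                frame = capture_display_frame()              # JPEG bytes of the current screen
                sock.sendall(struct.pack(">I", len(frame)))  # 4-byte big-endian length prefix
                sock.sendall(frame)                          # frame payload
                time.sleep(period)                           # crude pacing, roughly fps frames per second

A matching loop on the terminal side would read the 4-byte length, read that many bytes, decode the JPEG, and draw it, which is all that "the same screen appears on the terminal" requires at the transport level.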
Since the terminal in the observation scope system may mean a plurality of terminals, the observation scope system may comprise the observation scope 1, a first terminal, and a second terminal. The observation scope 1 and the first terminal may share a plurality of functions through mirroring by a first communication means, and the first terminal and the second terminal may share a plurality of functions through mirroring by a second communication means. For example, when the main body 20 of the observation scope 1 includes the display unit 240, the screen shown on the display unit 240 may be output identically to the first terminal through streaming, and the screen output on the first terminal may in turn be output identically to the second terminal through streaming. The first communication means may be the above wired or wireless communication means, and so may the second communication means; the first terminal may be the above wired terminal or wireless terminal, and so may the second terminal. FIG. 4 shows a first wireless terminal connected to the observation scope 1 through a wireless communication means and a second wireless terminal connected to the first wireless terminal through a wireless communication means. Mirroring between the first terminal and the second terminal may be established by the second terminal sending a request for function sharing to the first terminal and the first terminal responding to the received request, although the mirroring is not limited to this scheme. In particular, the first terminal and the second terminal may share more functions than the observation scope 1 and the first terminal do: not only the screen described above, but also audio, sketches, files, camera, location, voice calls, video calls, chat, and the like.
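The request/response establishment of mirroring between the first and second terminals could look roughly like the sketch below, assuming a simple JSON-over-TCP exchange that the patent itself does not prescribe; the message fields, the port number, and the start_mirroring_to() callback are hypothetical.

    import json
    import socket

    def request_mirroring(first_terminal_host, port=5001, features=("screen", "audio", "chat")):
        # Second terminal -> first terminal: ask to share the listed functions.
        request = {"type": "mirroring_request", "features": list(features)}
        with socket.create_connection((first_terminal_host, port)) as sock:
            sock.sendall(json.dumps(request).encode() + b"\n")
            reply = json.loads(sock.makefile().readline())
        return reply  # e.g. {"type": "mirroring_response", "accepted": True, "features": [...]}

    def serve_mirroring_requests(port=5001, start_mirroring_to=None):
        # First terminal: accept one request and answer which functions it will share.
        with socket.create_server(("", port)) as server:
            conn, addr = server.accept()
            with conn:
                request = json.loads(conn.makefile().readline())
                accepted = request.get("type") == "mirroring_request"
                reply = {"type": "mirroring_response",
                         "accepted": accepted,
                         "features": request.get("features", []) if accepted else []}
                conn.sendall(json.dumps(reply).encode() + b"\n")
                if accepted and start_mirroring_to is not None:
                    start_mirroring_to(addr[0])  # begin streaming the shared functions to the requester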
When, in the observation scope system, the terminal is located far from the observation scope 1, the terminal may be connected to the observation scope 1 through the Internet, accessed using one or more of WCDMA, HSDPA, CDMA2000, WiBro, WiMAX, LTE, LTE-Advanced, and Wi-Fi as the wireless communication means, although other wireless communication means are of course not excluded. FIG. 5 shows the observation scope 1 connected to a terminal located far from it. Between the observation scope 1 and the terminal there may be one or more gateways (for example, a gateway configured in a wired/wireless router, a gateway of an Internet service provider, and so on) or one or more access points (for example, a wireless repeater, a base station, and so on), and when the observation scope 1 transmits the image data as a digital signal, there may be one or more repeaters that regenerate the attenuated digital signal.
The observation scope system of the present invention may also comprise the observation scope 1 and a server, and FIG. 5 illustrates one embodiment of such an observation scope system. In this observation scope system, the observation scope 1 is connected to a server on a wired or wireless network, and the image data delivered from the communication unit 250 of the observation scope 1 may be stored in the server. The image data stored in the server can be used by patients, doctors, and other interested persons through appropriate means; from this point of view, the observation scope system may further comprise, in addition to the observation scope 1 and the server, a terminal that receives the image data from the server through a wired or wireless communication means. Here, the terminal may be a wired terminal or a wireless terminal. As shown in FIG. 5, in such an observation scope system there may be one or more gateways (for example, a gateway configured in a wired/wireless router, a gateway of an Internet service provider, and so on) or one or more access points (for example, a wireless repeater, a base station, and so on) between the observation scope 1 and the server connected through the Internet, and when the observation scope 1 transmits the image data as a digital signal, there may be one or more repeaters that regenerate the attenuated digital signal.
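One plausible way for the communication unit's image data to reach such a server is a simple HTTP upload, sketched below with the requests library; the endpoint URL, the scope_id field, and the bearer token are hypothetical, since the patent leaves the transport and server interface unspecified.

    import requests

    def upload_image_data(jpeg_bytes, scope_id, server_url="https://server.example/api/images",
                          token="REPLACE_ME"):
        # Upload one captured image to the storage server and return the server-side record ID.
        response = requests.post(
            server_url,
            files={"image": ("capture.jpg", jpeg_bytes, "image/jpeg")},  # the image payload
            data={"scope_id": scope_id},                                 # which scope produced it
            headers={"Authorization": f"Bearer {token}"},                # assumed auth scheme
            timeout=10,
        )
        response.raise_for_status()
        return response.json().get("image_id")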
Telemedicine can also be carried out in such an observation scope system. A remote doctor, that is, a doctor located far from the observation scope 1, may analyze the image data delivered to the server or the terminal and provide the analysis result and/or a related prescription to the user of the observation scope 1; alternatively, a virtual doctor, that is, an image data analysis program installed on the server or the terminal, may analyze the image data delivered to the server or the terminal and provide the analysis result and/or a related prescription to the user of the observation scope 1. In this case, the server or the terminal may authenticate the ID of the user of the observation scope 1 so that the analysis result and/or the related prescription is delivered to a terminal owned by that user, or may authenticate the ID of the observation scope 1 itself so that the analysis result and/or the related prescription is delivered to a terminal owned by the user of the observation scope 1.
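The "virtual doctor" path, an analysis program on the server that returns a result only to an authenticated user, might be organized as in the sketch below. The database interface, the classifier call, and the push-notification function are hypothetical stand-ins; the point is only the flow of authenticate, analyze, deliver.

    def handle_analysis_request(image_id, claimed_user_id, db, classify_image, notify_terminal):
        # db              -- assumed interface offering get_user(user_id) and get_image(image_id)
        # classify_image  -- the installed analysis program ("virtual doctor"); returns findings
        # notify_terminal -- pushes a message to the terminal registered to that user (assumed)
        user = db.get_user(claimed_user_id)
        if user is None or not user.get("verified"):
            return {"status": "denied", "reason": "unknown or unverified user ID"}

        image = db.get_image(image_id)
        findings = classify_image(image)      # e.g. {"finding": "otitis media suspected", "score": 0.87}
        result = {"status": "ok", "image_id": image_id, "findings": findings}

        notify_terminal(user["terminal_address"], result)  # deliver only to the authenticated user's terminal
        return result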
Also, in such an observation scope system, when a large amount of image data accumulates in the server or the terminal, the accumulated image data may be used for big data analysis or for comparison of treatment prognoses, and the results of the big data analysis or the prognosis comparison may be provided to the user of the observation scope 1. In this case, the server or the terminal may authenticate the ID of the user of the observation scope 1 so that the results are delivered to a terminal owned by that user, or may authenticate the ID of the observation scope 1 itself so that the results are delivered to a terminal owned by the user of the observation scope 1.
Hereinafter, an example of the observation scope 1 of the present invention, described above with reference to FIG. 1, will be set out in detail. Points that were not clear above will become clear through the following example.
[Example]
<Overall structure of the observation scope 1>
An observation scope 1 as shown in FIGS. 6 to 8 was manufactured. The manufactured observation scope 1 has an overall pistol shape, so that a user can attach the tube 180 and grip the scope for use. The manufactured observation scope 1 can be divided into a head portion 10, a main body 20, and a connection part 30; FIGS. 6 and 7 show the exterior of the manufactured observation scope 1, and FIG. 8 shows its interior. The observation scope 1 with the head portion 10 and the main body 20 coupled, shown in FIG. 6, is configured so that the head portion 10 and the main body 20 can be separated, as shown in FIG. 7.
The head portion 10 comprises the light acquisition unit 110, the light emitting unit 120, and the tube 180 shown in FIG. 8; the tube holder 130, the turn slider 150, the fixing part 160, and the fixing part groove 170 shown in FIG. 7; and, although not shown in FIGS. 6 to 8, the turn guide 140 positioned between the tube holder 130 and the turn slider 150. Also, although not shown in FIGS. 6 to 8, the tube holder 130 includes a tube holder protrusion 131, the turn guide 140 includes a guide surface 141, and the turn slider 150 includes a turn slider groove 151. The tube holder 130 has a hole, as shown in FIG. 7; the user can insert the tube 180 into this hole, as shown in FIG. 8, to use the observation scope 1. The fixing part 160 is a part that is fixed and does not rotate; the remaining parts not yet described are described below.
The main body 20 comprises the image sensor 210, the image processor unit 220, and the light source unit 260 shown in FIG. 8; the display unit 240, the housing unit 270, and the handle portion 280 shown in FIG. 7; and, although not shown in FIGS. 6 to 8, the memory unit 230 and the communication unit 250. The housing unit 270 accommodates the image sensor 210, the image processor unit 220, the memory unit 230, the communication unit 250, the light source unit 260, and the like. The handle portion 280 is the part the user grips; the remaining parts have been described above.
The connection part 30 comprises, as shown in FIG. 7, a pin 310 formed on one side of the head portion 10 and a locker 320 formed on one side of the main body 20, and it connects the head portion 10 to the main body 20. Also, although not shown in FIGS. 6 to 8, the locker 320 includes a locker groove 321, a locker outer wall 322, and a locker inner wall 323.
<Attachment and detachment of the head portion 10>
The head portion 10 shown in FIGS. 6 and 7 can be attached and detached by a structure in which the pin 310 and the locker 320 of the connection part 30 are separated or coupled. FIGS. 9 and 10 show the structure of the connection part 30 in detail; parts unnecessary for explaining the attachment and detachment of the head portion 10 are not shown. The user can couple the head portion 10 to the main body 20 by inserting the pin 310 into the locker groove 321 and rotating the locker 320 clockwise or counterclockwise. In FIG. 9, when the user inserts the pin 310 into the locker groove 321, the fixing part groove 170 engages the locker outer wall 322; in this state the user can still separate the head portion 10 from the main body 20 by moving the head portion 10 back and forth. When the user then rotates the locker 320 clockwise or counterclockwise, the pin 310 inserted in the locker groove 321 is blocked by the locker inner wall 323 shown in FIG. 10, so that the head portion 10 can no longer be separated from the main body 20. From that state, the user can separate the head portion 10 from the main body 20 by rotating the locker 320 clockwise or counterclockwise to realign the locker groove 321 with the pin 310 and then pulling the head portion 10 forward.
<Focal length adjustment>
The focal length can be adjusted by a structure of the head portion 10 that converts rotational motion into linear motion; this structure is shown in detail in FIGS. 11 and 12, and parts unnecessary for explaining the focal length adjustment are not shown. As shown in FIGS. 11 and 12, the head portion 10 comprises the tube holder 130, which includes the tube holder protrusion 131 and has a narrow front end and a wide rear end; the turn guide 140, which includes the guide surface 141 and surrounds part of the tube holder 130; and the turn slider 150, which includes the turn slider groove 151 engaging the tube holder protrusion 131 and surrounds part of the turn guide 140. When the user rotates the turn slider 150 counterclockwise, the turn slider groove 151 applies a force to the tube holder protrusion 131, so that the tube holder 130 rotates counterclockwise and moves backward (with the tube holder protrusion 131 traveling along the guide surface 141), and the focal length becomes shorter. Conversely, when the user rotates the turn slider 150 clockwise, the turn slider groove 151 applies a force to the tube holder protrusion 131, so that the tube holder 130 rotates clockwise and moves forward (with the tube holder protrusion 131 traveling along the guide surface 141), and the focal length becomes longer. Here, the turn guide 140 and the fixing part 160, not shown in FIGS. 11 and 12, do not move. In other words, the rotational motion of the turn slider 150 is converted into linear motion of the tube holder 130, thereby adjusting the focal length.
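Numerically, this rotation-to-translation behavior is that of a cam or lead screw: the axial travel of the tube holder is proportional to the turn slider's rotation angle and the effective lead of the guide surface. The small Python sketch below illustrates that relation; the lead value, the sign convention, and the assumption that the focal setting changes one-to-one with the axial travel are illustrative only and are not taken from the patent.

    def focal_setting_after_turn(current_focus_mm, turn_degrees, lead_mm_per_rev=2.0):
        # turn_degrees    -- positive for clockwise rotation (tube holder moves forward,
        #                    longer focal setting), negative for counterclockwise.
        # lead_mm_per_rev -- assumed axial travel of the tube holder per full revolution.
        axial_travel_mm = lead_mm_per_rev * (turn_degrees / 360.0)  # rotation converted to linear motion
        return current_focus_mm + axial_travel_mm

    # Example: a quarter turn clockwise with a 2 mm lead lengthens the focal setting by 0.5 mm.
    print(focal_setting_after_turn(10.0, 90))  # -> 10.5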
Although the present invention has been described together with the accompanying drawings, this is merely one example among the various embodiments embodying the gist of the present invention, its purpose being to enable a person of ordinary skill in the art to practice the invention easily, and it is clear that the present invention is not limited to the embodiment described above. Accordingly, the scope of protection of the present invention should be construed according to the following claims, and all technical ideas within a scope equivalent thereto through modification, substitution, or replacement without departing from the gist of the present invention fall within the scope of the rights of the present invention. It is also noted that some elements in the drawings are exaggerated or reduced relative to their actual size in order to describe the configuration more clearly.
<Description of reference numerals>
1: observation scope
  10: head portion
    110: light acquisition unit
    120: light emitting unit
    130: tube holder
      131: tube holder protrusion
    140: turn guide
      141: guide surface
    150: turn slider
      151: turn slider groove
    160: fixing part
    170: fixing part groove
    180: tube
  20: main body
    210: image sensor
    220: image processor unit
      221: signal conversion unit
      222: image correction unit
      223: image data generation unit
    230: memory unit
    240: display unit
    250: communication unit
    260: light source unit
    270: housing unit
    280: handle portion
  30: connection part
    310: pin
    320: locker
      321: locker groove
      322: locker outer wall
      323: locker inner wall

Claims (20)

  1. An observation scope 1 for photographing the interior of an object, the observation scope comprising:
    a head portion 10 including a light acquisition unit 110 that acquires light reflected from the interior of the object to generate an optical signal; and
    a main body 20 coupled to the head portion 10 and having a function of receiving the optical signal to generate image data,
    wherein the head portion 10 is separable from the main body 20 and has a function of adjusting a focal length.
  2. The observation scope according to claim 1, wherein the head portion 10 has a structure that converts rotational motion into linear motion, and the focal length is adjusted by the structure.
  3. The observation scope according to claim 1, wherein the main body 20 further comprises an image sensor 210 that receives the optical signal to generate an image signal, and an image processor unit 220 that processes the image signal to generate the image data.
  4. The observation scope according to claim 3, wherein the image processor unit 220 comprises an image correction unit 222 that receives the image signal and corrects a distorted image.
  5. The observation scope according to claim 1, wherein the head portion 10 further comprises a tube 180 that includes a lens and a plurality of optical fibers and is inserted into the interior of the object to transmit the reflected light to the light acquisition unit 110.
  6. The observation scope according to claim 1, wherein the main body 20 comprises a display unit 240 that displays the image data.
  7. The observation scope according to claim 6, wherein the display unit 240 is a fixed, detachable, or foldable display unit 240.
  8. The observation scope according to claim 1, wherein the main body 20 further comprises a communication unit 250 that transfers the image data to an external device and a memory unit 230 in which the image data is stored as a recording file upon a user's recording request, and wherein the user can download the recording file to a USB flash drive or a memory card connected to the communication unit 250.
  9. The observation scope according to claim 1, wherein the main body 20 comprises a light source unit 260 that generates light to be irradiated into the interior of the object.
  10. An observation scope system comprising:
    the observation scope 1 of claim 8; and
    a terminal that receives the image data from the communication unit 250 through a wired or wireless communication means,
    wherein the observation scope 1 and the terminal share the image data through mirroring by the wired or wireless communication means.
  11. The observation scope system according to claim 10, wherein the main body 20 further comprises a mounting portion on which the terminal is placed.
  12. An observation scope system comprising:
    the observation scope 1 of claim 1;
    a first terminal that shares the image data with the observation scope 1 through mirroring by a first communication means; and
    a second terminal that shares the image data with the first terminal through mirroring by a second communication means.
  13. An observation scope system comprising:
    the observation scope 1 of claim 8;
    a server that receives the image data from the communication unit 250 and stores it; and
    a terminal that receives the image data from the server through a wired or wireless communication means,
    wherein a remote doctor analyzes the image data delivered to the server or the terminal and provides an analysis result to a user of the observation scope 1.
  14. The observation scope system according to claim 13, wherein a virtual doctor analyzes the image data delivered to the server or the terminal and provides an analysis result to the user of the observation scope 1.
  15. The observation scope system according to claim 13 or 14, wherein the server or the terminal authenticates an ID of the user or an ID of the observation scope 1.
  16. A method of observing the interior of an object using the observation scope 1 of claim 1, the method comprising:
    (I) generating, by a light source unit 260, light to be irradiated into the interior of the object;
    (II) irradiating the light generated in step (I) into the interior of the object;
    (III) acquiring, by the light acquisition unit 110, the light reflected from the interior of the object to generate the optical signal;
    (IV) receiving, by an image sensor 210, the optical signal generated in step (III) to generate an image signal;
    (V) processing, by an image processor unit 220, the image signal generated in step (IV) to generate the image data; and
    (VI) displaying, by a display unit 240, the image data generated in step (V).
  17. The method according to claim 16, wherein step (V) comprises:
    (V-1) receiving, by an image correction unit 222, the image signal;
    (V-2) correcting, by the image correction unit 222, a distorted image to generate a corrected image; and
    (V-3) generating, by an image data generation unit 223, image data regarding the corrected image.
  18. The method according to claim 17, wherein step (V-2) further comprises:
    (V-2-1) performing, by the image correction unit 222, geometric correction of the distorted image using a distortion correction algorithm, wherein the distortion correction algorithm includes one or more of a focal length, a principal point, a radial distortion coefficient, and a tangential distortion coefficient.
  19. The method according to claim 18, wherein step (V-2) further comprises, after step (V-2-1):
    (V-2-2) extracting, by the image correction unit 222, vanishing point coordinates from the distorted image;
    (V-2-3) calculating, by the image correction unit 222, correction coefficients constituting a projection algorithm from the vanishing point coordinates; and
    (V-2-4) generating, by the image correction unit 222, the corrected image using the projection algorithm,
    wherein the projection algorithm is expressed by Equation 1 below.
    [Equation 1]
    [Equation 1 appears as an image (PCTKR2017003648-appb-I000009) in the original publication and is not reproduced in this text.]
    (x, y: distorted image coordinates; X, Y: corrected image coordinates; a, b: correction coefficients)
  20. The method according to claim 19, wherein the correction coefficients are expressed by Equation 2 below.
    [Equation 2]
    [Equation 2 appears as an image (PCTKR2017003648-appb-I000010) in the original publication and is not reproduced in this text.]
    (XP1: x coordinate of the first vanishing point when the first vanishing point is placed on the x axis; YP2: y coordinate of the second vanishing point when the second vanishing point is placed on the y axis)
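For context on claims 18 to 20: the geometric correction of step (V-2-1) matches the standard pinhole camera distortion model (focal length, principal point, radial and tangential coefficients), and steps (V-2-2) to (V-2-4) describe a perspective correction driven by vanishing points. The Python/OpenCV sketch below shows one conventional way to apply such a model followed by a homography-based perspective correction; because Equations 1 and 2 are only available as images here, the perspective_correct() mapping is a common alternative formulation and not a reconstruction of the claimed formula, and all parameter values are assumed.

    import cv2
    import numpy as np

    def undistort(image, fx, fy, cx, cy, k1, k2, p1, p2):
        # Standard camera model: focal length (fx, fy), principal point (cx, cy),
        # radial coefficients (k1, k2), tangential coefficients (p1, p2).
        camera_matrix = np.array([[fx, 0, cx],
                                  [0, fy, cy],
                                  [0,  0,  1]], dtype=np.float64)
        dist_coeffs = np.array([k1, k2, p1, p2], dtype=np.float64)
        return cv2.undistort(image, camera_matrix, dist_coeffs)

    def perspective_correct(image, corners, out_size=(640, 480)):
        # Map four observed corner points of the imaged plane (which determine its vanishing
        # points) onto an upright rectangle. This uses a homography, a common formulation;
        # it is NOT the patent's Equation 1.
        w, h = out_size
        src = np.array(corners, dtype=np.float32)                      # 4 points in the distorted image
        dst = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float32)
        homography = cv2.getPerspectiveTransform(src, dst)
        return cv2.warpPerspective(image, homography, out_size)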
PCT/KR2017/003648 2016-04-06 2017-04-03 Observation scope for remote medical examination, and image processing device and sysyem comprising same WO2017176021A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160042376A KR101789677B1 (en) 2016-04-06 2016-04-06 Observing device for telemedicine, Image processing device including the same and System having the same
KR10-2016-0042376 2016-04-06

Publications (1)

Publication Number Publication Date
WO2017176021A1 true WO2017176021A1 (en) 2017-10-12

Family

ID=60001570

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/003648 WO2017176021A1 (en) 2016-04-06 2017-04-03 Observation scope for remote medical examination, and image processing device and sysyem comprising same

Country Status (2)

Country Link
KR (1) KR101789677B1 (en)
WO (1) WO2017176021A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000139819A (en) * 1998-09-01 2000-05-23 Olympus Optical Co Ltd Endoscope
KR20040047258A (en) * 2002-11-29 2004-06-05 아람휴비스(주) Multipurpose Multi-Scope and Use Method Thereof
KR20040049036A (en) * 2002-12-03 2004-06-11 김영철 Medical instrument with Microscopic Camera
KR101023937B1 (en) * 2010-10-19 2011-03-21 주식회사메드스타 Detachable type medical image dianostic apparatus and method
KR101444485B1 (en) * 2006-11-16 2014-09-24 스트리커 코포레이션 Wireless endoscopic camera

Also Published As

Publication number Publication date
KR101789677B1 (en) 2017-10-25
KR20170114811A (en) 2017-10-16

Similar Documents

Publication Publication Date Title
CN104756093B (en) For mobile or network equipment interchangeable wireless sensing device
US7057639B2 (en) Intra-oral camera with integral display
JP6329715B1 (en) Endoscope system and endoscope
US8251893B2 (en) Device for displaying assistance information for surgical operation, method for displaying assistance information for surgical operation, and program for displaying assistance information for surgical operation
US20070038020A1 (en) Endoscope apparatus
US20240069327A1 (en) Circuit board assembly for a multiple viewing elements endoscope using cmos sensors
WO2017135564A1 (en) Electronic device, mobile terminal and control method thereof
WO2016072586A1 (en) Medical image processing apparatus and method
EP3682790A1 (en) Endoscope system
WO2005062253A1 (en) Axial rotation correction for in vivo images
WO2010058927A2 (en) Device for photographing face
WO2016006765A1 (en) X-ray device
WO2017176021A1 (en) Observation scope for remote medical examination, and image processing device and sysyem comprising same
JP6368886B1 (en) Endoscope system
WO2012015093A1 (en) Method and electronic device for remote diagnosis
US9832411B2 (en) Transmission system and processing device
US10149600B2 (en) Medical signal processing device and medical observation system
WO2019031894A1 (en) Electronic device providing stabilizer function
WO2014101306A1 (en) Wireless transmission control system and method based on telescope
JP6444506B2 (en) Image display control device, display device, and image display system
JP7264051B2 (en) Image processing device and image processing method
WO2015070491A1 (en) Binoculars-based wireless transmission control system and method
WO2012015094A1 (en) Method and electronic device for remote diagnosis
WO2018026127A1 (en) Drug injector for telemedicine, and image processing device and system comprising same
US11528395B2 (en) Camera head

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17779321

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17779321

Country of ref document: EP

Kind code of ref document: A1