WO2008054033A1 - A system of acquiring a capture image for virtual plastic - Google Patents

A system of acquiring a capture image for virtual plastic

Info

Publication number
WO2008054033A1
WO2008054033A1 (PCT/KR2006/004489)
Authority
WO
WIPO (PCT)
Prior art keywords
photographing
device driving
module
illumination
civ
Application number
PCT/KR2006/004489
Other languages
French (fr)
Inventor
Chang-Hwan Lee
Hyun-Jin Kim
Original Assignee
Maxuracy Co., Ltd.
Application filed by Maxuracy Co., Ltd.
Priority to PCT/KR2006/004489
Publication of WO2008054033A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras

Definitions

  • the image transmitting/receiving module 350, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the three-dimensional shape data creation managing tool 400, the capture camera 17 and the like via the information exchange gate 302.
  • the image transmitting/receiving module communicates with the capture camera 17 to receive/acquire an initial CIV captured by the camera and stores the acquired initial CIV in a temporary buffer 351. When a user confirms that the initial CIV has no specific problem, the module transmits the initial CIV stored in the temporary buffer 351 to the three-dimensional shape data creation managing tool 400, thereby enabling the tool to normally carry out a series of three-dimensional shape data creating processes without specific problems (refer to Fig. 5).
  • the image display module 380, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the monitor 102 and the like of the information processing device 100 via the information exchange gate 302, the interface module 106, the device connecting module 105 and the like. While communicating with the operating system 107, the image display module displays the initial CIV through the frame 502 of the user interface window 501 displayed in the monitor 102, as shown in Fig. 5.
  • the photographing position adjusting module 340, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the motor 35 for the support shaft of the photographing set 1 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like.
  • when a user selects a specific item 506 of the user interface window 501 so as to adjust the photographing position, the photographing position adjusting module immediately outputs a series of ascent or descent control signals to the motor 35 for the support shaft of the photographing set 1 (refer to Fig. 5).
  • since the motor 35 for the support shaft is fixedly arranged adjacent to the support shaft 31 of the photographing set 1, and the pinion 34 is engaged with the support shaft 31 via the link recesses 33 formed at a part of the support shaft 31, when a series of ascent or descent control signals are outputted by the photographing position adjusting module 340, the support shaft 31 engaged with the pinion 34 goes up and down, correspondingly to the rotation operation of the pinion 34. As a result, the capture camera 17 and the pattern generator 25, which are accommodated in the first case 20 and the second case 16, also naturally go up and down.
  • as the photographing position adjusting module 340 carries out the functions thereof, the user can easily control the going up and down operations of the capture camera 17 and the pattern generator 25 of the photographing set 1 with a very simple computation operation of selecting the specific item 506 of the user interface window 501, thereby flexibly adjusting the photographing position of the plastic operation target part (i.e., the initial CIV) which will be finally acquired by a further photographing process.
  • the illumination position adjusting module 310, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the motors 21 for the illumination devices of the photographing set 1 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like.
  • when a user selects a specific item 504 of the user interface window 501 so as to adjust the positions of the illumination devices 12, the illumination position adjusting module immediately outputs a series of control signals to the motors 21 for the illumination devices of the photographing set 1 (refer to Fig. 5).
  • the motors 21 for the illumination devices hold the third cases 11 moveably via the fixing shafts 22. Therefore, when a series of control signals are outputted by the illumination position adjusting module 310, the fixing shafts 22 connected to the motors 21 rapidly rotate, correspondingly to the rotation operation of the motors. As a result, the illumination devices 12, which are accommodated in the third cases 11, also naturally rotate within a predetermined angle range.
  • as the illumination position adjusting module 310 carries out the functions thereof, the user can easily control the positions of the illumination devices 12 of the photographing set 1 with a very simple computation operation of selecting the specific item 504 of the user interface window 501, thereby flexibly adjusting the color impression of the plastic operation target part (i.e., the initial CIV) which will be finally acquired by a further photographing process. A sketch of this adjustment is given below.
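The following is a minimal, hypothetical sketch of how the illumination position adjusting module 310 might turn a requested lamp angle into commands for the motors 21 for the illumination devices. The class and channel names, the steps-per-degree figure and the ±30 degree range are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the illumination position adjustment described above.
from dataclasses import dataclass

STEPS_PER_DEGREE = 10                        # assumed gearing of the motors 21 for the illumination devices
MIN_ANGLE_DEG, MAX_ANGLE_DEG = -30.0, 30.0   # assumed "predetermined angle range" of the third cases 11


@dataclass
class MotorChannel:
    """Stand-in for the connection-cable link to one illumination motor."""
    name: str

    def send_steps(self, steps: int) -> None:
        # A real implementation would write a vendor-specific command here.
        print(f"{self.name}: rotate {steps:+d} steps")


class IlluminationPositionAdjuster:
    """Mimics the role of the illumination position adjusting module 310."""

    def __init__(self, channels: list[MotorChannel]) -> None:
        self.channels = channels
        self.current_angle = 0.0

    def set_angle(self, requested_deg: float) -> float:
        # Clamp to the range the fixing shafts 22 are allowed to rotate through.
        target = max(MIN_ANGLE_DEG, min(MAX_ANGLE_DEG, requested_deg))
        steps = round((target - self.current_angle) * STEPS_PER_DEGREE)
        for channel in self.channels:
            channel.send_steps(steps)
        self.current_angle = target
        return target


if __name__ == "__main__":
    adjuster = IlluminationPositionAdjuster([MotorChannel("left lamp"), MotorChannel("right lamp")])
    adjuster.set_angle(15.0)    # e.g. the user drags item 504 of the user interface window
    adjuster.set_angle(50.0)    # an out-of-range request is clamped to 30 degrees
```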
  • the illumination brightness adjusting module 320, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, an illumination device driving board 26 of the photographing set 1 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like.
  • when a user selects a specific item 505 of the user interface window 501 so as to adjust the brightness of the illumination devices 12, the illumination brightness adjusting module immediately outputs a series of control signals to the illumination device driving board 26 of the photographing set 1 (refer to Fig. 5).
  • the illumination device driving board 26 rapidly carries out a series of processes for adjusting the brightness of the illumination devices (for example, a process of lowering the voltage inputted to the illumination devices) in accordance with the control signal of the illumination brightness adjusting module 320.
  • as the illumination brightness adjusting module 320 carries out the functions thereof, the user can easily control the brightness of the illumination devices 12 of the photographing set 1 with a very simple computation operation of selecting the specific item 505 of the user interface window 501, thereby flexibly adjusting the color impression of the plastic operation target part (i.e., the initial CIV) which will be finally acquired by a further photographing process, as sketched below.
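As a rough illustration of the brightness adjustment, the sketch below maps a 0-100 % setting from UI item 505 to a lamp drive voltage, mirroring the board's described behaviour of lowering the input voltage. The voltage range and the linear mapping are assumptions made for this example.

```python
# Minimal sketch of the brightness adjustment handled by the illumination device driving board 26.
MIN_VOLTAGE = 3.0    # assumed lowest drive voltage at which the lamps stay lit
MAX_VOLTAGE = 12.0   # assumed nominal drive voltage of the illumination devices 12


def brightness_to_voltage(brightness_percent: float) -> float:
    """Map a 0-100 % setting from UI item 505 to a lamp drive voltage."""
    level = max(0.0, min(100.0, brightness_percent)) / 100.0
    return MIN_VOLTAGE + level * (MAX_VOLTAGE - MIN_VOLTAGE)


def apply_brightness(brightness_percent: float) -> None:
    voltage = brightness_to_voltage(brightness_percent)
    # Stand-in for the control signal sent over the connection cable to the board.
    print(f"illumination device driving board 26 <- set drive voltage to {voltage:.2f} V")


if __name__ == "__main__":
    apply_brightness(100.0)   # full brightness
    apply_brightness(40.0)    # user dims the lamps after checking the initial CIV
```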
  • the camera exposure adjusting module 330, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the capture camera 17 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like. For example, when a user observes the initial CIV displayed through the frame 502 of the user interface window 501 and then selects a specific item 507 of the user interface window 501 so as to appropriately adjust the exposure degree of the capture camera 17, thereby determining a desired exposure adjustment value, the camera exposure adjusting module immediately outputs a series of control signals to the capture camera 17 (refer to Fig. 5).
  • the capture camera 17 rapidly carries out a series of processes for adjusting its exposure degree (for example, a process of adjusting the iris diaphragm of the capture camera) in accordance with the control signal of the camera exposure adjusting module 330.
  • as the camera exposure adjusting module 330 carries out the functions thereof, the user can easily control the exposure degree of the capture camera 17 of the photographing set 1 with a very simple computation operation of selecting the specific item 507 of the user interface window 501, thereby flexibly adjusting the color impression of the plastic operation target part (i.e., the initial CIV) which will be finally acquired by a further photographing process. A simplified example follows.
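The following sketch illustrates one plausible way the exposure adjustment could be driven from the user interface: each selection of item 507 nudges an exposure-compensation value that is clamped to the camera's supported range. The stop range and step size are assumptions; the patent only states that the iris diaphragm is adjusted.

```python
# Illustrative sketch of the exposure adjustment performed by the camera exposure adjusting module 330.
MIN_EV, MAX_EV = -3.0, 3.0     # assumed exposure-compensation range of the capture camera 17
STEP_EV = 0.5                  # assumed change applied per selection of UI item 507


class ExposureController:
    def __init__(self) -> None:
        self.exposure_ev = 0.0

    def nudge(self, direction: int) -> float:
        """direction = +1 to brighten, -1 to darken the next CIV."""
        self.exposure_ev = max(MIN_EV, min(MAX_EV, self.exposure_ev + direction * STEP_EV))
        # Stand-in for the control signal that actually moves the iris diaphragm.
        print(f"capture camera 17 <- exposure compensation {self.exposure_ev:+.1f} EV")
        return self.exposure_ev


if __name__ == "__main__":
    ctrl = ExposureController()
    ctrl.nudge(+1)   # the initial CIV in frame 502 looked too dark
    ctrl.nudge(+1)
    ctrl.nudge(-1)   # slight correction after reviewing the new image
```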
  • a photographing position check module 390, a brightness sensing module 370 and an optimum value calculating module 391 may be further provided to the device driving control tool 300, in addition to the above computation modules.
  • the brightness sensing module 370, which is controlled by the device driving controller 301, forms a series of signal exchange relations with each sensor S exposed to the outside of the photographing set via the information exchange gate, the device connecting module and the like, as shown in Fig. 6.
  • when the sensors S output a series of sensing signals (for example, sensing signals in which the surrounding brightness of the plastic operation target part is reflected), the brightness sensing module receives and processes the sensing signals to generate/acquire a series of brightness sensing values in which the surrounding brightness of the plastic operation target part is reflected, and transfers the acquired brightness sensing values to the optimum value calculating module 391.
  • the photographing position check module 390, which is controlled by the device driving controller 301, forms a series of communication relations with the image display module 380 via the device driving controller 301.
  • the photographing position check module scans the coordinates of the CIV in association with the image display module 380, thereby checking the position of the initial CIV.
  • in this way, the photographing position check module acquires a series of CIV position values in which the photographing position of the plastic operation target part is reflected and transfers the acquired CIV position values to the optimum value calculating module 391.
  • the optimum value calculating module 391, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the photographing position check module 390, the brightness sensing module 370 and the like.
  • the optimum value calculating module immediately compares the results (i.e., the brightness sensing value and the position value) acquired by the brightness sensing module and the photographing position check module with reference values (a reference brightness value and a reference position value) stored in a reference value storage sector 351, automatically calculates an optimum brightness correction value and an optimum photographing position correction value, and selectively transfers the calculated correction values to the illumination brightness adjusting module 320, the camera exposure adjusting module 330, the illumination position adjusting module 310 and the photographing position adjusting module 340.
  • the photographing position adjusting module 340, the illumination position adjusting module 310, the illumination brightness adjusting module 320 and the camera exposure adjusting module 330 subsequently carry out a process of transmitting the control information, in which the brightness correction value and the photographing position correction value are reflected, to each device of the photographing set 1 (a sketch of this comparison step is given below).
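A minimal sketch of this automatic correction path is given below: measured brightness and CIV position values are compared with stored reference values, and any deviation exceeding a tolerance is handed to the corresponding adjusting module. The thresholds, field names and the simple proportional rule are illustrative assumptions.

```python
# Sketch of the optimum value calculation: compare measurements with references,
# then dispatch correction values to the adjusting modules named in the text.
from dataclasses import dataclass


@dataclass
class Measurement:
    brightness: float        # e.g. mean sensor reading around the target part
    face_center_y: float     # vertical position of the target part in the initial CIV (pixels)


@dataclass
class References:
    brightness: float = 128.0     # assumed reference brightness value
    face_center_y: float = 240.0  # assumed reference position value (image centre line)
    tolerance: float = 5.0        # dead band below which no correction is issued


def calculate_corrections(measured: Measurement, ref: References) -> dict[str, float]:
    """Return correction values keyed by the module that should receive them."""
    corrections: dict[str, float] = {}

    brightness_error = ref.brightness - measured.brightness
    if abs(brightness_error) > ref.tolerance:
        # Positive value -> raise the lamp brightness / open the iris, negative -> lower it.
        corrections["illumination_brightness_adjusting_module_320"] = brightness_error

    position_error = ref.face_center_y - measured.face_center_y
    if abs(position_error) > ref.tolerance:
        # Positive value -> drive the support shaft up, negative -> drive it down.
        corrections["photographing_position_adjusting_module_340"] = position_error

    return corrections


if __name__ == "__main__":
    result = calculate_corrections(Measurement(brightness=96.0, face_center_y=290.0), References())
    for module, value in result.items():
        print(f"{module} <- correction {value:+.1f}")
```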
  • as a result, the support shaft 31 engaged with the pinion 34 of the motor 35 for the support shaft of the photographing set 1 additionally takes a series of correcting up and down operations, correspondingly to the rotation operation of the pinion 34.
  • the capture camera 17 and the pattern generator 25 that are accommodated in the first and second cases 20, 16 also additionally take a series of correcting up and down operations. Therefore, the user has the advantage that the photographing position of the plastic operation target part (i.e., the initial CIV) which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
  • likewise, the fixing shafts 22 connected to the motors 21 for the illumination devices of the photographing set 1 rapidly take a correcting rotation operation as the motors 21 are additionally operated.
  • the illumination devices 12 accommodated in the third cases 11 then take a series of correcting rotation operations within a predetermined angle range. Therefore, the user has the advantage that the color impression of the plastic operation target part (i.e., the initial CIV) which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
  • the illumination device driving board 26 also rapidly carries out a series of additional processes for correcting the brightness of the illumination devices 12, in accordance with the control signal of the illumination brightness adjusting module 320, so that the color impression of the plastic operation target part which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
  • the capture camera 17 likewise rapidly carries out a series of additional processes for correcting its exposure degree, in accordance with the control signal of the camera exposure adjusting module 330, so that the color impression of the plastic operation target part which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
  • the information processing device 100 is further provided with the three-dimensional shape data creation managing tool 400, in addition to the device driving control tool 300.
  • the three-dimensional shape data creation managing tool 400 comprises a three-dimensional shape data creation managing module 401 and a texture map creating module 470, a three-dimensional shape raw data creating module 410, a registration preprocessing module 420, a three-dimensional shape data registration processing module 430, a three-dimensional shape data merging processing module 440, and a three-dimensional shape data mapping processing module 460, which are collectively controlled by the three-dimensional shape data creation managing module 401 (the detailed operations of the device driving control tool are specifically described in Korean Patent Application No. 2005-87580 filed by the applicant; the structure of the device driving control tool is not limited to the above application and may be diversely modified depending on the situations).
  • the three-dimensional shape data creation managing module 401 selects/forms continuous signal exchange relations with the device driving control tool 300, the operating system 107 and the like via the information exchange gate 402.
  • the three-dimensional shape data creation managing module receives the CIV and collectively controls/progresses a series of processes for processing and creating the three-dimensional shape data based on the received CIV.
  • the texture map creating module 470, which is controlled by the three-dimensional shape data creation managing module 401, forms a series of communication relations with the device driving control tool 300 via the information exchange gate 402.
  • the texture map creating module rapidly progresses a process of mathematically processing the acquired CIV, for example finely dividing the space constituting the CIV and setting a series of coordinates to the divided pixels, thereby creating a source texture map corresponding to the CIV, as sketched below.
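As a rough sketch of the texture-map step, the code below divides a CIV into pixels and assigns each pixel a coordinate that the merged mesh can later use to look up its colour. The use of normalised (u, v) coordinates and NumPy is an assumption made for illustration; the description only states that coordinates are set for the divided pixels.

```python
# Minimal sketch of building a source texture map from a CIV.
import numpy as np


def build_texture_map(civ: np.ndarray) -> dict[str, np.ndarray]:
    """Return the colour image plus per-pixel (u, v) coordinates in [0, 1]."""
    height, width = civ.shape[:2]
    v, u = np.meshgrid(
        np.linspace(0.0, 1.0, height),
        np.linspace(0.0, 1.0, width),
        indexing="ij",
    )
    return {"colors": civ, "uv": np.stack([u, v], axis=-1)}


if __name__ == "__main__":
    fake_civ = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for a captured CIV
    texture = build_texture_map(fake_civ)
    print(texture["uv"].shape)        # (480, 640, 2): one (u, v) pair per pixel
    print(texture["uv"][240, 320])    # coordinate near the image centre, roughly (0.5, 0.5)
```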
  • the three-dimensional shape raw data creating module 410, which is controlled by the three-dimensional shape data creation managing module 401, forms a series of communication relations with the device driving control tool 300 via the information exchange gate 402.
  • the three-dimensional shape raw data creating module progresses a process of mathematically processing the acquired CIV (for example, a space coding process, a process of calculating/setting 3D coordinates, a process of meshing each image, and the like), thereby creating three-dimensional shape raw data in which a feature of the plastic operation target part is reflected, separately from the texture map.
  • the three-dimensional shape raw data creating module 410 may comprise a space coding section that finely divides a measurement space of the CIV into pixels and allots inherent space codes to each of the divided pixels, a 3D coordinates calculating section that converts the space codes of each pixel allotted by the space coding section into three-dimensional coordinates through a predetermined correction coefficient, and a meshing processing section that connects and meshes the respective pixels to which the three-dimensional coordinates are provided (see the sketch after this item).
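The sketch below walks through heavily simplified versions of the three sections just listed: combining thresholded pattern images into a per-pixel space code, converting the codes into 3D coordinates through an assumed linear correction coefficient, and meshing neighbouring pixels into triangles. Both the plain binary coding and the linear code-to-depth mapping are illustrative assumptions.

```python
# Simplified space coding -> 3D coordinates -> meshing pipeline.
import numpy as np


def space_code(pattern_images: list[np.ndarray], threshold: int = 128) -> np.ndarray:
    """Combine N thresholded pattern images into an integer space code per pixel."""
    codes = np.zeros(pattern_images[0].shape, dtype=np.int32)
    for bit, image in enumerate(pattern_images):
        codes |= (image > threshold).astype(np.int32) << bit
    return codes


def codes_to_points(codes: np.ndarray, depth_per_code: float = 1.5) -> np.ndarray:
    """Turn (row, col, code) into 3D points; the correction coefficient is assumed linear."""
    rows, cols = np.indices(codes.shape)
    z = codes * depth_per_code
    return np.stack([cols, rows, z], axis=-1).astype(np.float64)


def mesh_pixels(height: int, width: int) -> list[tuple[int, int, int]]:
    """Connect each 2x2 pixel block into two triangles (indices into the flattened grid)."""
    triangles = []
    for r in range(height - 1):
        for c in range(width - 1):
            a, b = r * width + c, r * width + c + 1
            d, e = (r + 1) * width + c, (r + 1) * width + c + 1
            triangles += [(a, b, d), (b, e, d)]
    return triangles


if __name__ == "__main__":
    patterns = [np.random.randint(0, 256, (4, 4), dtype=np.uint8) for _ in range(3)]
    points = codes_to_points(space_code(patterns))
    print(points.shape, len(mesh_pixels(4, 4)))   # (4, 4, 3) and 18 triangles
```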
  • the registration preprocessing module 420, which is controlled by the three-dimensional shape data creation managing module 401 together with the three-dimensional shape raw data creating module 410, systematically progresses processes of pre-processing the three-dimensional shape raw data, for example a process of removing the background of the three-dimensional shape raw data so that only the plastic operation target part is made into data (i.e., a process for extracting an object area), a process of subdividing and interpolating a space code image of the plastic operation target part (i.e., a process for interpolating a space code), a process of interpolating to recover the hollow space data of the plastic operation target part (i.e., a process for filling a hole), and a process of Gaussian-filtering the plastic operation target part to make the data distribution uniform (i.e., a process for smoothing the data).
  • when the user uses the information input/output devices connected to the information processing device 100, for example the mouse/keyboard 104 and the monitor 102, to progress a series of computation operations for registration-processing the three-dimensional shape raw data (for example, a process of marking corresponding points on the three-dimensional shape data and requesting the registration of the points), the three-dimensional shape data registration processing module 430, which is controlled by the three-dimensional shape data creation managing module 401, adjusts the mutual positions of the three-dimensional shape raw data in accordance with the computation operation of the user, thereby registering the three-dimensional shape raw data into single registered three-dimensional shape data.
  • the three-dimensional shape data merging processing module 440, which is controlled by the three-dimensional shape data creation managing module 401, sequentially progresses each processing step for merging the above registered three-dimensional shape data into merged three-dimensional shape data of a single object, for example a merging step, a combined area correcting step, a hole area correcting step, a polygon number correcting step, a mesh quality correcting step and the like, in accordance with merge processing basis information that is self-defined and provided for each registration type in advance, thereby automatically creating a series of merged three-dimensional shape data.
  • the three-dimensional shape data mapping processing module 460, which is controlled by the three-dimensional shape data creation managing module 401, immediately communicates with the texture map creating module 470 to acquire the created texture map and then maps the merged three-dimensional shape data onto the acquired texture map, thereby creating the final three-dimensional shape data (the overall chain of these modules is sketched below).
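To make the division of labour between the modules 410-470 easier to follow, the sketch below chains placeholder functions in the same order the description gives: raw data creation, preprocessing, registration, merging and texture mapping. The function names and the dictionary-based hand-off are assumptions for illustration only.

```python
# Sketch of how the modules 410-470 could be chained; each function is a placeholder.
def create_raw_data(civs):             # three-dimensional shape raw data creating module 410
    return [{"view": i, "points": civ} for i, civ in enumerate(civs)]

def preprocess(raw):                   # registration preprocessing module 420
    return [dict(r, cleaned=True) for r in raw]

def register(preprocessed):            # three-dimensional shape data registration processing module 430
    return {"registered_views": preprocessed}

def merge(registered):                 # three-dimensional shape data merging processing module 440
    return {"merged_mesh": registered["registered_views"]}

def map_texture(merged, texture_map):  # three-dimensional shape data mapping processing module 460
    return dict(merged, texture=texture_map)


def build_final_shape_data(civs, texture_map):
    """Run the full chain the description walks through, one stage per module."""
    return map_texture(merge(register(preprocess(create_raw_data(civs)))), texture_map)


if __name__ == "__main__":
    final = build_final_shape_data(civs=["front CIV", "left CIV", "right CIV"],
                                   texture_map="source texture map")
    print(sorted(final.keys()))   # ['merged_mesh', 'texture']
```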
  • the CIV, which is the basis for the three-dimensional shape data, maintains an optimized state in which various parameters such as the illumination degree at the surroundings of the plastic operation target part and the photographing position depending on the height of the customer are thoroughly considered, as the device driving control tool 300 carries out the functions thereof. Accordingly, when the invention is realized, the three-dimensional shape data finally formed by the computation modules of the three-dimensional shape data creation managing tool 400 naturally maintains an optimized quality as well. As a result, the user (customer, medical institution) has the advantage that the reliability of the overall virtual plastic operation is considerably improved.
  • the medical institution shows to the customer having a plastic operation plan the figure expected after the plastic operation on the plastic operation target part through a three-dimensional CIV editing process using a commercialized image editing tool, thereby enabling the plastic operation to be smoothly progressed under consultation with the customer.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Input (AREA)

Abstract

Disclosed is a system of acquiring a capture image for virtual plastic operation (CIV). According to the invention, a photographing set, in which a variety of devices for photographing a plastic operation target part, such as a photographing device and an illumination device, are packaged and disposed at proper locations, and computation modules capable of flexibly controlling the movement of the photographing set itself as well as the states of the photographing device and the illumination device depending on the photographing environment, are systematically linked, so that a plastic-operation-related medical institution can easily acquire an optimized CIV, in which the relevant parameters are thoroughly considered, with a simple computation operation and without a separate complex process.

Description

A SYSTEM OF ACQUIRING A CAPTURE IMAGE FOR VIRTUAL PLASTIC
Technical Field
[1] The invention relates to a system of acquiring a capture image for virtual plastic operation (hereinafter abbreviated to "CIV"), and more particularly, to a CIV acquiring system in which a photographing set and a computation module are systematically linked. In the photographing set, a variety of devices for photographing a plastic operation target part, such as a photographing device and an illumination device, are packaged and disposed at proper locations; the computation module can flexibly control the movement of the photographing set itself, as well as the states of the photographing device and the illumination device disposed in the photographing set, depending on the photographing environment (for example, the surrounding illumination degree, the height of a customer and the like). A plastic-operation-related medical institution can thereby easily acquire an optimized CIV, in which various parameters such as the illumination degree at the surroundings of the plastic operation target part and the photographing position depending on the height of the customer are thoroughly considered, with a simple computation operation and without a separate complex process, considerably improving the reliability of the virtual plastic operation without causing difficulty to the customer or the medical institution.
Background Art
[2] In recent years, as living conditions have become more prosperous, social interest in plastic operations has increased. Following this social trend, virtual plastic operation technology has also developed rapidly.
[3] According to the conventional virtual plastic operation technology, a plastic-operation-related medical institution (for example, a plastic surgery clinic) takes a photograph of a CIV corresponding to a plastic operation target part of a customer (for example, the face) with a capture camera, a pattern generator and the like, creates three-dimensional shape data based on the CIV, and virtually shows the customer having a plastic operation plan the figure expected after the plastic operation on the plastic operation target part through a shape data editing process using a commercialized image editing tool, thereby guiding a series of plastic operation processes to be smoothly progressed under consultation with the customer.
[4] According to the conventional technology, since the CIV created by the capture camera, the pattern generator and the like is used as the basis image of the finally created three-dimensional shape data for the virtual plastic operation, the initial process of taking a photograph of the plastic operation target part of the customer is very important in the overall virtual plastic operation process.
[5] However, despite the above situation, plastic-operation-related medical institutions have disposed the photographing devices such as the capture camera, the pattern generator and the like on the basis of experience alone, and have progressed a series of photographing processes without systematic consideration of various parameters such as the illumination degree at the surroundings of the plastic operation target part and the photographing position depending on the height of the customer. Accordingly, it is difficult to acquire an optimized CIV unless a separate measure is taken. As a result, the finally created three-dimensional shape data has a low-grade quality, and the customer and the medical institution have to put up with a considerably decreased reliability of the virtual plastic operation.
Disclosure of Invention
Technical Problem
[6] The invention has been made to solve the above problems occurring in the prior art.
An object of the invention is to systematically link a photographing set, in which a variety of devices for photographing a plastic operation target part, such as a photographing device and an illumination device, are packaged and disposed at proper locations, with computation modules capable of flexibly controlling the movement of the photographing set itself as well as the states of the photographing device and the illumination device disposed in the photographing set depending on the photographing environment (for example, the surrounding illumination degree, the height of a customer and the like), so that a plastic-operation-related medical institution can easily acquire an optimized CIV, in which various parameters such as the illumination degree at the surroundings of the plastic operation target part and the photographing position depending on the height of the customer are thoroughly considered, with a simple computation operation and without a separate complex process, thereby considerably improving the reliability of the virtual plastic operation without causing difficulty to the customer or the medical institution.
Technical Solution
[7] In order to achieve the above object, there is provided a CIV acquiring system comprising: a photographing set that has a capture camera, a pattern generator and an illumination device and link-operates the capture camera, the pattern generator and the illumination device to generate a CIV having reflected a plastic operation target part while going up and down depending on an outside computation control, the capture camera, the pattern generator and the illumination device being provided to one place and dispersion-disposed depending on functions thereof; and a device driving control tool that belongs to an information processing device electrically connected to the photographing set and selectively communicates with the photographing set to control driving states of the capture camera, the pattern generator and the illumination device and the up and down states of the photographing set, correspondingly to photographing environments.
Brief Description of the Drawings
[8] The above and other objects, features and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
[9] FIG. 1 conceptually shows a CIV acquiring system according to an embodiment of the invention;
[10] FIG. 2 conceptually shows an installation aspect of a device driving control tool according to an embodiment of the invention;
[11] FIG. 3 conceptually shows a detailed structure of a device driving control tool according to an embodiment of the invention;
[12] FIG. 4 conceptually shows a display state of a user interface window according to an embodiment of the invention;
[13] FIG. 5 conceptually shows a device driving control process of a device driving control tool according to an embodiment of the invention;
[14] FIG. 6 conceptually shows a device driving control process of a device driving control tool according to another embodiment of the invention; and
[15] FIG. 7 conceptually shows a three-dimensional shape data creation managing tool according to an embodiment of the invention.
Mode for the Invention
[16] Hereinafter, a preferred embodiment of the present invention will be described with reference to the accompanying drawings. In the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear.
[18] As shown in Fig. 1, a CIV acquiring system 200 according to an embodiment of the invention comprises a photographing set 1 and a device driving control tool 300 that belongs to a main body 101 of an information processing device 100 such as notebook computer, desktop computer and the like and is electrically connected to the photographing set 1 through various electrical connection cables 2, 3, 4, 5, 103.
[19] At this time, the photographing set 1 has a variety of devices such as a capture camera 17, a pattern generator 25, an illumination device 12 and a mirror 14 that are provided at one place and dispersion-mounted depending on functions thereof, and goes up and down in accordance with an outside computation control signal, for example a control signal outputted from the device driving control tool 300 of the information processing device 100, thereby appropriately link-operating the capture camera 17, the pattern generator 25 and the illumination device 12 to generate a series of CIV having reflected a plastic operation target part of a customer participating in a virtual plastic operation.
[20] As shown in Fig. 2, the device driving control tool 300 communicates with an operating system 107, a device connecting module 105 and the like of the information processing device 100 via an interface module 106, for example selectively communicates with the photographing set 1 in accordance with a computation operation of a user (medical institution, customer) using the information processing device 100, thereby flexibly controlling driving states of the capture camera 17, the pattern generator 25 and the illumination device 12 and the up and down states of the photographing set 1 , correspondingly to a series of photographing environments at the surroundings of a customer, such as illumination degree at the surroundings of a plastic operation target part and a photographing position depending on a height of a customer, for example.
[21] At this time, the pattern generator 25 provided to the photographing set 1 has such a structure that it is opened toward the plastic operation target object, and serves to sequentially transmit (project) a series of pattern image signals to the plastic operation target part in accordance with a control signal outputted from the device driving control tool 300. The capture camera 17 is likewise opened toward the plastic operation target object. When the pattern image signals are transmitted (projected) to the plastic operation target part by the pattern generator 25, the capture camera takes a photograph of the corresponding pattern image together with the plastic operation target part under control of the device driving control tool 300, thereby creating, in real time, a series of CIVs containing a figure of the plastic operation target part. The sketch below illustrates this projection/capture sequencing.
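A minimal sketch of that sequencing, assuming a simple projector/camera pair and a fixed settle delay, is given below. The Projector and Camera classes are stand-ins; no real device API from the patent is implied.

```python
# Illustrative projection/capture loop: project each pattern, then grab one frame per pattern.
import time


class Projector:                       # stand-in for the pattern generator 25
    def project(self, pattern_id: int) -> None:
        print(f"pattern generator: projecting pattern {pattern_id}")


class Camera:                          # stand-in for the capture camera 17
    def grab_frame(self, label: str) -> str:
        return f"frame<{label}>"


def capture_civ_series(projector: Projector, camera: Camera,
                       pattern_count: int, settle_time_s: float = 0.05) -> list[str]:
    """Project each pattern in turn and capture one frame containing it."""
    frames = []
    for pattern_id in range(pattern_count):
        projector.project(pattern_id)
        time.sleep(settle_time_s)               # let the projected pattern stabilise (assumed delay)
        frames.append(camera.grab_frame(f"pattern {pattern_id}"))
    return frames


if __name__ == "__main__":
    civ_series = capture_civ_series(Projector(), Camera(), pattern_count=4)
    print(civ_series)
```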
[22] Herein, as shown in Fig. 1, the photographing set 1 has a support shaft 31 and first to third cases 20, 16, 11 supported by the support shaft 31.
[23] At this time, the support shaft 31 is fixed to a base plate 37 while being inserted into a guide frame 32. The base plate 37 supporting the guide frame 32 is flexibly connected to wheels 40 via links 39 and wheel connecting shafts 38. Therefore, when a user (customer, medical institution) pushes the photographing set 1 in any direction, correspondingly to the circumstances of the user, the photographing set 1 including the support shaft 31 moves in that direction, correspondingly to the applied force.
[24] In this case, the guide frame 32 is firmly fixed to the base plate 37 through bolts 36, so that it can stably maintain a normal upright structure even when the support shaft 31 inserted therein supports the first to third cases 20, 16, 11.
[25] Under such a structure, a motor 35 for the support shaft, which is signal-connected to the device driving control tool 300 via the connection cable 5 and is fixed adjacent to the support shaft 31, and a pinion 34, which is movably fixed to the motor 35 for the support shaft while being engaged with the support shaft 31 via link recesses 33 formed at a part of the support shaft 31, are additionally disposed in the guide frame 32.
[26] Under such structure that the motor 35 for support shaft and the pinion 34 are additionally disposed, when a series of control signals are outputted from the device driving control tool 300 and the motor 35 for support shaft is correspondingly driven selectively, the pinion 34 connected to the motor 35 can rapidly rotate as the motor 35 is driven. As a result, the support shaft 31 engaged with the pinion 34 can also take a series of going up and down operations, correspondingly to the rotation operation of the pinion 34.
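For orientation, the short example below works out the rack-and-pinion relation implied here: the vertical travel of the support shaft 31 equals the pinion's pitch circumference multiplied by the number of motor revolutions. The pitch radius and motor step count are illustrative assumptions, not values taken from the patent.

```python
# Worked example of the rack-and-pinion relation between motor rotation and shaft travel.
import math

PINION_PITCH_RADIUS_MM = 15.0      # assumed pitch radius of the pinion 34
STEPS_PER_REVOLUTION = 200         # assumed step count of the motor 35 for the support shaft


def travel_mm(motor_steps: int) -> float:
    """Vertical travel of the support shaft for a given number of motor steps."""
    revolutions = motor_steps / STEPS_PER_REVOLUTION
    return revolutions * 2.0 * math.pi * PINION_PITCH_RADIUS_MM


def steps_for_travel(target_mm: float) -> int:
    """Inverse relation: steps needed to raise (or lower) the shaft by target_mm."""
    return round(target_mm * STEPS_PER_REVOLUTION / (2.0 * math.pi * PINION_PITCH_RADIUS_MM))


if __name__ == "__main__":
    print(f"{travel_mm(400):.1f} mm for 400 steps")         # about 188.5 mm
    print(f"{steps_for_travel(50.0)} steps to move 50 mm")  # about 106 steps
```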
[27] With the support shaft disposed as described above, the first case 20 forms a structure that is fixed to an upper end of the support shaft 31 and accommodates the pattern generator 25 therein (the pattern generator may be replaced with a capture camera, depending on the situation). At this time, since the first case 20 is fixed to the upper end of the support shaft 31, when the support shaft 31 goes up and down as the motor 35 and the pinion 34 are jointly operated, the first case 20 can also go up and down rapidly, correspondingly to the going up and down operation of the support shaft 31.
[28] In addition, the second case 16 is put on the first case 20 to form a fixed structure and accommodates the capture camera 17 therein, for example (the capture camera may be replaced with the pattern generator, depending on the situations). At this time, since the second case 16 is also fixed to the upper end of the support shaft 31 via the first case 20, when the support shaft 31 goes up and down as the motor 35 and the pinion 34 are jointly operated, the second case 16 can also go up and down rapidly, correspondingly to the going up and down operation of the support shaft 31.
[29] Moreover, the third cases 11 are uprightly put on the first case 20 at both sides of the second case 16 and stably accommodate the illumination devices 12 therein, respectively. At this time, since the third cases 11 are also fixed to the upper end of the support shaft 31 via the first case 20, when the support shaft 31 goes up and down as the motor 35 and the pinion 34 are jointly operated, the third cases 11 can also go up and down rapidly, correspondingly to the going up and down operation of the support shaft 31.
[30] Here, the first case 20 accommodates motors 21 for illumination device that are signal-connected to the device driving control tool 300 via the connection cable 4 and fix the third cases 11 to be moveable via fixing shafts 22, respectively. In this situation, when a series of control signals are outputted from the device driving control tool 300 and the motors 21 for illumination device are thus driven, the fixing shafts 22 are correspondingly rotated rapidly. As a result, the third cases 11 fixed by the fixing shafts 22 can flexibly rotate within a predetermined angle range, correspondingly to the rotation of the fixing shafts 22.
[31] Since the motors 21 for the illumination devices rest in support frames 22 that are firmly fixed by bolts 23 in the first case 20, they can maintain the normal accommodation structure without specific movement, even when a predetermined shock is applied from the outside.
[32] In the mean time, as shown in Fig. 1, a fourth case 18 and a mirror frame 13 are further disposed on the first case 20, in addition to the cases as described above. In this case, the fourth case 18 and the mirror frame 13 accommodate a pattern generator controlling board 19 and a mirror 14, respectively (the pattern generator controlling board may be directly received in the first case without disposing the fourth case, depending on the situations).
[33] Under such structure, the pattern generator controlling board 19 disposed in the fourth case 18 is signal-connected to the device driving control tool 300 via the connection cable 3 and precisely controls the pattern generator 25 based on control information (for example, control information about the output timing of pattern image signals, control information about the output number of pattern image signals, control information about the types of pattern image signals and the like) transmitted from the device driving control tool 300, thereby enabling the pattern generator 25 to normally carry out the function of "sequentially transmitting (projecting) pattern image signals to a plastic operation target part" without specific problems. The mirror 14 disposed in the mirror frame 13 is moveably fixed to and supported by the mirror frame 13 via a link shaft 15. When a user (for example, a customer participating in a virtual plastic operation) applies force of a predetermined magnitude, the mirror is rotated up and down, correspondingly to the applied force, thereby enabling the customer to adjust his or her own photographing position and check his or her appearance without specific problems.
[34] Meanwhile, as shown in Fig. 3, the device driving control tool 300 controlling the operation of the photographing set 1 comprises a device driving controller 301 and a photographing device driving management module 360, an image transmitting/receiving module 350, an image display module 380, an illumination brightness adjusting module 320, a camera exposure adjusting module 330, an illumination position adjusting module 310 and a photographing position adjusting module 340, which are collectively controlled by the device driving controller 301.
[35] In this case, the device driving controller 301 forms a series of signal exchange relations with the photographing set 1, the operating system 107, the device connecting module 105 (Fig. 2) and the like via an information exchange gate 302. When a computation event occurs at the information processing device 100, the device driving controller displays a user interface window 501 as shown in Fig. 4 in a monitor 102 of the information processing device 100. Then, it collectively controls the capture camera 17, the pattern generator 25, the illumination device 12, the photographing set 1 and the like in accordance with the result of a computation event at a user (for example, medical institution) based on the user interface window 501.
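By way of illustration only, the following Python sketch shows one way such a controller could route computation events from the user interface window 501 to the adjusting modules it collectively controls. The class and method names are hypothetical and are not taken from the patent.

```python
# Minimal sketch (not the actual implementation) of a device driving
# controller dispatching user-interface events to its adjusting modules.
class DeviceDrivingController:
    def __init__(self, modules):
        # modules: dict mapping UI item ids to module callables,
        # e.g. {"item_506": photographing_position_adjuster.adjust, ...}
        self.modules = modules

    def on_ui_event(self, item_id, value):
        """Route a computation event from user interface window 501
        to the module registered for that setting item."""
        handler = self.modules.get(item_id)
        if handler is None:
            raise KeyError(f"no module registered for UI item {item_id}")
        return handler(value)
```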
[36] Under such situations, the photographing device driving management module 360, which is controlled by the device driving controller 301, forms a series of signal exchange relations with a three-dimensional shape data creation managing tool 400 (shown in Fig. 2), the capture camera 17, the pattern generator controlling board 19 and the like via the information exchange gate 302, the device connecting module 105 and the like. When a user selects a photographing start item 503 of the user interface window 501, for example, the photographing device driving management module notifies a series of control information indicating whether or not photographing is to start, together with detailed setting items (for example, control information about a driving start time point of the capture camera, control information about an image reception of the capture camera, control information about an output time point of a pattern video signal, control information about an output number of a pattern video signal, control information about a type of a pattern video signal and the like), to the capture camera 17, the pattern generator controlling board 19 and the like in association with the three-dimensional shape data creation managing tool 400, thereby enabling the capture camera 17 and the pattern generator controlling board 19 to normally carry out the operations given to them (image capturing operation, pattern generating operation and the like).
[37] In addition, the image transmitting/receiving module 350, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the three-dimensional shape data creation managing tool 400, the capture camera 17 and the like via the information exchange gate 302. When a series of photographing processes are carried out by the photographing device driving management module 360, the image transmitting/receiving module communicates with the capture camera 17 to receive and acquire an initial CIV captured by the camera and stores the acquired initial CIV in a temporary buffer 351. When a user confirms that the initial CIV has no specific problem, it transmits the initial CIV, which is stored in the temporary buffer 351, to the three-dimensional shape data creation managing tool 400, thereby enabling the three-dimensional shape data creation managing tool 400 to normally carry out a series of three-dimensional shape data creating processes without specific problem (refer to Fig. 5).
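A minimal sketch, under assumed interfaces, of the buffering behaviour described above: the initial CIV is held in a temporary buffer and is only forwarded to the three-dimensional shape data creation managing tool after the user confirms it. The camera and shape-data-tool objects and their methods are hypothetical stand-ins.

```python
# Sketch of the "buffer, confirm, then forward" flow described for the
# image transmitting/receiving module 350.  All interfaces are assumed.
class ImageTransceiver:
    def __init__(self, camera, shape_data_tool):
        self.camera = camera                  # source of captured frames
        self.shape_data_tool = shape_data_tool
        self.temporary_buffer = None          # corresponds to buffer 351

    def receive_initial_civ(self):
        # Pull the freshly captured image from the capture camera.
        self.temporary_buffer = self.camera.capture()   # hypothetical call
        return self.temporary_buffer          # shown to the user for review

    def forward_on_confirmation(self, confirmed: bool):
        # Only after the user confirms the preview is the CIV handed over.
        if confirmed and self.temporary_buffer is not None:
            self.shape_data_tool.create_shape_data(self.temporary_buffer)
```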
[38] Furthermore, the image display module 380, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the monitor 102 and the like of the information processing device 100 via the information exchange gate 302, the interface module 106, the device connecting module 105 and the like. When an initial CIV is stored in the temporary buffer 351 by the image transmitting/receiving module 350, the image display module displays the initial CIV through the frame 502 of the user interface window 501 displayed in the monitor 102 while communicating with the operating system 107, as shown in Fig. 5.
[39] When the initial CIV is displayed, a user can flexibly observe its states (for example, brightness state, photographing position state and the like) with the naked eye before the process of creating the three-dimensional shape data is progressed in earnest.
[40] Under such circumstances, the photographing position adjusting module 340, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the motor 35 for support shaft of the photographing set 1 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like. When a user observes the initial CIV displayed through the frame 502 of the user interface window 501 and then selects a specific item 506 of the user interface window 501 so as to appropriately adjust a photographing position of the plastic operation target part, thereby determining a desired position adjustment value, the photographing position adjusting module immediately outputs a series of ascent or descent control signals to the motor 35 for support shaft of the photographing set 1 (refer to Fig. 5).
[41] As described above, the motor 35 for support shaft is fixedly arranged adjacent to the support shaft 31 of the photographing set 1, and the pinion 34 engaged with the support shaft 31 via the link recesses 33 formed at a part of the support shaft 31 is additionally arranged to the motor 35 for support shaft. Therefore, when a series of ascent or descent control signals are outputted by the photographing position adjusting module 340, the support shaft 31 engaged with the pinion 34 goes up and down, correspondingly to the rotation operation of the pinion 34. As a result, the pattern generator 25 and the capture camera 17, which are accommodated in the first case 20 and the second case 16, respectively, also naturally go up and down. Accordingly, while the photographing position adjusting module 340 carries out the functions thereof, the user can easily control the going up and down operations of the capture camera 17 and the pattern generator 25 of the photographing set 1 with a very simple computation operation of selecting the specific item 506 of the user interface window 501, thereby flexibly adjusting the photographing position of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process.
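As an illustration of the ascent/descent control described above, the following sketch converts a user-selected position adjustment value into drive commands for the support-shaft motor 35; the step resolution and the motor interface are assumptions, not details given in the patent.

```python
# Sketch only: turning a position adjustment value (UI item 506) into
# ascent/descent commands for the support-shaft motor.  MM_PER_STEP and
# motor.rotate() are hypothetical.
MM_PER_STEP = 0.1  # assumed travel per motor step along link recesses 33

def adjust_photographing_position(motor, delta_mm: float):
    """Positive delta_mm raises the photographing set, negative lowers it."""
    steps = round(delta_mm / MM_PER_STEP)
    direction = "up" if steps >= 0 else "down"
    motor.rotate(direction=direction, steps=abs(steps))  # hypothetical call
```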
[42] In addition, the illumination position adjusting module 310, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the motors 21 for illumination device of the photographing set 1 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like. For example, when a user observes the initial CIV displayed through the frame 502 of the user interface window 501 and then selects a specific item 504 of the user interface window 501 so as to appropriately adjust an illumination position of the plastic operation target part, thereby determining a desired illumination adjustment value, the illumination position adjusting module immediately outputs a series of control signals to the motors 21 for illumination device of the photographing set 1 (refer to Fig. 5).
[43] As described above, the motors 21 for illumination device fix the third cases 11 to be moveable via the fixing shafts 22. Therefore, when a series of control signals are outputted by the illumination position adjusting module 310, the fixing shafts 22 connected to the motors 21 for illumination device rapidly rotate, correspondingly to the rotation operation of the motors 21 for illumination device. As a result, the illumination devices 12, which are accommodated in the third cases 11, also naturally rotate within a predetermined angle range. Accordingly, while the illumination position adjusting module 310 carries out the functions thereof, the user can easily control the positions of the illumination devices 12 of the photographing set 1 with a very simple computation operation of selecting the specific item 504 of the user interface window 501, thereby flexibly adjusting the color impression of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process.
[44] In addition, the illumination brightness adjusting module 320, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, an illumination device driving board 26 of the photographing set 1 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like. For example, when a user observes the initial CIV displayed through the frame 502 of the user interface window 501 and then selects a specific item 505 of the user interface window 501 so as to appropriately adjust illumination brightness of the plastic operation target part, thereby determining a desired illumination brightness adjustment value, the illumination brightness adjusting module immediately outputs a series of control signals to the illumination device driving board 26 of the photographing set 1 (refer to Fig. 5).
[45] Under such circumstances, the illumination device driving board 26 rapidly carries out a series of processes for adjusting the brightness of the illumination device (for example, a process of lowering a voltage to be inputted to the illumination device) in accordance with the control signal of the illumination brightness adjusting module 320. As a result, while the illumination brightness adjusting module 320 carries out the functions thereof, the user can easily control the brightness of the illumination devices 12 of the photographing set 1 with a very simple computation operation of selecting the specific item 505 of the user interface window 501, thereby flexibly adjusting the color impression of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process.
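A minimal sketch of the brightness-adjustment idea mentioned above (lowering the voltage supplied to the illumination devices 12). The voltage range and the driving-board method are assumptions.

```python
# Sketch only: mapping a 0-100 % brightness setting (UI item 505) to a
# drive voltage for the illumination device driving board 26.
V_MIN, V_MAX = 3.0, 12.0   # assumed supply range of the illumination devices

def brightness_to_voltage(brightness_percent: float) -> float:
    """Map a 0-100 % brightness setting to a drive voltage."""
    brightness_percent = max(0.0, min(100.0, brightness_percent))
    return V_MIN + (V_MAX - V_MIN) * brightness_percent / 100.0

def apply_brightness(driving_board, brightness_percent: float):
    # set_output_voltage() is a hypothetical method of the driving board.
    driving_board.set_output_voltage(brightness_to_voltage(brightness_percent))
```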
[46] Further, the camera exposure adjusting module 330, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the operating system 107, the capture camera 17 and the like via the information exchange gate 302, the device connecting module 105, the interface module 106 and the like. For example, when a user observes the initial CIV displayed through the frame 502 of the user interface window 501 and then selects a specific item 507 of the user interface window 501 so as to appropriately adjust an exposure degree of the capture camera 17, thereby determining a desired exposure adjustment value, the camera exposure adjusting module immediately outputs a series of control signals to the capture camera 17 (refer to Fig. 5).
[47] Under such circumstances, the capture camera 17 rapidly carries out a series of processes for adjusting an exposure degree thereof (for example, a process of adjusting an iris diaphragm of the capture camera) in accordance with the control signal of the camera exposure adjusting module 330. As a result, while the camera exposure adjusting module 330 carries out the functions thereof, the user can easily control the exposure degree of the capture camera 17 of the photographing set 1 with a very simple computation operation of selecting the specific item 507 of the user interface window 501, thereby flexibly adjusting the color impression of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process.
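Similarly, an exposure adjustment value chosen through item 507 could be mapped to an iris setting of the capture camera 17, as in the hedged sketch below; the available f-stops and the set_iris call are assumptions.

```python
# Sketch only: selecting an assumed iris stop from an exposure index.
F_STOPS = [1.8, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0]   # assumed available stops

def apply_exposure(camera, exposure_index: int):
    """Index 0 selects the widest (brightest) assumed aperture."""
    exposure_index = max(0, min(len(F_STOPS) - 1, exposure_index))
    camera.set_iris(F_STOPS[exposure_index])       # hypothetical camera call
```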
[48] Like this, according to the invention, there are provided "an environment in which a variety of devices for photographing a plastic operation target part, such as a photographing device, an illumination device and the like, are packaged and provided" and "an environment in which a movement of the photographing set itself and the photographing device and the illumination device disposed in the photographing set can be flexibly controlled depending on photographing environments (for example, surrounding illumination degree, height of a customer and the like)". Under these environments, a user (for example, a plastic operation related medical institution or customer) can easily acquire an optimized CIV, in which various parameters such as the illumination degree at the surroundings of a plastic operation target part and a photographing position depending on a height of a customer are thoroughly considered, with a simple computation operation and without a separate complex process, thereby considerably improving the reliability of the virtual plastic operation without causing a difficulty to the customer and the medical institution.
[49] In the mean time, as shown in Fig. 3, according to another embodiment of the invention, a photographing position check module 390, a brightness sensing module 370 and an optimum value calculating module 391 may be further provided to the device driving control tool 300, in addition to the above computation modules.
[50] In this case, the brightness sensing module 370, which is controlled by the device driving controller 301, forms a series of signal exchange relations with each sensor S exposed to the outside of the photographing set via the information exchange gate, the device connecting module and the like, as shown in Fig. 6. When a series of sensing signals (for example, sensing signals in which the surrounding brightness of the plastic operation target part is reflected) are outputted from the sensors S, the brightness sensing module receives and processes the received sensing signals to generate/acquire a series of brightness sensing values in which the surrounding brightness of the plastic operation target part is reflected and transfers the acquired brightness sensing values to the optimum value calculating module 391.
[51] In addition, the photographing position check module 390, which is controlled by the device driving controller 301, forms a series of communication relations with the image display module 380 via the device driving controller 301. When the initial CIV is displayed in the frame 502 of the user interface window 501, the photographing position check module scans coordinates of the CIV in association with the image display module 380, thereby checking the position of the initial CIV. Depending on the check result, the photographing position check module acquires a series of CIV position values in which the photographing positions of the plastic operation target part are reflected and transfers the acquired CIV position values to the optimum value calculating module 391.
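A minimal sketch, assuming the initial CIV is available as a grayscale numpy array, of the kind of coordinate scanning the photographing position check module 390 could perform to derive a CIV position value; the crude foreground/background threshold is an assumption, not a detail from the patent.

```python
# Sketch only: locate the subject in the frame and report its vertical
# position so it can later be compared with a reference position value.
import numpy as np

def civ_position_value(civ: np.ndarray, background_threshold: int = 30) -> float:
    """Return the subject's vertical centre as a fraction of image height."""
    mask = civ > background_threshold          # crude foreground/background split
    rows = np.where(mask.any(axis=1))[0]       # rows containing subject pixels
    if rows.size == 0:
        return 0.5                             # nothing detected; assume centred
    centre_row = (rows[0] + rows[-1]) / 2.0
    return centre_row / civ.shape[0]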
[52] Furthermore, the optimum value calculating module 391, which is controlled by the device driving controller 301, forms a series of signal exchange relations with the photographing position check module 390, the brightness sensing module 370 and the like. When a user (customer or medical institution) observes the initial CIV displayed in the frame 502 of the user interface window 501 and then selects an automatic control item 508 of the user interface window 501 so as to automatically correct the illumination state of the plastic operation target part, the position of the capture camera, the position of the illumination device and the like without separate manual operations, the optimum value calculating module immediately compares the result (i.e., brightness sensing value, position value) acquired by the brightness sensing module and the photographing position check module with reference values (reference brightness value, reference position value) stored in a reference value storage sector 351, automatically calculates an optimum brightness correction value and photographing position correction value, and selectively transfers the calculated brightness correction value and photographing position correction value to the illumination brightness adjusting module 320, the camera exposure adjusting module 330, the illumination position adjusting module 310 and the photographing position adjusting module 340. Upon carrying out the processes, the photographing position adjusting module 340, the illumination position adjusting module 310, the illumination brightness adjusting module 320 and the camera exposure adjusting module 330 subsequently carry out a process of transmitting the control information in which the brightness correction value and photographing position correction value are reflected to each device of the photographing set 1.
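The comparison performed by the optimum value calculating module 391 can be pictured as in the sketch below, where sensed brightness and position values are compared with stored reference values and the resulting corrections are dispatched to the adjusting modules; the scaling convention and the module methods are hypothetical.

```python
# Sketch only: deriving correction values from sensed vs. reference values
# and dispatching them, mirroring the automatic control described above.
def calculate_corrections(brightness_sensed, position_sensed,
                          reference_brightness, reference_position):
    # Positive corrections mean "increase brightness" / "raise the camera".
    brightness_correction = reference_brightness - brightness_sensed
    position_correction = reference_position - position_sensed
    return brightness_correction, position_correction

def auto_correct(modules, sensed, references):
    db, dp = calculate_corrections(sensed["brightness"], sensed["position"],
                                   references["brightness"], references["position"])
    # Dispatch selectively to the adjusting modules (hypothetical interfaces).
    modules["illumination_brightness"].adjust(db)
    modules["photographing_position"].adjust(dp)
```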
[53] While the control information is additionally transmitted, the support shaft 31 engaged with the pinion 34 of the motor 35 for support shaft of the photographing set 1 additionally takes a series of correcting up and down operations, correspondingly to the rotation operation of the pinion 34. As a result, the capture camera 17 and the pattern generator 25, which are accommodated in the second case 16 and the first case 20, respectively, also additionally take a series of correcting up and down operations. Therefore, the user has the advantage that the photographing position of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
[54] In addition, while the control information is additionally transmitted, the fixing shafts 22 connected to the motors 21 for illumination device of the photographing set 1 rapidly take a correcting rotation operation as the motors 21 for illumination device are additionally operated. As a result, the illumination devices 12 accommodated in the third cases 11 take a series of correcting rotation operations within a predetermined angle range. Therefore, the user has the advantage that the color impression of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
[55] Furthermore, while the control information is additionally transmitted, the illumination device driving board 26 rapidly carries out a series of additional processes for correcting the brightness of the illumination devices 12, in accordance with the control signal of the illumination brightness adjusting module 320. Therefore, the user has the advantage that the color impression of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
[56] In addition, while the control information is additionally transmitted, the capture camera 17 rapidly carries out a series of additional processes for correcting the exposure degree thereof, in accordance with the control signal of the camera exposure adjusting module 330. Therefore, the user has the advantage that the color impression of the plastic operation target part (i.e., initial CIV) which will be finally acquired by a further photographing process is automatically corrected, without a separate computation operation.
[57] In the mean time, as shown in Fig. 2, the information processing device 100 is further provided with the three-dimensional shape data creation managing tool 400, in addition to the device driving control tool 300. In this case, as shown in Fig. 7, the three-dimensional shape data creation managing tool 400 comprises a three-dimensional shape data creation managing module 401 and a texture map creating module 470, a three-dimensional shape raw data creating module 410, a registration preprocessing module 420, a three-dimensional shape data registration processing module 430, a three-dimensional shape data merging processing module 440, and a three-dimensional shape data mapping processing module 460, which are collectively controlled by the three-dimensional shape data creation managing module 401 (the detailed operations of the three-dimensional shape data creation managing tool are specifically described in Korean Patent Application No. 2005-87580 filed by the applicant; the structure of the tool is not limited to the above application and may be diversely modified depending on the situations).
[58] At this time, the three-dimensional shape data creation managing module 401 selects/forms continuous signal exchange relations with the device driving control tool 300, the operating system 107 and the like via the information exchange gate 402. When a CIV containing a feature of the plastic operation target part is acquired by the association operation of the pattern generator 25 and the capture camera 17 disposed to the photographing set 1 and the acquired CIV is transmitted through the image transmitting/receiving module 350 of the device driving control tool 300, the three-dimensional shape data creation managing module receives the CIV and collectively controls/progresses a series of processes for processing and creating the three-dimensional shape data based on the received CIV.
[59] Here, the texture map creating module 470, which is controlled by the three- dimensional shape data creation managing module 401, forms a series of communication relations with the device driving control tool 300 via the information exchange gate 402. When the above CIV is acquired, the texture map creating module rapidly progresses a process of mathematically processing the acquired CIV, for example finely dividing the space constituting the CIV and setting a series of coordinates to the divided pixels, thereby creating a source texture map corresponding to the CIV.
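A minimal sketch of the "finely dividing the space and setting coordinates" step: every pixel of the CIV is given a normalised (u, v) texture coordinate. The output format is an assumption chosen for illustration, not the module's actual data layout.

```python
# Sketch only: per-pixel texture coordinates for a source texture map.
import numpy as np

def make_source_texture_map(civ: np.ndarray):
    """Return per-pixel (u, v) coordinates in [0, 1] for an H x W image."""
    h, w = civ.shape[:2]
    v, u = np.meshgrid(np.linspace(0.0, 1.0, h),
                       np.linspace(0.0, 1.0, w), indexing="ij")
    return np.stack([u, v], axis=-1)          # shape (H, W, 2)
```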
[60] In addition, the three-dimensional shape raw data creating module 410, which is controlled by the three-dimensional shape data creation managing module 401, forms a series of communication relations with the device driving control tool 300 via the information exchange gate 402. When the above CIV is acquired, the three-dimensional shape raw data creating module progresses a process of mathematically processing the acquired CIV (for example, a space coding process, a process of calculating/setting 3D coordinates, a process of meshing each image, and the like), thereby creating three- dimensional shape raw data in which a feature of the plastic operation target part is reflected, separately from the texture map.
[61] In this case, the three-dimensional shape raw data creating module 410 may comprise a space coding section that finely divides a measurement space of the CIV into each pixel and allots inherent space codes to each of the divided pixels, a 3D coordinates calculating section that calculates the space codes of each pixel allotted by the space coding section through a predetermined correction coefficient and converts the calculated space codes into three-dimensional coordinates, and a meshing processing section that connects and meshes the respective pixels to which the three- dimensional coordinates are provided.
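For illustration, the sketch below mirrors the three sections just named under strongly simplified assumptions: binary pattern images are decoded into a space code per pixel, the codes are converted to 3D coordinates (here reduced to an assumed linear correction coefficient rather than a calibrated triangulation with the capture camera 17 and the pattern generator 25), and the pixel grid is meshed into triangles.

```python
# Sketch only: space coding, coordinate calculation and meshing under
# simplified, assumed conventions.
import numpy as np

def space_codes(pattern_images):
    """Decode a stack of binary pattern images into an integer code per pixel."""
    codes = np.zeros(pattern_images[0].shape, dtype=np.int32)
    for bit, img in enumerate(pattern_images):
        codes |= (img > 127).astype(np.int32) << bit
    return codes

def to_3d(codes, correction_coefficient=0.01):
    """Assumed linear mapping from space codes to depth; placeholder only."""
    h, w = codes.shape
    y, x = np.mgrid[0:h, 0:w]
    z = codes * correction_coefficient
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def mesh_faces(h, w):
    """Connect the pixel grid into two triangles per grid cell."""
    faces = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c
            faces.append((i, i + 1, i + w))
            faces.append((i + 1, i + w + 1, i + w))
    return faces
```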
[62] In the mean time, when the three-dimensional shape raw data in which a feature of the plastic operation target part is reflected is completely created by the three-dimensional shape raw data creating module 410, the registration preprocessing module 420, which is controlled by the three-dimensional shape data creation managing module 401 together with the three-dimensional shape raw data creating module 410, systematically progresses processes of pre-processing the three-dimensional shape raw data before carrying out the registration processing process in earnest, for example a process of removing a background of the three-dimensional shape raw data and making only the plastic operation target part into data (i.e., a process for extracting an object area), a process of subdividing to interpolate a space code image of the plastic operation target part (i.e., a process for interpolating a space code), a process of interpolating to recover the hollow space data of the plastic operation target part (i.e., a process for filling a hole), and a process of Gaussian-filtering the plastic operation target part to make the data distribution uniform (i.e., a process for Gaussian smoothing a space code), thereby enabling the registration processing resultant to maintain a quality of a predetermined degree or more.
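Two of the preprocessing steps listed above, hole filling and Gaussian smoothing of the space-code image, can be sketched as follows; the hole convention (code value 0), kernel size and sigma are assumptions.

```python
# Sketch only: hole filling by neighbour averaging and separable Gaussian
# smoothing of a space-code image.
import numpy as np

def fill_holes(codes: np.ndarray) -> np.ndarray:
    """Replace zero-valued (hole) pixels with the mean of valid neighbours."""
    out = codes.astype(float)
    for r, c in np.argwhere(codes == 0):
        patch = codes[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        valid = patch[patch > 0]
        if valid.size:
            out[r, c] = valid.mean()
    return out

def gaussian_smooth(codes: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Gaussian filtering implemented with two 1-D convolutions."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-(x ** 2) / (2 * sigma ** 2))
    kernel /= kernel.sum()
    rows = np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), 1, codes)
    return np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), 0, rows)
```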
[63] Here, when the preprocessing process of the three-dimensional shape raw data is completed by the registration preprocessing module 420 and then a user (for example, medical institution) uses the information input/output device connected to the information processing device 100, for example, mouse, keyboard 104 and the monitor 102 to progress a series of computation operations for registration-processing the three-dimensional shape raw data (for example, a process of marking corresponding points to the three-dimensional shape data and requesting the registration of the points), the three-dimensional shape data registration processing module 430, which is controlled by the three-dimensional shape data creation managing module 401, adjusts the mutual positions of the three-dimensional shape raw data in accordance with the computation operation of the user, thereby registering the three-dimensional shape raw data into the single registered three-dimensional shape data.
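The patent does not specify the registration algorithm used by module 430; as an illustrative substitute, the sketch below computes the standard SVD-based rigid transform from user-marked corresponding points and applies it to one data set.

```python
# Sketch only: rigid alignment of one point set to another from marked
# corresponding points (Kabsch-style SVD solution), used here purely as
# an illustration of registration processing.
import numpy as np

def rigid_transform(src_pts: np.ndarray, dst_pts: np.ndarray):
    """Return rotation R and translation t so that R @ src + t ~= dst."""
    src_c, dst_c = src_pts.mean(axis=0), dst_pts.mean(axis=0)
    H = (src_pts - src_c).T @ (dst_pts - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # avoid a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def register(points_to_move: np.ndarray, R: np.ndarray, t: np.ndarray):
    """Apply the computed transform to a whole point set."""
    return points_to_move @ R.T + t
```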
[64] In addition, when the three-dimensional shape data registration processing module
430 completes the execution of the functions thereof, the three-dimensional shape data merging processing module 440, which is controlled by the three-dimensional shape data creation managing module 401, sequentially progresses each processing step for merging the above registered three-dimensional shape data into merged three- dimensional shape data of a single object, for example, a merging step, a combined area correcting step, a hole area correcting step, a polygon number correcting step, a mesh quality correcting step and the like, in accordance with the merge processing basis information that is self-defined and provided for each registration type in advance, thereby automatically creating a series of merged three-dimensional shape data.
[65] Furthermore, when a series of merged three-dimensional shape data are automatically created by the three-dimensional shape data merging processing module 440 and a message of "Map the merged three-dimensional shape data into the texture map" is correspondingly transmitted from the three-dimensional shape data creation managing module 401, the three-dimensional shape data mapping processing module 460, which is controlled by the three-dimensional shape data creation managing module 401, immediately communicates with the texture map creating module 470 to acquire the created texture map and then maps the merged three-dimensional shape data into the texture map acquired, thereby creating the final three-dimensional shape data.
[66] As described above, according to the invention, the CIV, which is a basis for the three-dimensional shape data, maintains the optimized state, in which various parameters such as the illumination degree at the surroundings of a plastic operation target part and a photographing position depending on a height of a customer are thoroughly considered, as the device driving control tool 300 carries out the functions thereof. Accordingly, when the invention is realized, the three-dimensional shape data, which is finally formed by the computation modules of the three-dimensional shape data creation managing tool 400, can also naturally maintain the optimized quality. As a result, the user (customer, medical institution) has the advantage that the reliability of the overall virtual plastic operation is considerably improved.
[67] Later, the medical institution (user) shows to the customer having a plastic operation plan a figure after the plastic operation on the plastic operation target part through a three-dimensional CIV editing process using the commercialized image editing tool, thereby enabling the plastic operation to be smoothly progressed under consultation with the customer.
Industrial Applicability
[68] As described above, according to the invention, a photographing set, in which a variety of devices for photographing a plastic operation target part, such as a photographing device, an illumination device and the like, are packaged and dispersion-disposed at proper locations, and computation modules capable of flexibly controlling a movement of the photographing set itself as well as the states of the photographing device and the illumination device disposed in the photographing set depending on photographing environments (for example, surrounding illumination degree, height of a customer and the like) can be systematically disposed in a linked manner. As a result, a plastic operation related medical institution can easily acquire an optimized CIV, in which various parameters such as the illumination degree at the surroundings of a plastic operation target part and a photographing position depending on a height of a customer are thoroughly considered, even with a simple computation operation and without a separate complex process, thereby considerably improving the reliability of the virtual plastic operation without causing a difficulty to the customer and the medical institution.
[69] While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made thereto without departing from the spirit and scope of the invention as defined by the appended claims.

Claims

[1] A CIV acquiring system comprising: a photographing set that has a capture camera, a pattern generator and an illumination device and link-operates the capture camera, the pattern generator and the illumination device to generate a CIV (Capture Image for Virtual plastic operation) in which a plastic operation target part is reflected while going up and down depending on an outside computation control, the capture camera, the pattern generator and the illumination device being provided to one place and dispersion-disposed depending on functions thereof; and a device driving control tool that belongs to an information processing device electrically connected to the photographing set and selectively communicates with the photographing set to control driving states of the capture camera, the pattern generator and the illumination device and the up and down states of the photographing set, correspondingly to photographing environments.
[2] The CIV acquiring system according to claim 1, wherein the photographing set comprises: a support shaft that goes up and down in accordance with a control signal outputted from the device driving control tool, with being fixed to be moveable in all directions; a first case that accommodates one of the capture camera and the pattern generator and goes up and down in accordance with the up and down operation of the support shaft, with being fixed to the support shaft; a second case that accommodates the other of the capture camera and the pattern generator and goes up and down in accordance with the up and down operation of the support shaft, with being fixed to an upper end of the first case; and a third case that is put on the first case with being uprightly stood at an outer side of the second case, accommodates the illumination device, goes up and down in accordance with the up and down operation of the support shaft and self-rotates within a predetermined angle range in accordance with a control signal outputted from the device driving control tool.
[3] The CIV acquiring system according to claim 2, further comprising: a motor for support shaft that is signal-connected to the device driving control tool, is fixed to a part of the support shaft and is selectively driven in accordance with a control signal outputted from the device driving control tool; and a pinion that is moveably fixed to the motor for support shaft with being engaged with a part of the support shaft and is rotated as the motor for support shaft is driven, thereby selectively causing the up and down operation of the support shaft.
[4] The CIV acquiring system according to claim 2, further comprising: a motor for illumination device that fixes the third case to be moveable with being accommodated in the first case, is signal-connected to the device driving control tool and is selectively driven in accordance with a control signal outputted from the device driving control tool, thereby selectively causing the self-rotation operation of the third case.
[5] The CIV acquiring system according to claim 1, wherein the device driving control tool comprises: a device driving controller that collectively controls the operating processes of the capture camera, the pattern generator, the illumination device and the photographing set while continuing to check an occurrence of a computation event at the information processing device; a photographing device driving management module that is controlled by the device driving controller and notifies/controls whether or not photographing is to start and detailed setting items of the capture camera and the pattern generator; a photographing position adjusting module that is controlled by the device driving controller and controls the up and down states of the photographing set so as to adjust positions of the capture camera and the pattern generator, thereby adjusting a photographing position of the plastic operation target part; an illumination position adjusting module that is controlled by the device driving controller and controls a position of the illumination device so that a color impression of the plastic operation target part is different; an illumination brightness adjusting module that is controlled by the device driving controller and controls brightness of the illumination device so that a color impression of the plastic operation target part is different; and a camera exposure adjusting module that is controlled by the device driving controller and controls an exposure degree of the capture camera so that a color impression of the plastic operation target part is different.
[6] The CIV acquiring system according to claim 5, further comprising: a brightness sensing module that is controlled by the device driving controller and senses brightness of surroundings of the plastic operation target part; a photographing position check module that is controlled by the device driving controller and checks a photographing position of the plastic operation target part; and an optimum value calculating module that is controlled by the device driving controller, compares a result acquired by the brightness sensing module and the photographing position check module with a reference value to automatically calculate a brightness correction value and a photographing position correction value and transmits the calculated brightness correction value and photographing position correction value to the illumination brightness adjusting module, the camera exposure adjusting module, the illumination position adjusting module and the photographing position adjusting module to enable the photographing position and the color impression of the plastic operation target part to be automatically adjusted.
PCT/KR2006/004489 2006-10-31 2006-10-31 A system of acquiring a capture image for virtual plastic WO2008054033A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2006/004489 WO2008054033A1 (en) 2006-10-31 2006-10-31 A system of acquiring a capture image for virtual plastic

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2006/004489 WO2008054033A1 (en) 2006-10-31 2006-10-31 A system of acquiring a capture image for virtual plastic

Publications (1)

Publication Number Publication Date
WO2008054033A1 true WO2008054033A1 (en) 2008-05-08

Family

ID=39344363

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2006/004489 WO2008054033A1 (en) 2006-10-31 2006-10-31 A system of acquiring a capture image for virtual plastic

Country Status (1)

Country Link
WO (1) WO2008054033A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR890000272Y1 (en) * 1985-12-30 1989-03-08 삼성전자 주식회사 The photographing position pursuit equipment of the camera
US5748992A (en) * 1990-12-10 1998-05-05 Nikon Corporation Camera control device
KR200189932Y1 (en) * 2000-03-02 2000-07-15 강원형 Apparatus for photographing face
KR200258645Y1 (en) * 2001-03-13 2001-12-28 김형진 Multi-directional automated photographing device of a reflective objects
KR20050051748A (en) * 2003-11-28 2005-06-02 (주)아이티큐브 Method for photograph service of kiosk using pan tilt camera and computer readable record medium on which program therefor is recorded
KR20070045388A (en) * 2005-10-27 2007-05-02 (주)맥서러씨 The system which acquisition the capture image for virtual plastic



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06812328

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS EPO FORM 1205A DATED 14.07.2009.

122 Ep: pct application non-entry in european phase

Ref document number: 06812328

Country of ref document: EP

Kind code of ref document: A1