CN110853106B - Oral scanning system and oral scanning image processing method - Google Patents


Info

Publication number
CN110853106B
Authority
CN
China
Prior art keywords
image
color intensity
intensity value
value
color
Prior art date
Legal status
Active
Application number
CN201911034011.0A
Other languages
Chinese (zh)
Other versions
CN110853106A (en)
Inventor
黄敏雄
Current Assignee
Qisda Optronics Suzhou Co Ltd
Qisda Corp
Original Assignee
Qisda Optronics Suzhou Co Ltd
Qisda Corp
Priority date
Filing date
Publication date
Application filed by Qisda Optronics Suzhou Co Ltd and Qisda Corp
Priority to CN201911034011.0A
Publication of CN110853106A
Application granted
Publication of CN110853106B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • A61B 6/51
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Abstract

The invention provides an oral scanning system and an oral scanning image processing method. The oral scanning system comprises a projection module, an image sensing unit and a processing unit. The image sensing unit senses a first color light projected by the projection module onto a target object to obtain a first image and a second image. The processing unit synthesizes a first map from the first image and a second map from the second image; the first map is adjacent to the second map, and synthesizing the two maps produces a first junction. The first image and the second image have a first color intensity value and a second color intensity value, respectively, at the first junction. When the first color intensity value differs from the second color intensity value, the color intensity value of the first image or of the second image is adjusted by a first adjustment value until the two values at the junction are the same. In this way, the texture images of the oral scanning system can be adjusted automatically and rapidly to increase viewing comfort.

Description

Oral scanning system and oral scanning image processing method
Technical Field
The present invention relates to an oral scanning system and an oral scanning image processing method, and more particularly, to an oral scanning system and an oral scanning image processing method that rapidly adjust the images produced by the system.
Background
When a dental model is reconstructed with a handheld intraoral scanner, the 3D model built in real time must show the dentist clearly which tooth region is currently being scanned, so faithful color reproduction of the scanned 3D dental model is important. Reconstructing the 3D dental model requires capturing photographs to recover 3D depth, and restoring the real appearance of those photographs and pasting them onto the surface of the model is a technique of great value to dentists. However, because the special environment inside the oral cavity causes brightness differences between individual photographs, the reproduced surface can appear distorted, resulting in a poor appearance.
Current structured-light intraoral scanners project a fixed stripe pattern with a projector; a camera then captures the pattern, from which the 3D depth of the tooth surface in that region is calculated. The light intensity of the projector is usually set to a fixed value, which is kept constant throughout the projection and capture process. However, different tissues in the oral cavity absorb or reflect light to different degrees, and artificial dental fillings have their own optical characteristics. From the beginning to the end of a scan, the varying scan positions and tissue materials make the projected brightness too high in some places and too low in others, and the camera's exposure time during capture further affects image quality. Reproducing the surface of the 3D dental model with this mechanism is like assembling a mosaic from the individual photographs, piece by piece; the most obvious problem is that the brightness differences between photographs leave the reconstructed model surface with visibly bright and dark patches and distinct color boundary lines.
Therefore, it is necessary to design a new oral scanning system and an oral scanning image processing method to overcome the above-mentioned drawbacks.
Disclosure of Invention
The invention aims to provide an oral scanning system and an oral scanning image processing method that automatically and rapidly adjust the texture images of the oral scanning system to increase viewing comfort.
To achieve the above object, the present invention provides an oral scanning system comprising: a projection module that projects a first color light; an image sensing unit that senses the first color light irradiating a target object to obtain a first image when the oral scanning system is at a first position, and senses the first color light irradiating the target object to obtain a second image when the oral scanning system is at a second position; and a processing unit coupled to the projection module and the image sensing unit, which synthesizes a first map from the first image and a second map from the second image, the first map being adjacent to the second map. Synthesizing the first map and the second map produces a first junction; the first image has a first color intensity value at the first junction and the second image has a second color intensity value there. When the first color intensity value differs from the second color intensity value, the processing unit adjusts the color intensity value of the first image or of the second image by a first adjustment value, so that a third color intensity value of the adjusted first image at the first junction equals a fourth color intensity value of the adjusted second image at the first junction.
Preferably, the oral scanning system further includes a first ratio and a limit value range, and the projection module further projects a second color light. The image sensing unit senses the second color light irradiating the target object to obtain a third image when the oral scanning system is at the first position, and to obtain a fourth image when the oral scanning system is at the second position. The processing unit adjusts the color intensity value of the third image by the first adjustment value and synthesizes the adjusted first image and the adjusted third image into a third map, or adjusts the color intensity value of the fourth image by the first adjustment value and synthesizes the adjusted second image and the adjusted fourth image into a fourth map, the third map being adjacent to the fourth map. When the color intensity value of the third map or of the fourth map exceeds the limit value range, the processing unit increases or decreases it by the first ratio so that it falls within the limit value range.
Preferably, the oral scanning system further includes a second adjustment value and a third adjustment value. The image sensing unit senses the first color light irradiating the target object to obtain a fifth image when the oral scanning system is at a third position, and the processing unit synthesizes a fifth map from the fifth image, the fifth map being adjacent to the second map. Taking the color intensity value of the first map as a reference, the processing unit adjusts the color intensity value of the second map by the first adjustment value and the color intensity value of the fifth map by the second adjustment value. When the color intensity value of the fifth map exceeds the limit value range, the processing unit further adjusts, by the third adjustment value, the color intensity value of the first map, the color intensity value of the second map already adjusted by the first adjustment value, and the color intensity value of the fifth map already adjusted by the second adjustment value, so that all three resulting color intensity values fall within the limit value range.
Preferably, the projection module further projects a second color light and a third color light, the first color light being blue light, the second color light green light, and the third color light red light, and the synthesized first map and second map are color images.
Preferably, the processing unit obtains the first adjustment value from the difference between the first color intensity value and the second color intensity value.
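Read this way, the seam-matching step can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names are ours, and the additive form of the adjustment is an assumption based on the statement that the first adjustment value comes from the difference of the two junction intensities.

```python
# Hypothetical sketch: the first adjustment value is the difference between
# the junction intensities, applied additively to the second image so the
# two images agree at the junction. Names and the additive form are
# illustrative assumptions, not taken from the patent.

def first_adjustment_value(c1: float, c2: float) -> float:
    """Difference between the two color intensity values at the junction."""
    return c1 - c2

def match_at_junction(image: list[float], a: float) -> list[float]:
    """Shift every intensity of an image by the adjustment value,
    clamped to the 0-255 limit value range."""
    return [min(255.0, max(0.0, v + a)) for v in image]

c1, c2 = 180.0, 160.0               # junction intensities of the two maps
a = first_adjustment_value(c1, c2)  # a = 20.0
adjusted = match_at_junction([160.0, 150.0], a)
# the second image now reads 180.0 at the junction, matching the first
```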
The invention also provides an oral scan image processing method, comprising the steps of: projecting a first color light; sensing the first color light irradiating a target object to obtain a first image when the oral scanning system is at a first position; sensing the first color light irradiating the target object to obtain a second image when the oral scanning system is at a second position; synthesizing a first map from the first image; synthesizing a second map from the second image; synthesizing the first map and the second map to produce a first junction, the first image having a first color intensity value at the first junction and the second image having a second color intensity value there; and, when the first color intensity value differs from the second color intensity value, adjusting the color intensity value of the first image or of the second image by a first adjustment value so that a third color intensity value of the adjusted first image at the first junction equals a fourth color intensity value of the adjusted second image at the first junction.
Preferably, the oral scan image processing method further comprises the steps of: projecting a second color light; sensing the second color light irradiating the target object to obtain a third image when the oral scanning system is at the first position; sensing the second color light irradiating the target object to obtain a fourth image when the oral scanning system is at the second position; adjusting, by the processing unit, the color intensity value of the third image by the first adjustment value and synthesizing the adjusted first image and the adjusted third image into a third map, or adjusting the color intensity value of the fourth image by the first adjustment value and synthesizing the adjusted second image and the adjusted fourth image into a fourth map, the third map being adjacent to the fourth map; and, when the color intensity value of the third map or of the fourth map exceeds the limit value range, increasing or decreasing it by the first ratio so that both values fall within the limit value range.
Preferably, the oral scan image processing method further comprises the steps of: sensing the first color light irradiating the target object to obtain a fifth image when the oral scanning system is at a third position; synthesizing a fifth map from the fifth image, the fifth map being adjacent to the second map; adjusting the color intensity value of the second map by the first adjustment value; adjusting the color intensity value of the fifth map by the second adjustment value; and, when the color intensity value of the fifth map exceeds the limit value range, adjusting by the third adjustment value the color intensity value of the first map, the color intensity value of the second map already adjusted by the first adjustment value, and the color intensity value of the fifth map already adjusted by the second adjustment value, so that all three resulting color intensity values fall within the limit value range.
Preferably, the oral scan image processing method further comprises the step of: projecting a second color light and a third color light, the first color light being blue light, the second color light green light, and the third color light red light, the synthesized first map and second map being color images.
Preferably, the oral scan image processing method further comprises the step of: obtaining the first adjustment value from the difference between the first color intensity value and the second color intensity value.
Compared with the prior art, the oral scanning system and oral scanning image processing method provided by the invention comprise a projection module, an image sensing unit and a processing unit. The image sensing unit sequentially senses a first color light projected by the projection module onto a target object to obtain a first image and a second image. The processing unit, coupled to the projection module and the image sensing unit, synthesizes a first map from the first image and a second map from the second image, and synthesizes the first map and the second map into a first stitched map comprising a first area and a second area joined at a first junction. The first image has a first color intensity value at the first junction and the second image has a second color intensity value there. When the two values differ, the processing unit adjusts the color intensity value of the first image or of the second image by the first adjustment value so that the adjusted third and fourth color intensity values at the first junction are the same. The texture images of the oral scanning system can thus be adjusted automatically and rapidly to increase viewing comfort.
Drawings
FIG. 1 is a functional block diagram of a mouth scanning system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating adjustment of a mouth-scanning system at a first position according to an embodiment of the present invention;
FIG. 3 is a schematic view of an adjustment of the oral scanning system in a second position according to an embodiment of the present invention;
FIG. 4 is a schematic view of a first stitched map according to an embodiment of the present invention;
FIG. 5 is a schematic view of an adjustment of a mouth-scanning system at a first position according to another embodiment of the present invention;
FIG. 6 is a schematic view of the adjustment of the oral scanning system in a second position according to another embodiment of the present invention;
FIG. 7 is a flowchart illustrating a method for processing an oral scan image according to an embodiment of the invention;
FIG. 8 is a flowchart illustrating a method for processing an oral scan image according to another embodiment of the present invention;
FIG. 9 is a flowchart of an oral scan image processing method according to yet another embodiment of the invention.
Detailed Description
In order to further understand the objects, structures, features and functions of the present invention, the following embodiments are described in detail.
Referring to fig. 1, fig. 2, fig. 3 and fig. 4: fig. 1 is a functional block diagram of an oral scanning system according to an embodiment of the present invention, fig. 2 is an adjustment diagram of the oral scanning system at a first position, fig. 3 is an adjustment diagram of the oral scanning system at a second position, and fig. 4 is a schematic view of a first stitched map. The present invention provides an oral scanning system 100 comprising a projection module 1, an image sensing unit 2 and a processing unit 3. The projection module 1 projects a first color light 101. The image sensing unit 2 senses the first color light 101 irradiating a target object to obtain a first image 211 when the oral scanning system 100 is at a first position, and to obtain a second image 221 when the oral scanning system 100 is at a second position. The processing unit 3, coupled to the projection module 1 and the image sensing unit 2, synthesizes a first map from the first image 211 and a second map from the second image 221, the first map and the second map being adjacent; synthesizing the first map and the second map produces a first junction 501. The first image 211 has a first color intensity value at the first junction 501 and the second image 221 has a second color intensity value there. When the first color intensity value differs from the second color intensity value, the processing unit 3 adjusts the color intensity value of the first image 211 or of the second image 221 by a first adjustment value a, so that a third color intensity value of the adjusted first image 211 at the first junction 501 equals a fourth color intensity value of the adjusted second image 221 at the first junction 501. In this way, the texture images of the oral scanning system 100 can be adjusted automatically and rapidly to increase viewing comfort.
In practical implementation, the projection module 1 may be a projector, and the first color light 101 is blue light; the image sensing unit 2 may be a Charge-Coupled Device (CCD) sensor, a Complementary Metal-Oxide-Semiconductor (CMOS) sensor or another sensor; the processing unit 3 may be a processor or a controller with data operation/processing capability; and the oral scanning system 100 may further include a storage unit 4, which may be a memory or another data storage device, as the case may be.
Further, the oral scanning system 100 further includes a first ratio b and a limit value range, and the projection module 1 further projects a second color light 102. The image sensing unit 2 senses the second color light 102 irradiating the target object to obtain a third image 212 when the oral scanning system 100 is at the first position, and to obtain a fourth image 222 when the oral scanning system 100 is at the second position. The processing unit 3 adjusts the color intensity value of the third image 212 by the first adjustment value a and synthesizes the adjusted first image 211 and the adjusted third image 212 into a third map, or adjusts the color intensity value of the fourth image 222 by the first adjustment value a and synthesizes the adjusted second image 221 and the adjusted fourth image 222 into a fourth map, the third map being adjacent to the fourth map. When the color intensity value of the third map or of the fourth map exceeds the limit value range, the processing unit increases or decreases it by the first ratio b so that it falls within the limit value range.
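The clamping behaviour just described can be sketched roughly as follows. The function name, the loop form, and the value ratio_b=0.9 are assumptions for illustration; the patent only states that the map is scaled by the first ratio b until it lies within the limit value range.

```python
def fit_to_range(values, ratio_b=0.9, hi=255.0):
    """Scale a map's intensities by the first ratio b while any of them
    exceeds the limit value range (0-255). ratio_b=0.9 is illustrative."""
    out = list(values)
    while out and max(out) > hi:
        out = [v * ratio_b for v in out]
    return out

# A map containing one saturated sample is dimmed as a whole,
# preserving relative brightness within the map.
fit_to_range([280.0, 100.0], ratio_b=0.5)
```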
Further, the oral scanning system 100 further includes a second adjustment value and a third adjustment value. The image sensing unit 2 senses the first color light 101 irradiating the target object to obtain a fifth image when the oral scanning system 100 is at a third position, and the processing unit 3 synthesizes a fifth map from the fifth image, the fifth map being adjacent to the second map. Taking the color intensity value of the first map as a reference, the processing unit 3 adjusts the color intensity value of the second map by the first adjustment value a and the color intensity value of the fifth map by the second adjustment value. When the color intensity value of the fifth map exceeds the limit value range, the processing unit 3 further adjusts, by the third adjustment value, the color intensity value of the first map, the color intensity value of the second map already adjusted by the first adjustment value, and the color intensity value of the fifth map already adjusted by the second adjustment value, so that all three resulting color intensity values fall within the limit value range.
Further, the synthesized first, second, third, fourth and fifth maps are color images. The oral scanning system 100 further includes a second ratio: when the color intensity value of one of the maps adjusted by the processing unit 3 exceeds the limit value range, the processing unit 3 divides the problem area of that map into a plurality of blocks and adjusts the color intensity values of the blocks sequentially according to the second ratio.
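One possible reading of this block-wise adjustment is sketched below, under the assumption that each successive block receives one more application of the second ratio so that the intensity changes gradually across the problem area; the patent does not specify the exact schedule, so the power progression here is hypothetical.

```python
def smooth_blocks(values, n_blocks, second_ratio):
    """Split the out-of-range problem area into n_blocks and scale each
    block by one more power of the second ratio than the previous one,
    so the transition is gradual. The power schedule is an assumption;
    len(values) is assumed divisible by n_blocks for simplicity."""
    size = len(values) // n_blocks
    out = []
    for i in range(n_blocks):
        block = values[i * size:(i + 1) * size]
        out.extend(v * second_ratio ** (i + 1) for v in block)
    return out
```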
In one embodiment, referring to figs. 4, 5 and 6, it is assumed that the first color light 101 is blue light, the second color light 102 is green light and the third color light 103 is red light; the first image 211 and the second image 221 are blue-light images, the third image 212 and the fourth image 222 are green-light images, and the sixth image 213 and the seventh image 223 are red-light images. The first adjustment value a is the difference between the first color intensity value and the second color intensity value.
The processing unit 3 adjusts the color intensity values of the first image 211, the third image 212, the sixth image 213, the second image 221, the fourth image 222 and the seventh image 223 by the first adjustment value a, so that the adjusted color intensity values remain within the limit value range of 0-255. Whether the overall color intensity value of the first region is increased or decreased depends on whether the color of the second region is saturated or too dark, i.e. exceeds the limit value range of 0-255; the color intensity value adjusted by the first adjustment value a should not exceed the maximum value minus 20% or fall below the minimum value plus 20%.
When the color intensity value of one of the images adjusted by the first adjustment value a is close to the threshold, the adjustment is no longer made in a single direction: according to the first ratio b, the color intensity values of the images 311, 312 and 313 are decreased while the color intensity values of the images 321, 322 and 323 are increased, or vice versa. For example, if the color intensity values of the images 321, 322 and 323 would need to be scaled by 0.89, dimming part of the map synthesized from them, then instead the color intensity values of the images 311, 312 and 313 are increased by 10% and the color intensity values of the images 321, 322 and 323 are first scaled by 0.89 and then also increased by 10%; that is, the images 311, 312 and 313 are scaled by 1.1 and the images 321, 322 and 323 by 0.979. The 10% here is not a fixed value and may be set by trial and error according to actual conditions. The blue-, green- and red-light images adjusted by the first ratio b are then synthesized into a first color image 31 and a second color image 32.
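The numeric example above works out as follows; 0.89 and 10% are the patent's illustrative values, while the variable names are ours.

```python
# Instead of dimming the brighter side all the way down, both sides move
# toward each other: the dimmer side gets the same 10% lift that is folded
# into the brighter side's scaling factor.
raw_factor = 0.89             # factor that would match the brighter side alone
lift = 1.10                   # the 10% increase shared by both sides
dim_factor = raw_factor * lift    # applied to images 321, 322, 323
bright_factor = lift              # applied to images 311, 312, 313
print(round(dim_factor, 3), bright_factor)   # 0.979 1.1
```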
When the color intensity values of a plurality of maps are adjusted, the first map is taken as the reference, and the maps scanned later are adjusted in scanning order. For example: the first map is the reference, so its ratio is 1; the second map is slightly brighter, so its ratio must be lowered, i.e. multiplied by 0.9; the fifth map is slightly darker, i.e. multiplied by 1.1; and the sixth map is too dark and is multiplied by 1.5. The brightness adjustment ratios of the photographs are therefore [1; 0.9; 1.1; 1.5]. If applying the factor 1.5 to the sixth map would push the brightness of some parts beyond the limit range, so that a further factor of 0.9 is needed, then all previously calculated ratios are adjusted by that final factor, i.e. [1 x 0.9; 0.9 x 0.9; 1.1 x 0.9; 1.5 x 0.9].
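The running-ratio bookkeeping in the example above can be written out directly; this is a minimal illustration, and the list layout simply follows the patent's [1; 0.9; 1.1; 1.5] notation.

```python
# Per-frame brightness ratios relative to the reference (first) map.
ratios = [1.0, 0.9, 1.1, 1.5]

# The sixth map's factor of 1.5 would push some intensities past the
# limit range, so a final correction of 0.9 is folded into every ratio.
correction = 0.9
ratios = [r * correction for r in ratios]
# ratios is now approximately [0.9, 0.81, 0.99, 1.35],
# i.e. [1 x 0.9, 0.9 x 0.9, 1.1 x 0.9, 1.5 x 0.9]
```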
In one embodiment, referring to fig. 4, 5 and 6, it is assumed that the first color light 101 is blue light, the second color light 102 is green light and the third color light 103 is red light, the first image 211 and the second image 221 are blue light images, the third image 212 and the fourth image 222 are green light images, and the sixth image 213 and the seventh image 223 are red light images. The processing unit 3 adjusts the color intensity value of the first image 211, the color intensity value of the third image 212, the color intensity value of the sixth image 213, the color intensity value of the second image 221, the color intensity value of the fourth image 222, and the color intensity value of the seventh image 223 by the first adjustment value a, so that the first adjustment value a adjusts the color intensity value of the first image 211, the color intensity value of the third image 212, the color intensity value of the sixth image 213, the color intensity value of the second image 221, the color intensity value of the fourth image 222, and the color intensity value of the seventh image 223 within the limit range of 0-255. The increase or decrease of the overall color intensity value of the first region depends on whether the color of the second region is saturated or too dark, i.e. exceeds the limit value range of 0-255, and the color intensity value adjusted by the first adjustment value a should not exceed the maximum value minus 20% or fall below the minimum value plus 20%. 
When the color intensity value of one of the images adjusted by the first adjustment value a approaches the threshold value, the color intensity values are adjusted in both directions rather than a single direction: according to the first ratio b, the color intensity values of the images 311 and 313 are decreased and the color intensity values of the images 321, 322 and 323 are increased, or vice versa. For example, if the color intensity values of the images 321, 322 and 323 need to be scaled to 0.89 times, the brightness of a certain range of the image synthesized from the images 321, 322 and 323 becomes too low; the color intensity values of the images 311 and 313 are therefore raised by 10%, and the color intensity values of the images 321, 322 and 323 are raised by 10% after being scaled by 0.89, i.e. the former are adjusted to 1.1 times and the latter to 0.89 x 1.1 = 0.979 times. The 10% here is not a fixed value but a trial-and-error value that can be set according to actual conditions. The cyan-red images adjusted by the first ratio b are synthesized into a first color image 31 and a second color image 32. When the color intensity values of a plurality of images are adjusted, the first image serves as the reference and the color intensity values of later-scanned images are adjusted in scanning order. For example, the first graph is the reference, so its value is 1; the second graph is slightly brighter, so its value must be lowered, i.e. multiplied by 0.9; the fifth graph is slightly darker, i.e. multiplied by 1.1; the sixth graph is too dark and is multiplied by 1.5. The brightness adjustment values are therefore [1; 0.9; 1.1; 1.5]. When the sixth graph is found to need a 1.5-times increase, but that increase pushes the brightness of some parts beyond the limit range and a 0.9-times reduction is needed, all previously calculated ratios are adjusted by the last calculated value, i.e. [1 x 0.9; 0.9 x 0.9; 1.1 x 0.9; 1.5 x 0.9].
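The bidirectional adjustment described above can be sketched as follows. The grouping and the 10% trial offset follow the text's worked example; the function name and return shape are illustrative assumptions:

```python
# Sketch of the two-direction adjustment: one group of images is raised
# by a trial offset (10% per the text's example), while the other group
# keeps its computed ratio multiplied by the same offset.

def bidirectional_ratios(base_ratio, offset=0.10):
    """Return (ratio for the raised group, ratio for the scaled group)."""
    raise_ratio = 1.0 + offset                 # e.g. images 311/313 -> 1.1x
    keep_ratio = base_ratio * (1.0 + offset)   # e.g. 0.89 * 1.1 = 0.979
    return raise_ratio, round(keep_ratio, 3)

print(bidirectional_ratios(0.89))  # (1.1, 0.979)
```

This reproduces the 1.1x / 0.979x pair from the example while leaving the offset as a tunable trial-and-error value.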
When adjusting one of the images causes a previously adjusted image to become too bright or too dark, so that one of the images in the sequence approaches the limit value of the color intensity value, the images whose adjustment ratios have already been calculated can be left unchanged, and the problem image is given color-gradient processing in the areas of 100 pixels before and after the problem point. For example, a set range is divided into 10 areas, each area changing by a 1% difference, so that the problem area is finally raised or lowered to the designed ratio (here, 10%); the selected pixel area is determined according to the actual situation. The selection of the first color light 101, the second color light 102 and the third color light 103 likewise depends on the actual situation.
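The gradient ramp above can be sketched as follows. The zone count (10) and per-zone step (1%) are the worked numbers from the text, not fixed parameters of the patent; the function name is an assumption:

```python
# Sketch of the color-gradient processing: the window around the problem
# point is split into zones, each stepping the multiplier by 1%, so the
# ramp reaches the designed 10% change at the problem area.

def gradient_steps(zones=10, step=0.01):
    """Per-zone multiplier ramp leading into the problem region."""
    return [round(1.0 + step * i, 2) for i in range(1, zones + 1)]

print(gradient_steps())
# [1.01, 1.02, 1.03, 1.04, 1.05, 1.06, 1.07, 1.08, 1.09, 1.1]
```

Applying these multipliers zone by zone avoids a visible brightness seam at the problem point.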
Referring to fig. 7, fig. 7 is a flowchart illustrating a method for processing an oral scan image according to an embodiment of the invention. The method 110 of processing the oral scan image in fig. 7 is suitable for the oral scanning system 100 in fig. 1. Referring also to figs. 2, 3 and 4: first, step S2 is executed to project the first color light 101. Next, step S4 is executed; when the oral scanning system 100 is at the first position, the first color light 101 irradiates the target object to obtain a first image 211, and when the oral scanning system 100 is at the second position, the first color light 101 irradiates the target object to obtain a second image 221. Then, step S6 is executed to synthesize a first graph according to the first image 211 and a second graph according to the second image 221; the first graph and the second graph are synthesized to generate a first junction 501, where the first image 211 has a first color intensity value at the first junction 501 and the second image 221 has a second color intensity value at the first junction 501. Then, step S8 is executed to determine whether the first color intensity value is the same as the second color intensity value. If they are the same, step S10 is executed to process the subsequent images; if not, step S12 is executed to adjust the color intensity value of the first image 211 or the color intensity value of the second image 221 by the first adjustment value a, so that the adjusted third color intensity value of the first image 211 at the first junction 501 is the same as the adjusted fourth color intensity value of the second image 221 at the first junction 501, and then step S10 is executed. Preferably, the first color light 101 is blue light. In an embodiment, the first adjustment value a is obtained from the difference between the first color intensity value and the second color intensity value.
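Steps S8-S12 above can be sketched in a few lines, under the stated assumption (given in the embodiment) that the first adjustment value a is the difference between the two color intensity values at the junction; the function name and scalar model are illustrative:

```python
# Sketch of the junction-matching step: compute adjustment value a as the
# difference of the two intensities at the junction, then apply it so the
# adjusted value equals the reference value.

def match_at_junction(first_val, second_val):
    """Return (a, adjusted second value) so both sides of the junction agree."""
    a = first_val - second_val      # first adjustment value a
    return a, second_val + a        # adjusted value now equals first_val

a, adjusted = match_at_junction(180, 165)
print(a, adjusted)  # 15 180
```

In the real system this runs per color channel over the junction region rather than on single scalars.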
In another embodiment of the present invention, referring to fig. 8, fig. 8 is a flowchart of an oral scan image processing method according to another embodiment of the present invention. The method 111 of processing the oral scan image in fig. 8 is suitable for the oral scanning system 100 in fig. 1. Referring also to figs. 4, 5 and 6: first, step S20 is executed to project the first, second and third color lights. Then, step S22 is executed, and the image sensing unit 2 continuously performs image capturing calculation. Then, step S24 is executed to calculate the difference between the first color intensity value and the second color intensity value at the first junction. Then, step S26 is executed to determine whether the first color intensity value differs from the second color intensity value; if not, step S42 is executed, and the image sensing unit 2 performs the subsequent image capturing calculation; if so, step S28 is executed to adjust the color intensity value of the first graph or the color intensity value of the second graph according to the first adjustment value a. Then, step S30 is executed to determine whether the adjusted image exceeds the limit value range; if not, step S42 is executed; if it does, step S32 is executed, and the color intensity value of the third graph and the color intensity value of the fourth graph are adjusted to be within the limit value range by the first ratio b.
Then, step S34 is executed to determine whether the color intensity value of the previously adjusted image sequence exceeds the limit value range; if not, step S42 is executed, and the image sensing unit 2 performs the subsequent image capturing calculation; if it does, step S36 is executed to raise or lower the color intensity value of the adjusted image sequence to within the limit value range. Then, step S38 is executed to determine whether a problem image that cannot be adjusted within the limit value range is to be processed alone; if not, step S42 is executed; if so, step S40 is executed, and the problem area is adjusted by the second ratio. Then, step S42 is executed to perform the subsequent image capturing calculation of the image sensing unit 2. Preferably, the first color light 101 is blue light, the second color light 102 is green light, and the third color light 103 is red light. In a specific implementation, the first adjustment value a can be obtained from the difference between the first color intensity value and the second color intensity value.
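The S28-S32 fallback in method 111 can be sketched as follows. The 0-255 limit, the scalar pixel model and the helper names are assumptions for illustration; the real flow operates on whole images:

```python
# Sketch of the adjust-then-fallback logic: add adjustment value a to a
# region; if any value leaves the limit range, rescale the whole region
# by a first ratio b instead of applying a directly.

LIMIT_MIN, LIMIT_MAX = 0, 255

def process_region(pixels, a):
    """S28-S32: apply a, falling back to a ratio when the range is exceeded."""
    adjusted = [p + a for p in pixels]
    if all(LIMIT_MIN <= p <= LIMIT_MAX for p in adjusted):
        return adjusted, "adjusted by a"         # S30 -> S42
    b = LIMIT_MAX / max(adjusted)                # first ratio b (assumed form)
    return [round(p * b) for p in adjusted], "rescaled by b"

print(process_region([100, 120], 30))   # ([130, 150], 'adjusted by a')
print(process_region([100, 200], 70))   # ([161, 255], 'rescaled by b')
```

The ratio branch preserves the relative intensities inside the region while clamping the peak to the limit, which matches the document's preference for proportional rescaling over hard clipping.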
In another embodiment of the present invention, referring to fig. 9, fig. 9 is a flowchart of an oral scan image processing method according to another embodiment of the present invention. The method 112 of fig. 9 is suitable for the oral scanning system 100 of fig. 1. Referring also to figs. 4, 5 and 6: step S20' is executed to project blue light, green light and red light by the projection module 1. Then, step S22' is executed, and the image sensing unit 2 continuously performs image capturing calculation. Then, step S24' is executed to calculate the difference between the first color intensity value and the second color intensity value at the first junction. Then, step S26' is executed to determine whether the first color intensity value differs from the second color intensity value; if not, step S42' is executed, and the image sensing unit 2 performs the subsequent image capturing calculation; if so, step S28' is executed to adjust the color intensity value of the first graph or the color intensity value of the second graph according to the first adjustment value a. Then, step S30' is executed to determine whether the adjusted image exceeds the limit value range; if not, step S42' is executed; if it does, step S32' is executed, and the color intensity value of the third graph and the color intensity value of the fourth graph are adjusted to be within the limit value range by the first ratio b.
Then, step S34' is executed to determine whether the color intensity value of the previously adjusted image sequence exceeds the limit value range; if not, step S42' is executed, and the image sensing unit 2 performs the subsequent image capturing calculation; if it does, step S36' is executed to raise or lower the color intensity value of the adjusted image sequence to within the limit value range. Then, step S38' is executed to determine whether a problem image that cannot be adjusted within the limit value range is to be processed alone; if not, step S42' is executed; if so, step S40' is executed, and the problem area is adjusted by the second ratio. Then, step S42' is executed to perform the subsequent image capturing calculation of the image sensing unit 2. Preferably, the first adjustment value a is derived from the difference between the first color intensity value and the second color intensity value.
In summary, the present invention provides an oral scanning system and an oral scan image processing method. The oral scanning system includes a projection module, an image sensing unit and a processing unit. The image sensing unit senses the first color light projected by the projection module onto the target object to obtain a first image and a second image. The processing unit synthesizes the first image into a first graph and the second image into a second graph; the first graph is adjacent to the second graph, and synthesizing them generates a first junction, where the first image and the second image respectively have a first color intensity value and a second color intensity value. When the first color intensity value differs from the second color intensity value, the color intensity value of the first image or of the second image is adjusted through a first adjustment value until the two are the same, so that the source images of the oral scanning system can be adjusted automatically and rapidly to increase viewing comfort.
Although the present invention has been described with reference to the accompanying drawings, the embodiments disclosed therein are intended to illustrate preferred embodiments of the present invention and should not be construed as limiting it. For the sake of clarity in describing the required components, the scale of the schematic drawings does not represent the scale of actual components.
The above embodiments are only exemplary implementations of the present invention; the disclosed embodiments do not limit the scope of the invention. Rather, the invention is intended to cover modifications and variations within the spirit and scope of the appended claims.

Claims (8)

1. An oral scanning system, comprising:
the projection module projects first color light;
the image sensing unit senses the first color light irradiating the target object to obtain a first image when the oral scanning system is at a first position, and senses the first color light irradiating the target object to obtain a second image when the oral scanning system is at a second position; and
the processing unit is coupled to the projection module and the image sensing unit respectively, synthesizes the first image to form a first graph, and synthesizes the second image to form a second graph;
when the first color intensity value is different from the second color intensity value, the processing unit adjusts the color intensity value of the first image or the color intensity value of the second image through a first adjusting value so that a third color intensity value of the first image after adjustment at the first junction is the same as a fourth color intensity value of the second image after adjustment at the first junction;
the processing unit adjusts the color intensity value of the fourth image through the first adjustment value and synthesizes the adjusted second image and the adjusted fourth image into a fourth graph, the third graph being adjacent to the fourth graph; and when the color intensity value of the third graph or the color intensity value of the fourth graph exceeds the limit value range of the color intensity value, the processing unit increases or decreases the color intensity value of the third graph or the color intensity value of the fourth graph through a first ratio so that both are within the limit value range.
2. An oral scanning system, comprising:
the projection module projects first color light;
the image sensing unit senses the first color light irradiating the target object to obtain a first image when the oral scanning system is at a first position, and senses the first color light irradiating the target object to obtain a second image when the oral scanning system is at a second position; and
the processing unit is coupled to the projection module and the image sensing unit respectively, synthesizes the first image to form a first graph, and synthesizes the second image to form a second graph;
when the first color intensity value is different from the second color intensity value, the processing unit adjusts the color intensity value of the first image or the color intensity value of the second image through a first adjusting value so that a third color intensity value of the first image after adjustment at the first junction is the same as a fourth color intensity value of the second image after adjustment at the first junction;
the image sensing unit senses the first color light irradiating the target object when the oral scanning system is at a third position to obtain a fifth image, and the processing unit synthesizes a fifth graph according to the fifth image, the fifth graph being adjacent to the second graph; taking the color intensity value of the first graph as a reference, the processing unit adjusts the color intensity value of the second graph through the first adjustment value and adjusts the color intensity value of the fifth graph through a second adjustment value; when the color intensity value of the fifth graph exceeds the limit value range, the processing unit adjusts the color intensity value of the first graph through a third adjustment value, adjusts the color intensity value of the second graph already adjusted through the first adjustment value, and adjusts the color intensity value of the fifth graph already adjusted through the second adjustment value, so that the color intensity value of the first graph adjusted through the third adjustment value, the color intensity value of the second graph adjusted through the first adjustment value and then the third adjustment value, and the color intensity value of the fifth graph adjusted through the second adjustment value and then the third adjustment value are all within the limit value range.
3. The oral scanning system as claimed in claim 1 or 2, wherein the projection module further projects a second color light and a third color light, the first color light is blue light, the second color light is green light, the third color light is red light, and the first image and the second image are color images.
4. The oral scanning system of claim 1 or 2, wherein the processing unit derives the first adjustment value by a difference between the first color intensity value and the second color intensity value.
5. An oral scan image processing method for an oral scanning system, characterized by comprising the following steps:
projecting a first color light;
sensing the first color light irradiating the target object when the oral scanning system is at a first position to acquire a first image;
sensing the first color light irradiating the target object when the oral scanning system is at a second position to acquire a second image;
synthesizing a first graph according to the first image;
synthesizing a second graph according to the second image;
synthesizing the first graph and the second graph to generate a first junction, wherein the first image has a first color intensity value at the first junction, and the second image has a second color intensity value at the first junction; and
when the first color intensity value is different from the second color intensity value, adjusting the color intensity value of the first image or the color intensity value of the second image by a first adjustment value to make a third color intensity value of the first image after adjustment at the first junction be the same as a fourth color intensity value of the second image after adjustment at the first junction;
wherein, the method also comprises the following steps:
projecting a second color light;
sensing the second color light irradiating the target object when the oral scanning system is at the first position to obtain a third image;
sensing the second color light irradiating the target object when the oral scanning system is at a second position to acquire a fourth image;
adjusting the color intensity value of the third image according to the first adjustment value and synthesizing the adjusted first image and the adjusted third image into a third graph, or adjusting the color intensity value of the fourth image according to the first adjustment value and synthesizing the adjusted second image and the adjusted fourth image into a fourth graph, the third graph being adjacent to the fourth graph; and
when the color intensity value of the third graph or the color intensity value of the fourth graph exceeds the limit value range, the color intensity value of the third graph or the color intensity value of the fourth graph is increased or decreased through a first proportion, so that the color intensity value of the third graph and the color intensity value of the fourth graph are within the limit value range.
6. An oral scan image processing method for an oral scanning system, characterized by comprising the following steps:
projecting a first color light;
sensing the first color light irradiating the target object when the oral scanning system is at a first position to acquire a first image;
sensing the first color light irradiating the target object when the mouth scanning system is at a second position to obtain a second image;
synthesizing a first graph according to the first image;
synthesizing a second graph according to the second image;
synthesizing the first graph and the second graph to generate a first junction, wherein the first image has a first color intensity value at the first junction, and the second image has a second color intensity value at the first junction; and
when the first color intensity value is different from the second color intensity value, adjusting the color intensity value of the first image or the color intensity value of the second image by a first adjustment value to make a third color intensity value of the first image after adjustment at the first junction be the same as a fourth color intensity value of the second image after adjustment at the first junction;
wherein, the method also comprises the following steps:
sensing the first color light irradiating the target object when the oral scanning system is at a third position to acquire a fifth image;
synthesizing a fifth graph according to the fifth image, wherein the fifth graph is adjacent to the second graph;
adjusting the color intensity value of the second graph through the first adjusting value;
adjusting the color intensity value of the fifth graph by a second adjustment value; and
when the color intensity value of the fifth graph exceeds the limit value range, adjusting the color intensity value of the first graph through a third adjustment value, adjusting the color intensity value of the second graph already adjusted through the first adjustment value, and adjusting the color intensity value of the fifth graph already adjusted through the second adjustment value, so that the color intensity value of the first graph adjusted through the third adjustment value, the color intensity value of the second graph adjusted through the first adjustment value and then the third adjustment value, and the color intensity value of the fifth graph adjusted through the second adjustment value and then the third adjustment value are all within the limit value range.
7. The oral scan image processing method according to claim 5 or 6, further comprising the steps of:
and projecting a second color light and a third color light, wherein the first color light is blue light, the second color light is green light, and the third color light is red light, and the first image and the second image formed by synthesis are color images.
8. The method for processing oral scan images according to claim 5 or 6, further comprising the steps of:
and obtaining the first adjustment value according to the difference value between the first color intensity value and the second color intensity value.
CN201911034011.0A 2019-10-29 2019-10-29 Oral scanning system and oral scanning image processing method Active CN110853106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911034011.0A CN110853106B (en) 2019-10-29 2019-10-29 Oral scanning system and oral scanning image processing method


Publications (2)

Publication Number Publication Date
CN110853106A CN110853106A (en) 2020-02-28
CN110853106B true CN110853106B (en) 2022-10-11

Family

ID=69598035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911034011.0A Active CN110853106B (en) 2019-10-29 2019-10-29 Oral scanning system and oral scanning image processing method

Country Status (1)

Country Link
CN (1) CN110853106B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279939A (en) * 2013-04-27 2013-09-04 北京工业大学 Image stitching processing system
TW201545075A (en) * 2014-05-27 2015-12-01 Metal Ind Res & Dev Ct Method for joining tooth images
CN105361968A (en) * 2014-08-29 2016-03-02 财团法人金属工业研究发展中心 Dental model scanning device
CN106162115A (en) * 2015-03-25 2016-11-23 上海分众软件技术有限公司 A kind of image interfusion method based on Play System
CN107424179A (en) * 2017-04-18 2017-12-01 微鲸科技有限公司 A kind of image equalization method and device
CN108734651A (en) * 2017-04-19 2018-11-02 睿致科技股份有限公司 Image splicing method and image splicing device thereof
CN109509146A (en) * 2017-09-15 2019-03-22 腾讯科技(深圳)有限公司 Image split-joint method and device, storage medium
CN109793482A (en) * 2019-01-07 2019-05-24 苏州佳世达光电有限公司 Oral cavity scanning means and its control method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10098713B2 (en) * 2013-03-14 2018-10-16 Ormco Corporation Scanning sequence for an intra-oral imaging system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Using Intraoral Scanning to Capture Complete Denture Impressions, Tooth Positions, and Centric Relation Records; Goodacre Brian J et al.; International Journal of Prosthodontics; August 2018; vol. 31, no. 4; pp. 377-381 *
Application status and research progress of intraoral digital impression technology in dental implantology; Huang Ruoxuan et al.; Stomatology (口腔医学); June 2019; vol. 39, no. 6; pp. 539-543 *

Also Published As

Publication number Publication date
CN110853106A (en) 2020-02-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant