CN113781313A - Image data processing method, electronic device, and computer-readable storage medium
- Publication number: CN113781313A (application CN202110908934.5A)
- Authority: CN (China)
- Prior art keywords: data, image data, frame, image, time period
- Legal status: Pending (the status is an assumption and is not a legal conclusion)
Classifications
- G06T1/0007 — General purpose image data processing: image acquisition (G — Physics; G06 — Computing; G06T — Image data processing or generation, in general)
- G06F3/1407 — Digital output to display device; general aspects irrespective of display type (G06F — Electric digital data processing)
- G06T7/70 — Image analysis: determining position or orientation of objects or cameras
- G06T2207/10004 — Image acquisition modality: still image; photographic image (indexing scheme for image analysis or image enhancement)
Abstract
The invention provides an image data processing method, an electronic device, and a computer-readable storage medium. The image data processing method includes: acquiring an i-th frame of image data captured by an image acquisition module of an image acquisition device in an i-th time period and an (i+1)-th frame of image data captured in an (i+1)-th time period, where i is a natural number; acquiring rotation angle change data and azimuth angle change data of the image acquisition device, sensed by a sensing module of the image acquisition device, for the (i+1)-th time period relative to the i-th time period; analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data; and using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
Description
Technical Field
The present invention relates to the field of computer image processing technologies, and in particular, to an image data processing method, an electronic device, and a computer-readable storage medium.
Background
With continued social and economic development and rising living standards, computer technology has become widespread in production and daily life, and computer image processing in particular has become one of the most important technologies in the field of computer applications.
When a computer performs image processing, an image sensor is usually used to acquire the input image before processing. A typical image sensor, such as a CCD or CMOS sensor, converts the optical projection it receives into a corresponding electrical signal for output. When the electrical signal is restored to an image, the orientation of the original input is retained; that is, if the camera is tilted when the image is captured, the same tilt appears in the output image. In some usage scenarios, such as workstation operation, equipment maintenance, operations on the human body, personal care, physical examination, and physics and chemistry teaching, it is desirable that the orientation of the output image remain as stable as possible so that the operator can easily determine the spatial orientation of the object in the video and observe and operate accurately, quickly, and continuously.
Disclosure of Invention
In view of this, an embodiment of the present invention provides an image data processing method, an electronic device, and a computer-readable storage medium. The image data processing method includes the steps of:
acquiring an i-th frame of image data captured by an image acquisition module of an image acquisition device in an i-th time period and an (i+1)-th frame of image data captured in an (i+1)-th time period, where i is a natural number;
acquiring rotation angle change data and azimuth angle change data of the image acquisition device, sensed by a sensing module of the image acquisition device, for the (i+1)-th time period relative to the i-th time period;
analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data;
and using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
In the image data processing method provided by this embodiment of the invention, the i-th frame of image data captured by the image acquisition module in the i-th time period and the (i+1)-th frame of image data captured in the (i+1)-th time period are acquired, together with the rotation angle change data and azimuth angle change data of the image acquisition device, sensed by the sensing module, for the (i+1)-th time period relative to the i-th time period; the i-th frame of image data and the (i+1)-th frame of image data are analyzed to obtain rotation angle calculation data of the (i+1)-th frame relative to the i-th frame; and at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data is used as reference data for adjusting the display direction of the (i+1)-th frame of image data. The output images therefore always keep a consistent orientation, the inconvenience caused to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves, or otherwise manipulates the image acquisition device, the output images retain essentially the same viewing angle, which helps ensure the accuracy, speed, and precision of operations such as workstation operation, equipment maintenance, operations on the human body, personal care, physical examination, and physics and chemistry teaching.
In addition, an embodiment of the present invention further provides an image data processing method, which includes the following steps:
acquiring an i-th frame of image data captured by an image acquisition module in an i-th time period and an (i+1)-th frame of image data captured in an (i+1)-th time period;
determining whether the included angle between a preset axis of a sensing module and the gravity direction in the (i+1)-th time period is 0 degrees;
if so, analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data, and using the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data;
if not, acquiring one of rotation angle change data and azimuth angle change data of the image acquisition device, sensed by the sensing module, for the (i+1)-th time period relative to the i-th time period, and using that one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data.
In the image data processing method provided in this embodiment of the invention, it is determined whether the included angle between the preset axis of the sensing module in the (i+1)-th time period and the gravity direction is 0 degrees, so that different conditions can be handled. In particular, when the included angle between the sensing module and the gravity direction is 0 degrees, so that rotation angle change data cannot be obtained from the sensing signal of the sensing module, the i-th frame of image data and the (i+1)-th frame of image data are analyzed instead to obtain rotation angle calculation data of the (i+1)-th frame relative to the i-th frame, and this rotation angle calculation data is used as the reference data for adjusting the display direction of the (i+1)-th frame of image data. This avoids the problem that the display direction of the (i+1)-th frame cannot be adjusted under this specific condition, allows the screen to be locked accurately when the image acquisition device is oriented along the gravity direction (for example, when the preset axis of the sensing module coincides with the gravity direction), keeps the output image orientation consistent, essentially eliminates image shaking, and makes operation convenient. Further, when the sensing module can accurately measure the rotation angle change data and the azimuth angle change data, one of these is selected as the reference data for adjusting the display direction of the (i+1)-th frame of image data, because in this case the sensed data is more accurate than the rotation angle calculation data. Through this reasonable selection, the best reference data can be chosen in each situation encountered by the image data processing method to adjust the display direction of the (i+1)-th frame of image data, giving a better image display effect.
In addition, to achieve the above object, the present invention also provides an image data processing method including the steps of:
acquiring an i-th frame of image data captured by an image acquisition module in an i-th time period and an (i+1)-th frame of image data captured in an (i+1)-th time period;
analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data;
and using the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
In the image data processing method provided by this embodiment of the invention, the i-th frame of image data captured by an image acquisition module in the i-th time period and the (i+1)-th frame of image data captured in the (i+1)-th time period are acquired; the i-th frame of image data and the (i+1)-th frame of image data are analyzed to obtain rotation angle calculation data of the (i+1)-th frame relative to the i-th frame; and the rotation angle calculation data is used as reference data for adjusting the display direction of the (i+1)-th frame of image data. The output images therefore always keep a consistent orientation, the inconvenience caused to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves, or otherwise manipulates the image acquisition device, the output images retain essentially the same viewing angle, which helps ensure the accuracy, speed, and precision of operations such as workstation operation, equipment maintenance, operations on the human body, personal care, physical examination, and physics and chemistry teaching. It can be understood that using the rotation angle calculation data as the reference data also reduces problems such as an improper image display direction and inconvenient operation caused by failure or inaccuracy of the sensing module.
In addition, to achieve the above object, the present invention also provides an image data processing method including the steps of:
acquiring an i-th frame of image data captured by an image acquisition module in an i-th time period and an (i+1)-th frame of image data captured in an (i+1)-th time period;
acquiring heading angle change data, azimuth angle change data, or rotation angle change data of the image acquisition module, sensed by a sensing module, for the (i+1)-th time period relative to the i-th time period;
analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data;
and using the rotation angle calculation data, the heading angle change data, the azimuth angle change data, or the rotation angle change data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
In the image data processing method provided by this embodiment of the invention, the rotation angle calculation data, the heading angle change data, the azimuth angle change data, or the rotation angle change data is used as reference data for adjusting the display direction of the (i+1)-th frame of image data, so that the output images always keep a consistent orientation, the inconvenience caused to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves, or otherwise manipulates the image acquisition device, the output images retain essentially the same viewing angle, which helps ensure the accuracy, speed, and precision of operations such as workstation operation, equipment maintenance, operations on the human body, personal care, physical examination, and physics and chemistry teaching.
In addition, to achieve the above object, the present invention also provides an image data processing method including the steps of:
acquiring an i-th frame of image data captured by an image acquisition module of an image acquisition device in an i-th time period and an (i+1)-th frame of image data captured in an (i+1)-th time period, where i is a natural number;
acquiring rotation angle change data, azimuth angle change data, and heading angle change data of the image acquisition device, sensed by a sensing module of the image acquisition device, for the (i+1)-th time period relative to the i-th time period;
analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data;
and using one of the rotation angle change data, the azimuth angle change data, the rotation angle calculation data, and the heading angle change data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
In the image data processing method provided by this embodiment of the invention, one of the rotation angle change data, the azimuth angle change data, the rotation angle calculation data, and the heading angle change data is used as reference data for adjusting the display direction of the (i+1)-th frame of image data, so that the output images always keep a consistent orientation, the inconvenience caused to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves, or otherwise manipulates the image acquisition device, the output images retain essentially the same viewing angle, which helps ensure the accuracy, speed, and precision of operations such as workstation operation, equipment maintenance, operations on the human body, personal care, physical examination, and physics and chemistry teaching.
An embodiment of the present invention further provides an electronic device including a memory and a processor, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform any one of the image data processing methods described above.
Embodiments of the present invention also provide a computer-readable storage medium storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform any one of the image data processing methods described above.
In the electronic device and the computer-readable storage medium provided by the embodiments of the invention, the display direction of the (i+1)-th frame of image data is adjusted according to at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data, so that the output images always keep a consistent orientation and the purpose of locking the screen is achieved: when the operator rotates, moves, or otherwise manipulates the image acquisition device, the images it outputs retain the same viewing angle, which facilitates the operator's work and helps ensure the accuracy, speed, and precision of operations such as workstation operation, equipment maintenance, operations on the human body, personal care, and physics and chemistry teaching.
Drawings
FIG. 1 is a schematic block diagram of an image acquisition device according to the present invention;
FIG. 2 is a flowchart of an image data processing method according to a first embodiment of the present invention;
FIG. 3 is a schematic diagram of an imaging display process of the image data processing method of the present invention;
FIG. 4 is a flowchart of an image data processing method according to a second embodiment of the present invention;
FIG. 5 is a flowchart of an image data processing method according to a third embodiment of the present invention;
FIG. 6 is a flowchart of an image data processing method according to a fourth embodiment of the present invention;
FIG. 7 is a flowchart of an image data processing method according to a fifth embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention;
FIG. 10 shows several experimental verification results of the step of acquiring the rotation angle calculation data in the image processing method according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and "third" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first", "second", or "third" may explicitly or implicitly include at least one such feature. In the description of this application, "plurality" means at least two, e.g., two, three, etc., unless explicitly and specifically limited otherwise. All directional indications (such as up, down, left, right, front, and rear) in the embodiments of this application are only used to explain the relative positional relationship, movement, and the like between components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indication changes accordingly. Furthermore, the terms "include" and "have", as well as any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to the steps or elements listed, but may alternatively include other steps or elements not listed or inherent to such a process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to FIG. 1, FIG. 1 is a schematic block diagram of an image acquisition device 1 according to the present invention. The image acquisition device 1 may be a hand-held image acquisition device, including but not limited to a hand-held visual otoscope, a hand-held visual ear pick, a hand-held visual mouth mirror, a hand-held visual dental scaler, a hand-held visual skin instrument, or a hand-held visual hair instrument. The hand-held image acquisition device includes an image acquisition module 111 and a sensing module 112. The image acquisition module 111 may be a camera module provided on the hand-held image acquisition device, such as a camera probe and an image sensor connected to the camera probe, and the sensing module 112 may be a sensing module with a motion-measurement function, such as an angle sensor (e.g., a gyroscope), an acceleration sensor (e.g., an accelerometer), or a geomagnetic sensor (e.g., a magnetometer). In this embodiment, the sensing module 112 may be mounted on the image acquisition module 111 or adjacent to it, and the rotation angle change data and azimuth angle change data it detects therefore describe not only the motion of the sensing module 112 itself but also that of the image acquisition module 111.
In some embodiments, the hand-held image acquisition device transmits the image data captured by the image acquisition module 111 and the sensed data obtained by the sensing module 112 (e.g., the rotation angle change data and azimuth angle change data) over a wired or wireless connection, such as a network cable, Bluetooth, or WiFi, to a terminal device 2 with computing capability, such as a server, a server cluster, a mobile phone, a tablet computer, a notebook computer, a desktop computer, or a personal digital assistant. The terminal device 2 processes the image data and the sensed data and then displays an image for the operator.
In this embodiment, the terminal device 2 may be a mobile phone, tablet computer, notebook computer, desktop computer, personal digital assistant, or any other device having a processor 12 with computing capability and a display 13 with image display capability, and the terminal device 2 may run an image-processing application (APP). After the hand-held image acquisition device is connected to the terminal device 2 via WiFi or the like, each frame of image data captured by the image acquisition module 111 and the rotation angle change data and azimuth angle change data detected by the sensing module 112 are transmitted to the terminal device 2. Once the application on the terminal device 2 obtains the captured image data, the rotation angle change data, and the azimuth angle change data, the processor 12 processes the image data according to the rotation angle change data, the azimuth angle change data, and so on, and then controls the display 13 to display the image. It can be understood that during operation the operator can observe, on the display 13, the image of the operated object captured by the hand-held image acquisition device, and can change the operating direction by moving the hand-held image acquisition device.
In other embodiments, the image acquisition device and the terminal device 2 may be integrated; that is, the image acquisition device 1 itself is provided with a processor 12 with computing capability, the image data captured by the image acquisition module 111 and the data from the sensing module 112 are processed directly in the processor 12 and output to the display 13, and the display 13, which actually presents the operating image to the operator, may be integrated with the image acquisition device or may be a separate display connected to it over a wired or wireless connection such as a network cable, Bluetooth, or WiFi.
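For concreteness, the sketch below shows one way the per-frame payload exchanged between the hand-held device and the terminal device 2 could be represented; the class name and fields are illustrative assumptions, not structures defined by the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class FramePacket:
    """Hypothetical per-frame payload sent from the hand-held image
    acquisition device to the terminal device (illustrative only)."""
    frame_index: int        # i, i+1, ...
    image: np.ndarray       # H x W x 3 frame from the image sensor
    gyro_rates: tuple       # (wx, wy, wz) angular rates from the gyroscope, rad/s
    accel: tuple            # (ax, ay, az) accelerometer reading, m/s^2
    dt: float               # duration of the acquisition time period, s
```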
The image data processing method of the image acquisition device 1 is described below using a hand-held visual ear pick as an example. The operator places the hand-held visual ear pick into the ear canal of the person being treated, the image acquisition module 111 captures images inside the ear canal, the captured images together with the data sensed by the sensing module 112 are transmitted to the terminal device 2 connected to the ear pick, and the processor 12 of the terminal device 2 processes the images and displays them on the display 13.
Referring to FIG. 2, FIG. 2 is a flowchart of an image data processing method according to a first embodiment of the invention. It should be noted that the method of the invention is not limited to the flow shown in FIG. 2 as long as substantially the same result is obtained. As described above, in some embodiments the image acquisition device 1 and the terminal device 2 are provided separately, and the image data processing method may be executed in an application on the terminal device 2; that is, the terminal device 2 receives each frame of image data, the rotation angle change data, and the azimuth angle change data provided by the image acquisition device 1 and then executes the image data processing method. In other modified embodiments, when the image acquisition device 1 integrates a processor 12 with computing capability, the image acquisition device 1 may execute the image data processing method itself and transmit the resulting reference data and each frame of image data (or each frame of adjusted image data with the display direction already adjusted) to the terminal device 2, so that the display 13 of the terminal device 2 can display an image with an appropriate orientation. Specifically, as shown in FIG. 2, the method includes the following steps:
step S101: acquiring the ith frame of image data acquired by the image acquisition module 111 of the image acquisition device 1 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period, wherein i is a natural number.
It should be noted that after the image acquisition device 1, such as a hand-held visual ear pick, is placed in the ear canal, the image acquisition module 111 captures images inside the ear canal in the i-th time period and the (i+1)-th time period at preset time intervals. It can be understood that, depending on the required calculation accuracy and efficiency, the i-th frame of image data acquired in the i-th time period and the (i+1)-th frame of image data acquired in the (i+1)-th time period may be two frames captured in adjacent time periods or two frames separated by several time periods.
In this embodiment, the image acquisition module 111 may be a camera module provided on the image acquisition device 1, such as a camera probe and an image sensor, used to capture images inside the ear canal. After the camera probe forms the image, the image sensor uses the photoelectric conversion function of its photoelectric devices to convert the optical image on the photosensitive surface into an electrical signal proportional to it; common image sensors include CCD and CMOS sensors.
Step S102: acquiring rotation angle change data and azimuth angle change data of the image acquisition device 1, sensed by the sensing module 112 of the image acquisition device 1, for the (i+1)-th time period relative to the i-th time period.
It can be understood that when the operator uses the hand-held visual ear pick, the motion can be decomposed into a rotation angle change and an azimuth angle change. The rotation angle (also called the angle of rotation) is the angle between the line connecting a point to the rotation center before the rotation and the line connecting the corresponding point to the rotation center after the rotation. The azimuth angle (also called the horizontal bearing) is the horizontal angle, measured clockwise, from the north direction line of a point to the target direction line.
In this embodiment, the sensing module 112 includes a three-axis gyroscope for acquiring the rotation angle change data and a three-axis accelerometer for acquiring the azimuth angle change data. The three-axis gyroscope (Gyro), also called a micromechanical gyroscope, can determine position in six directions simultaneously as well as the motion trajectory and acceleration in those directions. It has the advantages of small size, light weight, simple structure, and good reliability; in short, its main function is to measure angular velocity in order to determine the motion state of an object. The three-axis accelerometer (Acc) is used to detect the acceleration signal. Through the three-axis gyroscope and the three-axis accelerometer, the rotation angle change data and azimuth angle change data of the image acquisition device 1 for the (i+1)-th time period relative to the i-th time period, as sensed by the sensing module 112, can be acquired.
To obtain the rotation angle change data, the three-axis gyroscope data dt, wx, wy, and wz are acquired for the i-th frame of image data captured in the i-th time period and the (i+1)-th frame of image data captured in the (i+1)-th time period, where dt is the system time and wx, wy, and wz are the angular velocities of the three-axis gyroscope about the x, y, and z axes respectively; a quaternion angle integral is computed from dt, wx, wy, and wz, the integration result is converted into Euler angles and accumulated, and the accumulated result is converted back into a quaternion to finally obtain the rotation angle change data. To obtain the azimuth angle change data, the projection vectors of the three-axis accelerometer's acceleration vector onto the yz, zx, and xy planes are acquired for the i-th frame captured in the i-th time period and the (i+1)-th frame captured in the (i+1)-th time period; the tilt angle about the X axis is computed from the tangent of the X component and the yz projection vector, the tilt angle about the Y axis from the tangent of the Y component and the zx projection vector, and the tilt angle about the Z axis from the tangent of the Z component and the xy projection vector, and the azimuth angle change data is then obtained by converting the X-, Y-, and Z-axis tilt angles. Since the acquisition and calculation of rotation angle change data and azimuth angle change data are well known, they are not described in further detail here.
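As a rough illustration of the two computations just described, the Python sketch below integrates gyroscope angular rates into an orientation quaternion and derives per-axis tilt angles from a single accelerometer sample. It is a simplified stand-in, under common conventions, for the quaternion/Euler accumulation and projection-vector calculations referred to above; the gyroscope and accelerometer samples in the example are fabricated.

```python
import numpy as np

def integrate_gyro(q, w, dt):
    """One first-order integration step of body angular rates w = (wx, wy, wz)
    [rad/s] into the orientation quaternion q = (qw, qx, qy, qz) over dt seconds."""
    wx, wy, wz = w
    omega = np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0],
    ])
    q = q + 0.5 * dt * (omega @ q)   # quaternion kinematics, first-order step
    return q / np.linalg.norm(q)     # renormalize to unit length

def quaternion_to_roll_deg(q):
    """Roll (rotation about the x axis) recovered from the quaternion, in degrees."""
    qw, qx, qy, qz = q
    return np.degrees(np.arctan2(2.0 * (qw * qx + qy * qz),
                                 1.0 - 2.0 * (qx * qx + qy * qy)))

def accel_tilt_angles_deg(ax, ay, az):
    """Tilt of each axis relative to the plane of the other two, from a single
    accelerometer sample (valid when the sample measures gravity only)."""
    tilt_x = np.degrees(np.arctan2(ax, np.hypot(ay, az)))
    tilt_y = np.degrees(np.arctan2(ay, np.hypot(az, ax)))
    tilt_z = np.degrees(np.arctan2(az, np.hypot(ax, ay)))
    return tilt_x, tilt_y, tilt_z

# Fabricated example: rotation accumulated between frame i and frame i+1
q = np.array([1.0, 0.0, 0.0, 0.0])                   # identity orientation
for wx, wy, wz, dt in [(0.5, 0.0, 0.0, 0.01)] * 30:  # fake gyro samples
    q = integrate_gyro(q, (wx, wy, wz), dt)
print(quaternion_to_roll_deg(q), accel_tilt_angles_deg(0.1, 0.0, 9.7))
```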
Step S103: analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data.
It should be noted that when the operator uses the hand-held visual ear pick, the motion generally includes, in addition to the rotation angle change and azimuth angle change, linear motion along a preset direction such as the gravity direction. To make the screen-locking effect of the output image more accurate, the change caused by this linear motion along the preset direction also has to be obtained, namely the rotation angle calculation data. It can be understood that in this embodiment the rotation angle calculation data can be obtained from the i-th frame of image data and the (i+1)-th frame of image data produced by the image sensor.
Step S104: using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
In step S104, at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data obtained through the three-axis gyroscope, the three-axis accelerometer, and the image sensor is used as reference data for adjusting the display direction of the (i+1)-th frame of image data, and the display direction of the (i+1)-th frame is adjusted to obtain the (i+1)-th frame of adjusted image data, so that the operator observes the screen-locked image at the same angle and orientation on the display 13, which facilitates operation. The display direction of the (i+1)-th frame of image data is adjusted according to at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data to obtain the (i+1)-th frame of adjusted image data, and the display 13 is then controlled to display directly according to the adjusted image data, so that the orientation of the output images stays essentially consistent and the operator's work is made easier. In the image data processing method provided by this embodiment of the invention, the i-th frame of image data captured by the image acquisition module 111 in the i-th time period and the (i+1)-th frame of image data captured in the (i+1)-th time period are acquired, together with the rotation angle change data and azimuth angle change data of the image acquisition device 1, sensed by the sensing module 112, for the (i+1)-th time period relative to the i-th time period; the i-th frame of image data and the (i+1)-th frame of image data are analyzed to obtain rotation angle calculation data of the (i+1)-th frame relative to the i-th frame; and at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data is used as reference data for adjusting the display direction of the (i+1)-th frame of image data. The output images therefore always keep a consistent orientation, the inconvenience caused to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves, or otherwise manipulates the image acquisition device, the output images retain essentially the same viewing angle, which helps ensure the accuracy, speed, and precision of operations such as workstation operation, equipment maintenance, operations on the human body, personal care, physical examination, and physics and chemistry teaching.
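The excerpt does not spell out how the reference data is applied to the (i+1)-th frame. One straightforward possibility, sketched below with OpenCV, is to rotate the frame about its center by the reference angle before it is shown on the display 13; the function name and the assumption that the reference data reduces to a single angle in degrees are ours, not the patent's.

```python
import cv2

def adjust_display_direction(frame_i1, reference_angle_deg):
    """Rotate the (i+1)-th frame about its center by the reference angle so the
    displayed image keeps a constant orientation ("screen lock"). Sketch only."""
    h, w = frame_i1.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), reference_angle_deg, 1.0)
    return cv2.warpAffine(frame_i1, m, (w, h))
```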
Specifically, step S104 includes the following:
step S41: and judging whether an included angle between the preset axis of the sensing module 112 in the (i + 1) th time period and the gravity direction is 0 degree according to the azimuth angle change data, if so, executing step S42, and if not, executing step S43.
Referring to fig. 3, fig. 3 is a schematic diagram of an imaging display process of the image data processing method of the present invention, wherein an azimuth angle of the image capturing device 1 includes an included angle θ between a preset axis of the sensing module 112 and a horizontal directionAccWherein, thetaAccWithin the range of 0 to 90 degrees, it can be understood that the initial direction of the sensing module 112 can be set to be the direction coinciding with the XY coordinate axes, as shown in fig. 3(a), the axis of the sensing module 112 in the direction of the X axis is set to be a preset axis, and when the image capturing device 1 moves, the image capturing module 111 and the sensing module 112 are driven to moveThe angle between the preset axis of the sensing module 112 and the horizontal X-axis is θAccAs shown in FIG. 3(b), then θAccIn the step of determining whether the included angle between the preset axis of the sensing module 112 and the gravity direction in the (i + 1) th time period is 0 degree according to the azimuth angle change data within the range of 0 to 90 degrees, when the included angle θ is larger than the preset axisAccWhen the angle is 90 degrees, it is determined that the included angle between the preset axis of the sensing module 112 and the gravity direction is 0 degree. It can be understood that the preset axis is a virtual axis, and may be an axis passing through a certain preset direction of the center of the sensing module 112 under a reference coordinate system (for example, a reference coordinate system constructed by X, Y, Z axes, where a specific X, Y, Z axis may be defined by itself according to actual needs, or an X axis and a Y axis are horizontal axes, and a Z axis is a vertical axis in which the gravity direction is located).
Step S42: using the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
When the included angle between the preset axis of the sensing module 112 and the gravity direction in the (i + 1) th time period is judged to be 0 degree according to the azimuth angle change data obtained by the three-axis accelerometer, the operation track of the operator at the moment can be considered as linear motion along the gravity direction, and at this moment, the display direction of the (i + 1) th frame of image data needs to be adjusted according to the rotation angle calculation data.
Step S43: using one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data.
When it is determined that the included angle between the preset axis of the sensing module 112 and the gravity direction in the (i + 1) th time period is not 0 degree according to the azimuth change data obtained by the three-axis accelerometer, the operation trajectory of the operator at the moment may be considered as a rotational motion trajectory or a motion trajectory of a rotational motion combined with a linear motion along the gravity direction, and at this time, the display direction of the image data needs to be adjusted according to one of the rotation angle change data and the azimuth change data.
In this embodiment, by determining, according to the azimuth angle change data, whether the included angle between the preset axis of the sensing module 112 and the gravity direction in the (i+1)-th time period is 0 degrees, different conditions can be handled. In particular, when the included angle between the sensing module 112 and the gravity direction is 0 degrees, so that rotation angle change data cannot be obtained from the sensing signal of the sensing module 112, the i-th frame of image data and the (i+1)-th frame of image data are analyzed instead to obtain rotation angle calculation data of the (i+1)-th frame relative to the i-th frame, and this rotation angle calculation data is used as the reference data for adjusting the display direction of the (i+1)-th frame of image data. This avoids the problem that the display direction of the (i+1)-th frame cannot be adjusted under this specific condition, allows the screen to be locked accurately even when the image acquisition device 1 is oriented along the gravity direction (for example, when the preset axis of the sensing module coincides with the gravity direction), keeps the output image orientation consistent, essentially eliminates image shaking, and makes operation convenient.
Further, step S43 includes the steps of:
step S431: and judging whether the moving path of the image acquisition module 111 in the (i + 1) th time period is a first path or a second path, executing a step S432 if the moving path of the image acquisition module 111 in the (i + 1) th time period is the first path, and executing a step S433 if the moving path of the image acquisition module 111 in the (i + 1) th time period is the second path.
The first path is a path from the horizontal direction to the vertical direction, that is, the operator moves the hand-held visual ear picking rod from the horizontal placement position to the vertical placement position in the ear picking operation, and the second path is a path from the vertical direction to the horizontal direction, that is, the operator moves the hand-held visual ear picking rod from the vertical placement position to the horizontal placement position in the ear picking operation.
Step S432: determine | θAcc-whether 90 degrees is less than or equal to a preset angle θ2If yes, go to step S432a, otherwise go to step S432 b.
Wherein, theta2For vertical conversion of angle, theta2In the range of 10 to 25 degrees (I.e., a range of 10 degrees or more and 25 degrees or less).
Step S432 a: and taking the azimuth angle change data as reference data for adjusting the display direction of the image data.
In step S432a, when | θ |, theAcc-90 degrees | less than or equal to a preset angle θ2And adjusting the display direction of the image data according to the azimuth angle change data.
S432 b: and taking the rotation angle change data as reference data for adjusting the display direction of the image data.
In step S432b, when | θ |, theAcc-90 degrees | greater than a preset angle θ2And adjusting the display direction of the image data according to the corner change data.
Step S433: determine | θAcc-whether 90 degrees is less than or equal to a preset angle θ1If yes, go to step S433a, otherwise go to step S433 b.
Wherein, theta1To convert the angle horizontally, theta1In the range of 10 degrees to 25 degrees.
Step S433 a: and taking the rotation angle change data as reference data for adjusting the display direction of the image data.
In step S433a, when | θ |Acc-90 degrees | less than or equal to a preset angle θ1And adjusting the display direction of the image data according to the corner change data.
Step S433 b: and taking the azimuth angle change data as reference data for adjusting the display direction of the image data.
In step S433b, when | θ |Acc-90 degrees | greater than a preset angle θ1And adjusting the display direction of the image data according to the azimuth angle change data.
In this embodiment, by determining whether the moving path of the image acquisition module 111 in the (i+1)-th time period is the first path or the second path, the output images can be handled appropriately both when the image acquisition device 1 moves from the horizontal direction toward the vertical direction and when it moves from the vertical direction toward the horizontal direction, which improves the screen-locking precision of the output images, keeps the output image orientation consistent, and facilitates operation.
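The decision logic of steps S41 through S433b can be summarized in a short sketch. The θ1 and θ2 defaults below are arbitrary values inside the 10 to 25 degree range stated above, and the string labels and function name are illustrative assumptions, not part of the patent.

```python
def select_reference(theta_acc_deg, path, theta1=15.0, theta2=15.0):
    """Choose which data drives the display-direction adjustment (sketch).

    theta_acc_deg: angle between the sensing module's preset axis and the
                   horizontal direction, in the range 0..90 degrees.
    path:          "horizontal_to_vertical" (first path) or
                   "vertical_to_horizontal" (second path).
    theta1/theta2: horizontal / vertical transition angles (10..25 degrees).
    """
    if theta_acc_deg == 90.0:                  # preset axis along gravity (S41 -> S42)
        return "rotation_angle_calculation_data"
    deviation = abs(theta_acc_deg - 90.0)
    if path == "horizontal_to_vertical":       # first path (S432)
        return ("azimuth_angle_change_data" if deviation <= theta2
                else "rotation_angle_change_data")
    # second path (S433)
    return ("rotation_angle_change_data" if deviation <= theta1
            else "azimuth_angle_change_data")

# Example: select_reference(80.0, "vertical_to_horizontal")
# -> "rotation_angle_change_data"  (|80 - 90| = 10 <= theta1)
```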
In order to make the calculation more efficient and the calculation result more accurate, in other embodiments, the step S104 may further include the following steps:
step S41': and judging whether the included angle between the preset axis of the sensing module and the gravity direction in the (i + 1) th time period is 0 degree or in the range from 0 to a preset angle according to the azimuth angle change data, if so, executing a step S42 ', and if not, executing a step S43'.
In an embodiment, in step S41', the step of determining whether the included angle between the preset axis of the sensing module and the gravity direction in the (i + 1) th time period is 0 degree according to the azimuth angle variation data may be the same as the step of determining in step S41, and is not repeated here, for example, when the included angle θ isAccWhen the angle is 90 degrees, it is determined that the included angle between the preset axis of the sensing module 112 and the gravity direction is 0 degree.
In another embodiment, in step S41 ', in the step of determining whether an included angle between a preset axis of the sensing module and a gravity direction in the i +1 th time period is in a range from 0 to a preset angle according to the azimuth change data, the preset angle may be less than or equal to 25 degrees, and if the preset angle is 10 degrees, that is, determining whether an included angle between the preset axis of the sensing module and the gravity direction in the i +1 th time period is in a range from 0 to 10 degrees according to the azimuth change data, if so, performing step S42 ', and if not, performing step S43 '. Specifically, when the angle θ isAccWhen the difference between the angle greater than or equal to 90 degrees and the preset angle is larger than or equal to the difference between the angle and the preset angle, it is determined that the included angle between the preset axis of the sensing module 112 and the gravity direction is in the range from 0 to the preset angle. E.g. when said angle thetaAccIs 88 degrees, the difference value between 90 degrees and the preset angle is 80 degrees, and the included angle thetaAccIf the angle of 88 degrees is greater than 80 degrees, it is determined that the included angle between the preset axis of the sensing module 112 and the gravity direction is in the range from 0 to the preset angle.
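Under these assumptions, the range check of step S41' reduces to a one-line comparison, as in the sketch below; the 10-degree default is simply the example value used above.

```python
def within_gravity_cone(theta_acc_deg, preset_deg=10.0):
    """Step S41' variant: True when the preset axis is within `preset_deg`
    of the gravity direction, i.e. theta_acc >= 90 - preset_deg."""
    return theta_acc_deg >= 90.0 - preset_deg

# within_gravity_cone(88.0) -> True  (88 >= 80, i.e. within 0..10 degrees of gravity)
# within_gravity_cone(70.0) -> False
```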
Step S42': using the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
Step S43': using one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data.
In this embodiment, steps S42 'and S43' are similar to steps S42 and S43 of the above embodiment, and are not repeated here.
In the above embodiment, it can be determined whether the included angle between the preset axis of the sensing module and the gravity direction in the (i+1)-th time period is 0 degrees or lies in the range from 0 to a preset angle, and accordingly one of the rotation angle calculation data, the rotation angle change data, and the azimuth angle change data is used as the reference data for adjusting the display direction of the image data, which makes the calculation more efficient and the image display more accurate.
Further, step S103 includes the steps of:
step S31: and acquiring a first characteristic region in the ith frame of image data and a second characteristic region corresponding to the first characteristic region in the (i + 1) th frame of image data, and calculating the rotation angle of the second characteristic region compared with the first characteristic region to acquire the rotation angle calculation data.
It should be noted that the feature region may also be referred to as a region of interest, and in the field of image processing, the region of interest (ROI) is an image region selected from an image, and this region is a key point of interest for image analysis, and usually the feature region has more or more typical feature points in image analysis, for example, in the ear picking process, a region with a protruding ear canal may be used as the feature region. The characteristic region is defined, so that image processing can be conveniently carried out, the image processing time is reduced, and the precision is increased. In this embodiment, the first characteristic region and the second characteristic region corresponding to the same position in the two frames of image data may be obtained through image recognition to calculate and obtain the corner calculation data.
In step S31, the i frame image data and the i +1 frame image data may be first converted into gray scale maps, and the gray scale maps are subjected to median filtering and binarization processing, respectively, to obtain i frame corrected image data and i +1 frame corrected image data, then the first feature region is obtained in the i frame corrected image data and the second feature region is obtained in the i +1 frame corrected image data, respectively, by a minimum circumscribed rectangle search method, and finally, the corrected coordinate data of the second feature region is obtained by performing affine transformation on the coordinate data of the second feature region, and the corner calculation data is obtained according to the coordinate data of the first feature region and the corrected coordinate data of the second feature region.
In this embodiment, the first feature region is obtained in the i-th frame of corrected image data and the second feature region is obtained in the i + 1-th frame of corrected image data by a minimum circumscribed rectangle search method, affine transformation is performed on the coordinate data of the second feature region to obtain corrected coordinate data of the second feature region, and the corner calculation data is obtained according to the coordinate data of the first feature region and the corrected coordinate data of the second feature region.
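The OpenCV sketch below follows the spirit of step S31 — grayscale conversion, median filtering, binarization, and a minimum circumscribed (minimum-area) rectangle around a feature region — to estimate how far frame i+1 is rotated relative to frame i. It assumes OpenCV 4 and BGR input frames, uses Otsu thresholding and the largest contour as a stand-in for the region-of-interest selection, and does not reproduce the affine-transformation correction described above, so it is only a rough approximation of the patented step.

```python
import cv2

def estimate_rotation_between_frames(frame_i, frame_i1):
    """Estimate the rotation of frame i+1 relative to frame i from the
    orientation of a feature region's minimum-area bounding rectangle."""
    def feature_angle(frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)           # grayscale map
        gray = cv2.medianBlur(gray, 5)                           # median filtering
        _, binary = cv2.threshold(gray, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # binarization
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        largest = max(contours, key=cv2.contourArea)             # candidate feature region
        (_, _), (_, _), angle = cv2.minAreaRect(largest)         # rectangle orientation
        return angle
    # Note: minAreaRect angles are only defined modulo 90 degrees, so a real
    # implementation needs the correspondence/affine step described above.
    return feature_angle(frame_i1) - feature_angle(frame_i)
```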
Referring to fig. 4, fig. 4 is a flowchart of an image data processing method according to a second embodiment of the invention. In a second embodiment of the present invention, the present invention further provides an image data processing method of the image capturing apparatus 1, which includes the steps of:
step S201: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period.
Step S201 is similar to step S101 of the above embodiment, and is not described again here.
Step S202: determining whether the included angle between the preset axis of the sensing module 112 in the (i+1)-th time period and the gravity direction is 0 degrees; if so, step S203 is executed, and if not, step S204 is executed.
In this embodiment, to increase the operation speed and reduce the calculation process, it may be determined whether the operation trajectory of the operator at the moment is a linear motion trajectory along the gravity direction.
Step S203: analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data, and using the rotation angle calculation data as reference data for adjusting the display direction of the (i+1)-th frame of image data.
In step S203, when the operation trajectory of the operator at the moment is a linear motion trajectory along the gravity direction, the rotation angle calculation data may be directly used as reference data for adjusting the display direction of the image data of the (i + 1) th frame, and the specific calculation process is similar to that in step S103 of the foregoing embodiment and is not described herein again.
Step S204: acquiring one of rotation angle change data and azimuth angle change data of the image acquisition device 1 in the i +1 th time period, which is sensed by the sensing module 112, compared with the i-th time period, and using one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data.
In step S204, when the operation trajectory of the operator at the moment is not only a linear motion trajectory along the gravity direction, one of the rotation angle change data and the azimuth angle change data of the image acquisition device in the i +1 th time period, which is sensed by the sensing module 112, compared with the i-th time period is used as reference data for adjusting the display direction of the image data.
In the image data processing method provided in this embodiment of the invention, it is determined whether the included angle between the preset axis of the sensing module 112 in the (i+1)-th time period and the gravity direction is 0 degrees, so that different conditions can be handled. In particular, when the included angle between the sensing module 112 and the gravity direction is 0 degrees, so that rotation angle change data cannot be obtained from the sensing signal of the sensing module 112, the i-th frame of image data and the (i+1)-th frame of image data are analyzed to obtain rotation angle calculation data of the (i+1)-th frame relative to the i-th frame, and this rotation angle calculation data is used as the reference data for adjusting the display direction of the (i+1)-th frame of image data. This avoids the problem that the display direction of the (i+1)-th frame cannot be adjusted under this specific condition, allows the screen to be locked accurately when the image acquisition device 1 is oriented along the gravity direction (for example, when the preset axis of the sensing module coincides with the gravity direction), keeps the output image orientation consistent, essentially eliminates image shaking, and makes operation convenient. Further, when the sensing module 112 can accurately measure the rotation angle change data and the azimuth angle change data, one of these is selected as the reference data for adjusting the display direction of the (i+1)-th frame of image data, because in this case the sensed data is more accurate than the rotation angle calculation data. Through this reasonable selection, the best reference data can be chosen in each situation encountered by the image data processing method to adjust the display direction of the (i+1)-th frame of image data, giving a better image display effect.
Referring to fig. 5, fig. 5 is a flowchart of an image data processing method according to a third embodiment of the invention. In a third embodiment of the present invention, the present invention further provides an image data processing method of the image capturing apparatus 1, which includes the steps of:
step S301: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period.
Step S302: analyzing the i-th frame of image data and the (i+1)-th frame of image data to obtain rotation angle calculation data of the (i+1)-th frame of image data relative to the i-th frame of image data;
step S303: and taking the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
Steps S301 and S302 are similar to steps S101 and S103 of the above embodiments, and are not described again here.
In the image data processing method provided by this embodiment of the invention, the i-th frame of image data acquired by the image acquisition module 111 in the i-th time period and the (i + 1)th frame of image data acquired in the (i + 1)th time period are obtained; the i-th frame of image data and the (i + 1)th frame of image data are analyzed to obtain the corner calculation data of the (i + 1)th frame of image data compared with the i-th frame of image data; and the corner calculation data is used as reference data for adjusting the display direction of the (i + 1)th frame of image data. In this way the output angle of the image stays consistent at all times, the inconvenience brought to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves or otherwise manipulates the image acquisition device 1, the output image keeps essentially the same visual angle, which ensures the accuracy, rapidity and precision of operations such as station operation, equipment maintenance, human body operation, personal care, physical examination, and physical and chemical teaching. It can be understood that using the corner calculation data as the reference data also reduces problems such as an improper image display direction and inconvenient operation caused by sensing failure or inaccuracy of the sensing module.
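Since the corner calculation data is a relative rotation between consecutive frames, keeping the output view fixed amounts to accumulating those per-frame values into one correction angle. The small Python sketch below illustrates that bookkeeping under that assumption; the class name and the sign convention are illustrative and not taken from the patent.

```python
class DisplayDirectionLocker:
    """Accumulates per-frame corner calculation data (rotation of frame i+1
    relative to frame i, in degrees) into the total angle by which the current
    frame should be counter-rotated before display."""

    def __init__(self):
        self.correction_deg = 0.0

    def update(self, corner_calc_deg):
        # corner_calc_deg > 0 is assumed to mean the device rotated
        # counter-clockwise between frame i and frame i+1.
        self.correction_deg = (self.correction_deg + corner_calc_deg) % 360.0
        return self.correction_deg
```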
Referring to fig. 6, fig. 6 is a flowchart illustrating an image data processing method according to a fourth embodiment of the invention. In a fourth embodiment of the present invention, the present invention further provides an image data processing method of the image capturing apparatus 1, which includes the steps of:
step S401: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period.
Step S401 is similar to step S101 of the previous embodiment, and is not described again here.
Step S402: and acquiring course angle change data, azimuth angle change data or rotation angle change data of the image acquisition module sensed by the sensing module 112 in the (i + 1) th time period compared with the i-th time period.
It can be understood that when the operator uses the handheld visual ear picking rod, the operation process may include a turning motion centered on the operation point, and this turning motion may be decomposed into a heading angle change motion, a rotation angle change motion, or an azimuth angle change motion. The heading angle, also called the true heading angle, is the algebraic sum of the magnetic heading angle and the magnetic declination; its range of values is (0, 2π), which may also be defined as (−π, π). The rotation angle refers to the angle between the line connecting a point to the rotation center before the figure is rotated and the line connecting the corresponding point to the rotation center after the rotation. The azimuth angle, also called the horizontal longitude, is the horizontal angle measured clockwise from the north-pointing direction line of a point to the target direction line.
In this embodiment, the sensing module 112 may include one of a magnetometer for acquiring the heading angle change data, a three-axis gyroscope for acquiring the corner change data, and a three-axis accelerometer for acquiring the azimuth angle change data. The magnetometer (Magnetic, M-Sensor), also called a geomagnetic sensor or magnetic sensor, can measure magnetic field strength and direction and locate the orientation of the device; its principle is similar to that of a compass, measuring the included angles between the current device and the four directions of east, south, west, and north. It can be understood that in this embodiment a three-axis magnetometer may be used, through which the heading angle change data of the image acquisition device 1 in the (i + 1)th time period compared with the i-th time period, as sensed by the sensing module 112, can be acquired. In other embodiments, the heading angle change data may also be obtained by first acquiring, through the three-axis magnetometer, the heading angle of the image acquisition device 1 in the i-th time period and the heading angle in the (i + 1)th time period sensed by the sensing module 112, and then calculating the difference between the two heading angles. A three-axis gyroscope (Gyro), also called a micromechanical gyroscope, can simultaneously measure position in six directions as well as the trajectories and accelerations of movements in those directions. The three-axis gyroscope has the advantages of small volume, light weight, simple structure, and good reliability; in short, its principal function is to measure angular velocity so as to judge the motion state of an object, and the rotation angle change data of the image acquisition device 1 in the (i + 1)th time period compared with the i-th time period, as sensed by the sensing module 112, can be acquired through the three-axis gyroscope. A three-axis accelerometer (Acc) is used to detect the acceleration signal; the azimuth angle change data of the image acquisition device 1 in the (i + 1)th time period compared with the i-th time period, as sensed by the sensing module 112, can be obtained through the three-axis accelerometer.
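For orientation, the three sensed quantities can be approximated with a few lines of arithmetic. The Python sketch below assumes a right-handed sensor frame whose z axis is the preset (optical) axis, level-held magnetometer components and calibrated readings; these conventions are assumptions rather than details fixed by the patent.

```python
import math

def heading_deg(mx, my):
    """Heading (yaw) from the horizontal magnetometer components, in degrees
    measured clockwise from magnetic north (declination not applied)."""
    return math.degrees(math.atan2(my, mx)) % 360.0

def rotation_change_deg(gyro_z_rad_s, dt_s):
    """Rotation-angle change about the preset (z) axis over one frame period,
    by integrating the gyroscope's angular-rate samples (rad/s)."""
    return math.degrees(sum(w * dt_s for w in gyro_z_rad_s))

def azimuth_deg(ax, ay, az):
    """Angle between the preset (z) axis and the horizontal plane, estimated
    from the gravity vector seen by the three-axis accelerometer; 90 degrees
    means the preset axis is aligned with gravity."""
    return math.degrees(math.atan2(abs(az), math.hypot(ax, ay)))
```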
Step S403: analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data.
Step S403 is similar to step S103 of the previous embodiment, and is not described again here.
Step S404: and taking the rotation angle calculation data, the course angle change data, the azimuth angle change data or the rotation angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
In step S404, when the motion of the image acquisition device 1 is a turning motion, one of the heading angle change data, the rotation angle calculation data, or the azimuth angle change data may be obtained through one of the magnetometer, the three-axis gyroscope, or the three-axis accelerometer in the sensing module 112, and that one is used as reference data for adjusting the display direction of the (i + 1)th frame of image data. When the motion of the image acquisition device 1 is a linear motion along a preset direction, such as the gravity direction, the rotation angle change data may be obtained through the image sensor in the sensing module 112 and used as reference data for adjusting the display direction of the (i + 1)th frame of image data. It can be understood that when the motion of the image acquisition device 1 is simultaneously a turning motion and a linear motion along a preset direction such as the gravity direction, one of the heading angle change data, the rotation angle calculation data, or the azimuth angle change data may be obtained through one of the magnetometer, the three-axis gyroscope, or the three-axis accelerometer in the sensing module 112, and the rotation angle change data may be obtained through the image sensor in the sensing module 112, so that both may be used together as reference data for adjusting the display direction of the (i + 1)th frame of image data. In this way the operator can observe a screen-locked image with a constant angle and direction on the display 13, which is convenient for operation. The display direction of the (i + 1)th frame of image data is adjusted according to at least one of the heading angle change data, the rotation angle change data, the azimuth angle change data, or the rotation angle calculation data to obtain the (i + 1)th frame of adjusted image data, and the display 13 is then controlled to display directly according to the (i + 1)th frame of adjusted image data, so that the directions of the output image data are substantially consistent and the operator's work is facilitated. In the image data processing method provided by this embodiment of the invention, the rotation angle calculation data, the heading angle change data, the azimuth angle change data, or the rotation angle change data is used as reference data for adjusting the display direction of the (i + 1)th frame of image data, so that the output angle of the image stays consistent at all times, the inconvenience brought to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves or otherwise manipulates the image acquisition device 1, the output image keeps essentially the same visual angle, which ensures the accuracy, rapidity and precision of operations such as station operation, equipment maintenance, human body operation, personal care, physical examination, and physical and chemical teaching.
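One hypothetical way to tell the two kinds of motion apart at run time is to threshold the gyroscope and accelerometer magnitudes, as in the Python sketch below. The heuristic and its thresholds are illustrative assumptions; the patent does not prescribe how the motion type is detected.

```python
import math

def classify_motion(gyro_rad_s, accel_m_s2,
                    turn_rate_thresh=0.2, lin_acc_thresh=0.5):
    """gyro_rad_s: (wx, wy, wz) angular rates; accel_m_s2 includes gravity.
    Returns a coarse label used to decide which reference data to trust."""
    turning = math.sqrt(sum(w * w for w in gyro_rad_s)) > turn_rate_thresh
    # Subtract the nominal gravity magnitude to see whether the device is
    # also translating (a very rough proxy for linear motion).
    residual = abs(math.sqrt(sum(a * a for a in accel_m_s2)) - 9.81)
    moving_linearly = residual > lin_acc_thresh
    if turning and moving_linearly:
        return "turning+linear"   # combine inertial and image-derived data
    if turning:
        return "turning"          # heading/rotation/azimuth change data
    if moving_linearly:
        return "linear"           # image-derived data
    return "static"
```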
Referring to fig. 7, fig. 7 is a flowchart of an image data processing method according to a fifth embodiment of the present invention. In a fifth embodiment of the present invention, the present invention further provides an image data processing method of the image capturing apparatus 1, which includes the steps of:
step S501: acquiring the ith frame of image data acquired by the image acquisition module 111 of the image acquisition device 1 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period, wherein i is a natural number.
Step S502: and acquiring the rotation angle change data, the azimuth angle change data and the heading angle change data of the image acquisition device 1 in the (i + 1) th time period compared with the i-th time period, which are sensed by a sensing module 112 of the image acquisition device 1.
Step S503: analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data.
It should be noted that, in this embodiment, the sensing module 112 may include one or more of a magnetometer for acquiring the heading angle change data, a three-axis gyroscope for acquiring the rotation angle change data, and a three-axis accelerometer for acquiring the azimuth angle change data.
Steps S501, S502, and S503 are similar to steps S401, S402, and S403 in the above embodiments, and are not repeated here.
Step S504: and taking the corner change data or one of the azimuth angle change data, the corner calculation data and the heading angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
In step S504, when the motion of the image acquisition device 1 is a turning motion, one or more of the heading angle change data, the rotation angle change data, or the azimuth angle change data may be obtained through one or more of the magnetometer, the three-axis gyroscope, or the three-axis accelerometer in the sensing module 112, and used as reference data for adjusting the display direction of the (i + 1)th frame of image data. In this embodiment, suitable reference data may be selected according to the motion characteristics of the image acquisition device 1: for example, the azimuth angle change data may preferably be used as the calculation basis; when the three-axis accelerometer cannot provide data because of a fault or other reasons, the rotation angle change data can be selected as the calculation basis; and when the accumulated error of the three-axis gyroscope becomes too large, the heading angle change data is used as the calculation basis instead, so that the calculation result is more accurate. It can be understood that this is only one selection manner of this embodiment; in practical application, a suitable calculation basis may be selected, or multiple kinds of data may be combined for calculation, according to the actual application scenario, and details are not repeated here. When the motion of the image acquisition device 1 is a linear motion along a preset direction, such as the gravity direction, the rotation angle change data may be obtained through the image sensor in the sensing module 112 and used as reference data for adjusting the display direction of the (i + 1)th frame of image data. It can be understood that when the motion of the image acquisition device 1 is simultaneously a turning motion and a linear motion along a preset direction such as the gravity direction, one or more of the heading angle change data, the rotation angle calculation data, or the azimuth angle change data may be obtained through one or more of the magnetometer, the three-axis gyroscope, or the three-axis accelerometer in the sensing module 112, and the rotation angle change data may be obtained through the image sensor in the sensing module 112; the display direction of the (i + 1)th frame of image data is therefore adjusted according to the corner change data or one of the azimuth angle change data, the corner calculation data and the heading angle change data. In this way the operator can observe a screen-locked image with a constant angle and direction on the display 13, which is convenient for operation. The display direction of the (i + 1)th frame of image data is adjusted according to at least one of the heading angle change data, the corner change data, the azimuth angle change data, and the corner calculation data to obtain the (i + 1)th frame of adjusted image data, and the display 13 is then controlled to display directly according to the (i + 1)th frame of adjusted image data, so that the directions of the output image data are substantially consistent and the operator's work is facilitated.
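The priority scheme described in this paragraph — prefer the azimuth angle change data, fall back to the rotation angle change data if the accelerometer fails, and switch to the heading angle change data once the gyroscope's accumulated error grows too large — could be expressed roughly as follows. The drift threshold and the argument names are assumptions.

```python
def select_basis(azimuth_change_deg, rotation_change_deg, heading_change_deg,
                 accel_ok=True, gyro_accumulated_error_deg=0.0,
                 drift_limit_deg=5.0):
    """Return the angle (degrees) to use as the calculation basis; the
    drift_limit_deg cut-off is assumed, the patent only says 'too large'."""
    if accel_ok and azimuth_change_deg is not None:
        return azimuth_change_deg          # preferred basis
    if gyro_accumulated_error_deg <= drift_limit_deg and rotation_change_deg is not None:
        return rotation_change_deg         # accelerometer unavailable
    return heading_change_deg              # gyroscope drift too large
```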
In the image data processing method provided by this embodiment of the invention, by using the corner change data or one of the azimuth angle change data, the corner calculation data and the heading angle change data as reference data, the output angle of the image stays consistent at all times, the inconvenience brought to the operator by image rotation or shaking is reduced, and the purpose of locking the screen is achieved: when the operator rotates, moves or otherwise manipulates the image acquisition device 1, the output image keeps essentially the same visual angle, which ensures the accuracy, rapidity and precision of operations such as station operation, equipment maintenance, human body operation, personal care, physical examination, and physical and chemical teaching.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device 30 according to an embodiment of the invention. The electronic device 30 includes a memory 32, a processor 31, and a computer program stored on the memory 32 and executable on the processor 31, and the processor 31 implements the image processing method according to any one of the first to fifth embodiments when executing the computer program.
Specifically, in the first embodiment, the processor 31 implements the following steps when executing the computer program: acquiring ith frame image data acquired by the image acquisition module 111 in an ith time period and (i + 1) th frame image data acquired in an (i + 1) th time period, wherein i is a natural number; acquiring rotation angle change data and azimuth angle change data of the image acquisition device 1 in the (i + 1) th time period, which are sensed by the sensing module 112, compared with the i-th time period; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and using at least one of the rotation angle change data, the azimuth angle change data and the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data. It should be understood that the electronic device 30 includes, but is not limited to, the terminal device 2 described above, or a device integrating the image capturing apparatus 1 and the terminal device 2.
Further, the step of using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the image data of the (i + 1) th frame includes: and judging whether an included angle between a preset axis of the sensing module 112 and the gravity direction in the (i + 1) th time period is 0 degree or not according to the azimuth change data, if so, using the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data, and if not, using one of the rotation angle change data and the azimuth change data as reference data for adjusting the display direction of the image data.
Further, the azimuth angle of the image acquisition device 1 includes an included angle θAcc between the preset axis of the sensing module 112 and the horizontal direction, where θAcc is in the range of 0 to 90 degrees. In the step of judging, according to the azimuth angle change data, whether the included angle between the preset axis of the sensing module 112 and the gravity direction in the (i + 1)th time period is 0 degrees, when the included angle θAcc is 90 degrees, it is judged that the included angle between the preset axis of the sensing module 112 and the gravity direction is 0 degrees.
Further, the step of using one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data may include: judging whether the moving path of the image acquisition module 111 in the (i + 1)th time period is a first path or a second path, where the first path is a path from the horizontal direction to the vertical direction and the second path is a path from the vertical direction to the horizontal direction; if the moving path of the image acquisition module 111 in the (i + 1)th time period is the first path, judging whether |θAcc − 90 degrees| is less than or equal to a preset angle θ2, and if so, using the azimuth angle change data as reference data for adjusting the display direction of the image data, otherwise using the rotation angle change data as reference data for adjusting the display direction of the image data, where θ2 is in the range of 10 degrees to 25 degrees (i.e., 10 degrees or more and 25 degrees or less); if the moving path of the image acquisition module 111 in the (i + 1)th time period is the second path, judging whether |θAcc − 90 degrees| is less than or equal to a preset angle θ1, and if so, using the rotation angle change data as reference data for adjusting the display direction of the image data, otherwise using the azimuth angle change data as reference data for adjusting the display direction of the image data, where θ1 is in the range of 10 degrees to 25 degrees.
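The path-dependent thresholding above maps directly onto a small helper. In the sketch below, theta_acc_deg is the angle θAcc between the preset axis and the horizontal direction, and the concrete default values of θ1 and θ2 are assumed (any value in the stated 10-25 degree range would do).

```python
def pick_reference(theta_acc_deg, path, theta1_deg=15.0, theta2_deg=15.0):
    """path: 'first' (horizontal-to-vertical move) or 'second'
    (vertical-to-horizontal move). Returns which quantity to use as the
    reference data for adjusting the display direction."""
    near_vertical_deg = abs(theta_acc_deg - 90.0)
    if path == "first":
        return "azimuth_change" if near_vertical_deg <= theta2_deg else "rotation_change"
    if path == "second":
        return "rotation_change" if near_vertical_deg <= theta1_deg else "azimuth_change"
    raise ValueError("path must be 'first' or 'second'")
```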
The step of using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the i +1 th frame image data may also include: and judging whether an included angle between a preset axis of the sensing module and the gravity direction in the (i + 1) th time period is 0 degree or in a range from 0 to a preset angle according to the azimuth change data, if so, using the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data, and if not, using one of the rotation angle change data and the azimuth change data as reference data for adjusting the display direction of the image data.
Further, the preset angle is less than or equal to 25 degrees; for example, the preset angle is 10 degrees.
Further, the step of analyzing the ith frame of image data and the (i + 1)th frame of image data to obtain the rotation angle calculation data of the (i + 1)th frame of image data compared with the ith frame of image data includes acquiring a first feature region in the ith frame of image data and a second feature region corresponding to the first feature region in the (i + 1)th frame of image data, and calculating the rotation angle of the second feature region compared with the first feature region to obtain the rotation angle calculation data.
Further, the step of analyzing the ith frame image data and the (i + 1)th frame image data to obtain corner calculation data of the (i + 1)th frame image data compared with the ith frame image data further includes: converting the ith frame image data and the (i + 1)th frame image data into gray-scale maps respectively; performing median filtering and binarization processing on the gray-scale maps respectively to obtain ith frame corrected image data and (i + 1)th frame corrected image data; obtaining the first feature region in the ith frame corrected image data and the second feature region in the (i + 1)th frame corrected image data respectively by a minimum circumscribed rectangle search method; performing affine transformation on the coordinate data of the second feature region to obtain corrected coordinate data of the second feature region; and obtaining the corner calculation data according to the coordinate data of the first feature region and the corrected coordinate data of the second feature region.
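A rough OpenCV rendering of this pipeline is given below. It assumes OpenCV 4.x and BGR input frames, picks the largest external contour as the feature region, and omits the affine correction of the second region's coordinates; note also that minAreaRect reports an orientation only modulo 90 degrees, so this is a simplified sketch of the described steps rather than the patented procedure.

```python
import cv2

def feature_rect(frame_bgr):
    """Grayscale -> median filter -> binarisation -> largest external contour
    -> minimum circumscribed rectangle ((cx, cy), (w, h), angle)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    filtered = cv2.medianBlur(gray, 5)                 # kernel size is assumed
    _, binary = cv2.threshold(filtered, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise ValueError("no feature region found in frame")
    return cv2.minAreaRect(max(contours, key=cv2.contourArea))

def corner_calculation_deg(frame_i, frame_i1):
    """Rotation-angle calculation data: difference between the orientations
    of the two feature rectangles."""
    _, _, angle_i = feature_rect(frame_i)
    _, _, angle_i1 = feature_rect(frame_i1)
    return angle_i1 - angle_i
```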
Further, the processing method further includes a step of adjusting the display direction of the i +1 th frame of image data according to at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data to obtain an i +1 th frame of adjusted image data.
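Applying the reference data then reduces to counter-rotating frame i+1 before display, for example with a warp such as the one below; the choice of warp, interpolation and sign convention is an assumption, since the patent only specifies that the display direction is adjusted.

```python
import cv2

def lock_display(frame_next, reference_angle_deg):
    """Rotate frame i+1 about its centre by the reference angle so that the
    displayed view keeps the orientation of frame i."""
    h, w = frame_next.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), reference_angle_deg, 1.0)
    return cv2.warpAffine(frame_next, m, (w, h))
```

In a frame loop, a caller could feed the accumulated reference angle from any of the earlier sketches into lock_display before handing the frame to the display; whether the angle is negated depends on how the rotation is measured, so that is left to the caller.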
Further, the image acquisition device 1 is one of a handheld visual otoscope, a handheld visual ear picking rod, a handheld visual mouth mirror, a handheld visual tooth cleaner, a handheld visual skin instrument or a handheld visual hair instrument; the sensing module 112 includes a three-axis gyroscope for acquiring the corner change data and a three-axis accelerometer for acquiring the azimuth change data.
Specifically, in the second embodiment, the processor 31 implements the following steps when executing the computer program: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1)th frame of image data acquired in the (i + 1)th time period; and judging whether the included angle between the preset axis of the image acquisition module 111 and the gravity direction in the (i + 1)th time period is 0 degrees. If so, the ith frame image data and the (i + 1)th frame image data are analyzed to obtain corner calculation data of the (i + 1)th frame image data compared with the ith frame image data, and the corner calculation data is used as reference data for adjusting the display direction of the (i + 1)th frame image data; if not, one of the corner change data and the azimuth change data of the image acquisition device 1 in the (i + 1)th time period compared with the ith time period, as sensed by the sensing module 112, is acquired and used as reference data for adjusting the display direction of the image data.
Specifically, in the third embodiment, the processor 31 implements the following steps when executing the computer program: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and taking the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
Specifically, in the fourth embodiment, the processor 31 implements the following steps when executing the computer program: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period; acquiring course angle change data, azimuth angle change data or rotation angle change data of the image acquisition module 111 in the (i + 1) th time period, which is sensed by the sensing module 112, compared with the ith time period; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and taking the rotation angle calculation data, the course angle change data, the azimuth angle change data or the rotation angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
In particular, in the fifth embodiment, the processor 31, when executing the computer program, implements the following steps: acquiring the ith frame of image data acquired by an image acquisition module 111 of the image acquisition device 1 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period, wherein i is a natural number; acquiring rotation angle change data, azimuth angle change data and course angle change data of the image acquisition device in the (i + 1) th time period, which are sensed by a sensing module 112 of the image acquisition device 1, compared with the i-th time period; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and taking the corner change data or one of the azimuth angle change data, the corner calculation data and the heading angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the invention. As shown in fig. 9, a storage medium storing computer readable instructions 41 is provided, and when executed by one or more processors, the computer readable instructions 41 cause the one or more processors to execute the image processing method according to any one of the first to fifth embodiments.
Specifically, in the first embodiment, the computer readable instructions 41, when executed by the one or more processors, cause the one or more processors to perform the steps of: acquiring ith frame image data acquired by the image acquisition module 111 in an ith time period and (i + 1) th frame image data acquired in an (i + 1) th time period, wherein i is a natural number; acquiring rotation angle change data and azimuth angle change data of the image acquisition device 1 in the (i + 1) th time period, which are sensed by the sensing module 112, compared with the i-th time period; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and using at least one of the rotation angle change data, the azimuth angle change data and the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
Further, the step of using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the image data of the (i + 1) th frame includes: and judging whether an included angle between a preset axis of the sensing module 112 and the gravity direction in the (i + 1) th time period is 0 degree or not according to the azimuth change data, if so, using the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data, and if not, using one of the rotation angle change data and the azimuth change data as reference data for adjusting the display direction of the image data.
Further, the azimuth angle of the image acquisition device 1 includes an included angle θAcc between the preset axis of the sensing module 112 and the horizontal direction, where θAcc is in the range of 0 to 90 degrees. In the step of judging, according to the azimuth angle change data, whether the included angle between the preset axis of the sensing module 112 and the gravity direction in the (i + 1)th time period is 0 degrees, when the included angle θAcc is 90 degrees, it is judged that the included angle between the preset axis of the sensing module 112 and the gravity direction is 0 degrees.
Further, the step of using one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data may include: judging whether the moving path of the image acquisition module 111 in the (i + 1)th time period is a first path or a second path, where the first path is a path from the horizontal direction to the vertical direction and the second path is a path from the vertical direction to the horizontal direction; if the moving path of the image acquisition module 111 in the (i + 1)th time period is the first path, judging whether |θAcc − 90 degrees| is less than or equal to a preset angle θ2, and if so, using the azimuth angle change data as reference data for adjusting the display direction of the image data, otherwise using the rotation angle change data as reference data for adjusting the display direction of the image data, where θ2 is in the range of 10 degrees to 25 degrees (i.e., 10 degrees or more and 25 degrees or less); if the moving path of the image acquisition module 111 in the (i + 1)th time period is the second path, judging whether |θAcc − 90 degrees| is less than or equal to a preset angle θ1, and if so, using the rotation angle change data as reference data for adjusting the display direction of the image data, otherwise using the azimuth angle change data as reference data for adjusting the display direction of the image data, where θ1 is in the range of 10 degrees to 25 degrees.
The step of using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the i +1 th frame image data may also include: and judging whether an included angle between a preset axis of the sensing module and the gravity direction in the (i + 1) th time period is 0 degree or in a range from 0 to a preset angle according to the azimuth change data, if so, using the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data, and if not, using one of the rotation angle change data and the azimuth change data as reference data for adjusting the display direction of the image data.
Further, the preset angle may be less than or equal to 25 degrees; for example, the preset angle is 10 degrees.
Further, the step of analyzing the ith frame of image data and the (i + 1)th frame of image data to obtain the rotation angle calculation data of the (i + 1)th frame of image data compared with the ith frame of image data includes acquiring a first feature region in the ith frame of image data and a second feature region corresponding to the first feature region in the (i + 1)th frame of image data, and calculating the rotation angle of the second feature region compared with the first feature region to obtain the rotation angle calculation data.
Further, the step of analyzing the ith frame image data and the (i + 1)th frame image data to obtain corner calculation data of the (i + 1)th frame image data compared with the ith frame image data further includes: converting the ith frame image data and the (i + 1)th frame image data into gray-scale maps respectively; performing median filtering and binarization processing on the gray-scale maps respectively to obtain ith frame corrected image data and (i + 1)th frame corrected image data; obtaining the first feature region in the ith frame corrected image data and the second feature region in the (i + 1)th frame corrected image data respectively by a minimum circumscribed rectangle search method; performing affine transformation on the coordinate data of the second feature region to obtain corrected coordinate data of the second feature region; and obtaining the corner calculation data according to the coordinate data of the first feature region and the corrected coordinate data of the second feature region.
Further, the processing method further includes a step of adjusting the display direction of the i +1 th frame of image data according to at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data to obtain an i +1 th frame of adjusted image data.
Further, the image acquisition device 1 is one of a handheld visual otoscope, a handheld visual ear picking rod, a handheld visual mouth mirror, a handheld visual tooth cleaner, a handheld visual skin instrument or a handheld visual hair instrument; the sensing module 112 includes a gyroscope (e.g., a three-axis gyroscope) for acquiring the rotation angle variation data and an accelerometer (e.g., a three-axis accelerometer) for acquiring the azimuth angle variation data.
In particular, in the second embodiment, the computer readable instructions 41, when executed by the one or more processors, cause the one or more processors to perform the steps of: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1)th frame of image data acquired in the (i + 1)th time period; and judging whether the included angle between the preset axis of the image acquisition module 111 and the gravity direction in the (i + 1)th time period is 0 degrees. If so, the ith frame image data and the (i + 1)th frame image data are analyzed to obtain corner calculation data of the (i + 1)th frame image data compared with the ith frame image data, and the corner calculation data is used as reference data for adjusting the display direction of the (i + 1)th frame image data; if not, one of the corner change data and the azimuth change data of the image acquisition device 1 in the (i + 1)th time period compared with the ith time period, as sensed by the sensing module 112, is acquired and used as reference data for adjusting the display direction of the image data.
In particular, in a third embodiment, the computer readable instructions 41, when executed by the one or more processors, cause the one or more processors to perform the steps of: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and taking the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
In particular, in a fourth embodiment, the computer readable instructions 41, when executed by the one or more processors, cause the one or more processors to perform the steps of: acquiring the ith frame of image data acquired by the image acquisition module 111 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period; acquiring course angle change data, azimuth angle change data or rotation angle change data of the image acquisition module 111 in the (i + 1) th time period, which is sensed by the sensing module 112, compared with the ith time period; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and taking the rotation angle calculation data, the course angle change data, the azimuth angle change data or the rotation angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
In particular, in the fifth embodiment, the computer readable instructions 41, when executed by the one or more processors, cause the one or more processors to perform the steps of: acquiring the ith frame of image data acquired by an image acquisition module 111 of the image acquisition device 1 in the ith time period and the (i + 1) th frame of image data acquired in the (i + 1) th time period, wherein i is a natural number; acquiring rotation angle change data, azimuth angle change data and course angle change data of the image acquisition device 1 in the (i + 1) th time period compared with the i-th time period, which are sensed by a sensing module 112 of the image acquisition device 1; analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data; and taking the corner change data or one of the azimuth angle change data, the corner calculation data and the heading angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
The embodiments of the computer-readable storage medium of the present invention are substantially the same as the embodiments of the image data processing method and the electronic device, and are not repeated herein.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the computer program is executed. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
Referring to fig. 10, fig. 10 is a schematic diagram illustrating several experimental verification results of the corner calculation data acquisition steps in steps S103, S203, S302, S403, and S503 of the image processing methods according to the embodiments of the present invention. Specifically, in each of these steps, the corner calculation data can be calculated by identifying a first feature region in the ith frame of image data and a second feature region in the (i + 1)th frame of image data. To ensure that the corner calculation data obtained in step S103 is accurate, a plurality of experiments were carried out to verify whether accurate corner calculation data can be obtained from the first feature region and the second feature region using the six types of data (1), (2), (3), (4), (5) and (6) in fig. 4, with the rotation angle change data detected by the sensing module, the rotation angle change data measured manually or with a tool, and the like combined for verification. The experimental results show that fairly accurate corner calculation data are obtained: as shown in (1), (2), (3), (4), (5) and (6) of fig. 4, the second feature regions are rotated by 59 degrees, 34 degrees, 89.5 degrees, 60 degrees, 65.8 degrees and 143.8 degrees respectively compared with the first feature regions. That is, the image processing methods provided by the embodiments of the invention are simple and reliable. In addition, it can be understood that during each experimental verification the preset axis of the sensing module was kept essentially along the gravity direction, so as to avoid unnecessary motion interference.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Claims (16)
1. A method of processing image data, the method comprising the steps of:
acquiring ith frame image data acquired by an image acquisition module of the image acquisition device in an ith time period and (i + 1) th frame image data acquired by an (i + 1) th time period, wherein i is a natural number;
acquiring corner change data and azimuth change data of the image acquisition device in the (i + 1) th time period, which are sensed by a sensing module of the image acquisition device, compared with the i-th time period;
analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data;
and using at least one of the rotation angle change data, the azimuth angle change data and the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
2. The image data processing method according to claim 1, wherein the step of using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the image data of the i +1 th frame includes:
judging whether the included angle between the preset axis of the sensing module and the gravity direction in the (i + 1) th time period is 0 degree or not according to the azimuth angle change data,
if yes, the rotation angle calculation data is used as reference data for adjusting the display direction of the image data of the (i + 1) th frame,
and if not, taking one of the corner change data and the azimuth change data as reference data for adjusting the display direction of the image data.
3. The image data processing method of claim 2, wherein the azimuth angle of the image capturing device comprises an included angle θAcc between a preset axis of the sensing module and a horizontal direction, where θAcc is in the range of 0 to 90 degrees, and in the step of judging, according to the azimuth angle change data, whether the included angle between the preset axis of the sensing module and the gravity direction in the (i + 1)th time period is 0 degrees, when the included angle θAcc is 90 degrees, it is judged that the included angle between the preset axis of the sensing module and the gravity direction is 0 degrees.
4. The image data processing method according to claim 3, wherein the step of using one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data comprises:
judging whether the moving path of the image acquisition module in the (i + 1) th time period is a first path or a second path, wherein the first path is a path from the horizontal direction to the vertical direction, the second path is a path from the vertical direction to the horizontal direction,
if the moving path of the image acquisition module in the (i + 1)th time period is the first path, judging whether |θAcc − 90 degrees| is less than or equal to a preset angle θ2, and if so, using the azimuth angle change data as reference data for adjusting the display direction of the image data, otherwise using the corner change data as reference data for adjusting the display direction of the image data, where θ2 is in the range of 10 to 25 degrees;
if the moving path of the image acquisition module in the (i + 1)th time period is the second path, judging whether |θAcc − 90 degrees| is less than or equal to a preset angle θ1, and if so, using the corner change data as reference data for adjusting the display direction of the image data, otherwise using the azimuth angle change data as reference data for adjusting the display direction of the image data, where θ1 is in the range of 10 degrees to 25 degrees.
5. The image data processing method according to claim 1, wherein the step of using at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data as reference data for adjusting the display direction of the image data of the i +1 th frame includes:
judging whether the included angle between the preset axis of the sensing module and the gravity direction in the (i + 1) th time period is 0 degree or in the range from 0 to the preset angle according to the azimuth angle change data,
if yes, the rotation angle calculation data is used as reference data for adjusting the display direction of the image data of the (i + 1) th frame,
and if not, taking one of the corner change data and the azimuth change data as reference data for adjusting the display direction of the image data.
6. The image data processing method according to claim 5, wherein the preset angle is 25 degrees or less; the preset angle is 10 degrees.
7. The image data processing method of claim 1, wherein the step of analyzing the ith frame image data and the (i + 1) th frame image data to obtain corner calculation data of the (i + 1) th frame image data compared with the ith frame image data comprises:
and acquiring a first characteristic region in the ith frame of image data and a second characteristic region corresponding to the first characteristic region in the (i + 1) th frame of image data, and calculating the rotation angle of the second characteristic region compared with the first characteristic region to acquire the rotation angle calculation data.
8. The image data processing method of claim 7, wherein the step of analyzing the ith frame image data and the (i + 1) th frame image data to obtain corner calculation data of the (i + 1) th frame image data compared with the ith frame image data further comprises:
respectively converting the ith frame of image data and the (i + 1) th frame of image data into gray-scale images, respectively performing median filtering and binarization processing on the gray-scale images to obtain ith frame of corrected image data and (i + 1) th frame of corrected image data, respectively obtaining the first characteristic region in the ith frame of corrected image data and the second characteristic region in the (i + 1) th frame of corrected image data by a minimum circumscribed rectangle searching method, performing affine transformation on the coordinate data of the second characteristic region to obtain corrected coordinate data of the second characteristic region, and obtaining the corner calculation data according to the coordinate data of the first characteristic region and the corrected coordinate data of the second characteristic region.
9. The image data processing method according to claim 1, wherein the processing method further comprises a step of adjusting the display direction of the (i + 1)th frame of image data in accordance with at least one of the rotation angle change data, the azimuth angle change data, and the rotation angle calculation data to obtain (i + 1)th frame adjusted image data.
10. The image data processing method according to claim 1, wherein the image acquisition device is one of a hand-holdable visual otoscope, a hand-holdable visual ear-pick-up stick, a hand-holdable visual mouth mirror, a hand-holdable visual dental scaler, a hand-holdable visual skin instrument, or a hand-holdable visual hair instrument; the sensing module comprises a triaxial gyroscope for acquiring the corner change data and a triaxial accelerometer for acquiring the azimuth change data.
11. A method of processing image data, the method comprising the steps of:
acquiring ith frame image data acquired by an image acquisition module in an ith time period and (i + 1) th frame image data acquired by an (i + 1) th time period;
judging whether the included angle between the preset axis of the sensing module and the gravity direction in the (i + 1) th time period is 0 degree or not,
if yes, analyzing the ith frame image data and the (i + 1) th frame image data to obtain corner calculation data of the (i + 1) th frame image data compared with the ith frame image data, and using the corner calculation data as reference data for adjusting the display direction of the (i + 1) th frame image data,
if not, acquiring one of the rotation angle change data and the azimuth angle change data of the image acquisition device sensed by the sensing module in the (i + 1) th time period compared with the i-th time period, and taking one of the rotation angle change data and the azimuth angle change data as reference data for adjusting the display direction of the image data.
12. A method of processing image data, the method comprising the steps of:
acquiring ith frame image data acquired by an image acquisition module in an ith time period and (i + 1) th frame image data acquired by an (i + 1) th time period;
analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data;
and taking the rotation angle calculation data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
13. A method of processing image data, the method comprising the steps of:
acquiring ith frame image data acquired by an image acquisition module in an ith time period and (i + 1) th frame image data acquired by an (i + 1) th time period;
acquiring course angle change data, azimuth angle change data or corner change data of the image acquisition module in the (i + 1) th time period, which is sensed by the sensing module, compared with the ith time period;
analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data;
and taking the rotation angle calculation data, the course angle change data, the azimuth angle change data or the rotation angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
14. A method of processing image data, the method comprising the steps of:
acquiring ith frame image data acquired by an image acquisition module of the image acquisition device in an ith time period and (i + 1) th frame image data acquired by an (i + 1) th time period, wherein i is a natural number;
acquiring rotation angle change data, azimuth angle change data and course angle change data of the image acquisition device in the (i + 1) th time period, which are sensed by a sensing module of the image acquisition device, compared with the i-th time period;
analyzing the ith frame of image data and the (i + 1) th frame of image data to obtain corner calculation data of the (i + 1) th frame of image data compared with the ith frame of image data;
and taking the corner change data or one of the azimuth angle change data, the corner calculation data and the heading angle change data as reference data for adjusting the display direction of the (i + 1) th frame of image data.
15. An electronic device, comprising a memory and a processor, wherein the memory has stored therein computer-readable instructions, which, when executed by the processor, cause the processor to perform the image data processing method of any one of claims 1 to 14.
16. A computer-readable storage medium storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the image data processing method of any one of claims 1 to 14.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2021105291291 | 2021-05-14 | ||
CN202110529129 | 2021-05-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113781313A (en) | 2021-12-10
Family
ID=78837074
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110907827.0A Pending CN113793243A (en) | 2021-05-14 | 2021-08-09 | Image data processing method, electronic device, and computer-readable storage medium |
CN202110907832.1A Pending CN113793244A (en) | 2021-05-14 | 2021-08-09 | Image data processing method, electronic device, and computer-readable storage medium |
CN202110908934.5A Pending CN113781313A (en) | 2021-05-14 | 2021-08-09 | Image data processing method, electronic device, and computer-readable storage medium |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110907827.0A Pending CN113793243A (en) | 2021-05-14 | 2021-08-09 | Image data processing method, electronic device, and computer-readable storage medium |
CN202110907832.1A Pending CN113793244A (en) | 2021-05-14 | 2021-08-09 | Image data processing method, electronic device, and computer-readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (3) | CN113793243A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103474050A (en) * | 2012-06-06 | 2013-12-25 | 富泰华工业(深圳)有限公司 | Image display device and image display method |
CN103795892A (en) * | 2012-11-01 | 2014-05-14 | 华为终端有限公司 | Method and apparatus for processing collected image data |
CN108012080A (en) * | 2017-12-04 | 2018-05-08 | 广东欧珀移动通信有限公司 | Image processing method, device, electronic equipment and computer-readable recording medium |
US20180181196A1 (en) * | 2016-12-22 | 2018-06-28 | Samsung Electronics Co., Ltd. | Method for displaying image, storage medium, and electronic device |
CN110381808A (en) * | 2017-01-06 | 2019-10-25 | 飞通尼护理股份有限公司 | Self orientation imaging device and application method |
CN111096759A (en) * | 2018-10-26 | 2020-05-05 | 深圳迈瑞生物医疗电子股份有限公司 | X-ray photography system, flat panel detector thereof and related method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104869298A (en) * | 2014-02-21 | 2015-08-26 | 联想(北京)有限公司 | Data processing method and electronic device |
CN109035308A (en) * | 2017-06-09 | 2018-12-18 | 株式会社理光 | Image compensation method and device, electronic equipment and computer readable storage medium |
CN111405173B (en) * | 2019-01-03 | 2022-08-26 | 北京字节跳动网络技术有限公司 | Image acquisition method and device, point reading equipment, electronic equipment and storage medium |
- 2021-08-09 CN CN202110907827.0A patent/CN113793243A/en active Pending
- 2021-08-09 CN CN202110907832.1A patent/CN113793244A/en active Pending
- 2021-08-09 CN CN202110908934.5A patent/CN113781313A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN113793244A (en) | 2021-12-14 |
CN113793243A (en) | 2021-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107314778B (en) | Calibration method, device and system for relative attitude | |
JP4167263B2 (en) | Mobile terminal device | |
US10760904B2 (en) | Wearable device, posture measurement method, and non-transitory recording medium | |
CN207923150U (en) | A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude | |
JP4010753B2 (en) | Shape measuring system, imaging device, shape measuring method, and recording medium | |
JP2000097637A (en) | Attitude position detecting device | |
CN108939512A (en) | A kind of swimming attitude measurement method based on wearable sensor | |
EP4160532A1 (en) | Object measurement method and apparatus, virtual object processing method and apparatus, medium and electronic device | |
JP4590511B2 (en) | Electronic compass | |
KR20120121595A (en) | Position calculation apparatus and method that use acceleration sensor | |
JP2009031295A (en) | Mobile object posture detection device | |
CN115666396A (en) | Acquisition system for ultrasound images of internal body organs | |
US10783376B2 (en) | Information processing apparatus | |
CN108801250B (en) | Real-time attitude acquisition method and device based on underwater robot | |
CN111435083A (en) | Pedestrian track calculation method, navigation method and device, handheld terminal and medium | |
CN112087728B (en) | Method and device for acquiring Wi-Fi fingerprint spatial distribution and electronic equipment | |
CN113781313A (en) | Image data processing method, electronic device, and computer-readable storage medium | |
CN115096336A (en) | Environmental magnetic field interference determination method based on nine-axis MEMS MARG sensor and computer system | |
US20130211714A1 (en) | Self-position measuring terminal | |
EP3995396A1 (en) | Information processing device, information processing method, program, and information processing system | |
JP2020201183A (en) | Camera position adjustment method | |
JP7309097B2 (en) | POSITION DETECTION DEVICE, POSITION DETECTION METHOD, AND POSITION DETECTION PROGRAM | |
JP6972675B2 (en) | Posture evaluation device, posture calculation device, posture measurement system, posture evaluation method and program | |
JP6897215B2 (en) | Image processing equipment, image processing methods and programs | |
Sikeridis et al. | An imu-based wearable presentation pointing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||