CN117045281B - Ultrasound imaging system, control method, imaging controller, and storage medium - Google Patents

Ultrasound imaging system, control method, imaging controller, and storage medium

Info

Publication number: CN117045281B
Application number: CN202311319500.7A
Authority: CN (China)
Prior art keywords: ultrasonic, probe, imaging, ultrasonic probe, posture
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN117045281A
Inventor: 康聪
Current Assignee: Shenzhen Wisonic Medical Technology Co ltd
Original Assignee: Shenzhen Wisonic Medical Technology Co ltd
Application filed by Shenzhen Wisonic Medical Technology Co ltd
Priority application: CN202311319500.7A


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/42: Details of probe positioning or probe attachment to the patient
    • A61B 8/4245: Determining the position of the probe, e.g. with respect to an external reference frame or to the patient
    • A61B 8/4254: Determining the position of the probe using sensors mounted on the probe
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Involving processing of medical diagnostic data
    • A61B 8/54: Control of the diagnostic device

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention provides an ultrasound imaging system, a control method, an imaging controller, and a storage medium. The ultrasound imaging system comprises an ultrasonic probe and ultrasonic equipment. The ultrasonic probe is provided with an acoustic head sensor for acquiring ultrasonic imaging data and a posture sensor for acquiring motion posture data. A posture analysis module is built into at least one of the ultrasonic probe and the ultrasonic equipment; it is connected with the posture sensor and determines the probe working section corresponding to the ultrasonic probe according to the motion posture data. An imaging control module is likewise built into at least one of the ultrasonic probe and the ultrasonic equipment; it is connected with the acoustic head sensor and the posture analysis module, generates an original ultrasound image according to the ultrasonic imaging data, and performs imaging frame rate control according to the probe working section corresponding to the ultrasonic probe. According to the invention, the imaging frame rate of the ultrasonic probe can be raised or lowered according to the probe working section, the working scene and characteristics of the ultrasonic equipment, and the diagnosis position of the patient.

Description

Ultrasound imaging system, control method, imaging controller, and storage medium
Technical Field
The invention belongs to the technical field of ultrasonic equipment, and particularly relates to an ultrasonic imaging system, a control method, an imaging controller and a storage medium.
Background
At present, ultrasonic equipment is a widely used non-destructive medical imaging device: based on various ultrasonic probes, it exploits the piezoelectric effect of the transducer to transmit and receive ultrasonic waves, and realizes examination of the human body through signal processing. Dynamic ultrasound imaging is built up from a sequence of single-frame ultrasound pictures, so the ultrasound frame rate is a key performance index of ultrasonic equipment.
Traditional ultrasonic medical imaging equipment does not use a motion/posture sensor integrated into the probe head to control the imaging frame rate. Such equipment requires the ultrasonic probe to be in direct contact with the body part under examination for real-time imaging. An operator spends a great deal of time confirming the position of the patient to be diagnosed; for most of the diagnostic session the probe is merely being repositioned, yet it remains in a full working scanning state throughout. The time during which the probe effectively acquires patient images is short, so a large proportion of the equipment's power consumption is wasted. Large cart-based color ultrasound systems may be insensitive to this because they run directly on mains power, but battery-powered portable color ultrasound systems, and even more so handheld wireless ultrasound devices, have strict power-consumption requirements: most of the electric energy is wasted, the battery endurance of the equipment suffers, and the user experience is degraded.
Disclosure of Invention
The invention provides an ultrasound imaging system, a control method, an imaging controller, and a storage medium, which are used to solve the problem of high power consumption caused by the existing ultrasonic probe remaining continuously in a working scanning state.
An ultrasound imaging system includes an ultrasonic probe and ultrasonic equipment;
the ultrasonic probe is provided with an acoustic head sensor and a posture sensor, the acoustic head sensor being used for acquiring ultrasonic imaging data and the posture sensor for acquiring motion posture data;
a posture analysis module is built into at least one of the ultrasonic probe and the ultrasonic equipment, and the posture analysis module is connected with the posture sensor and is used for determining the probe working section corresponding to the ultrasonic probe according to the motion posture data;
an imaging control module is built into at least one of the ultrasonic probe and the ultrasonic equipment, and the imaging control module is connected with the acoustic head sensor and the posture analysis module and is used for generating an original ultrasound image according to the ultrasonic imaging data and performing imaging frame rate control according to the probe working section corresponding to the ultrasonic probe.
Preferably, the posture sensor is a gyroscope, and the gyroscope is used for acquiring motion posture data of the ultrasonic probe, the motion posture data comprising triaxial acceleration and triaxial angular velocity.
Preferably, the ultrasonic equipment is further provided with an image processing module, and the image processing module is connected with the imaging control module and the posture analysis module and is used for performing image processing on the original ultrasound image to obtain a target ultrasound image.
Preferably, a first data transmission interface is arranged on the ultrasonic probe, a second data transmission interface is arranged on the ultrasonic equipment, and the first data transmission interface is connected with the second data transmission interface through a cable;
and/or the ultrasonic probe is provided with a first wireless communication unit, the ultrasonic equipment is provided with a second wireless communication unit, and the first wireless communication unit is in communication connection with the second wireless communication unit.
Preferably, the posture analysis module includes:
a posture change rate determining unit, used for determining the posture change rate corresponding to the ultrasonic probe according to the motion posture data;
an ineffective imaging section determining unit, used for determining that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section when the posture change rate is greater than a preset rate threshold;
an effective imaging section determining unit, used for determining that the probe working section corresponding to the ultrasonic probe is an effective imaging section when the posture change rate is not greater than the preset rate threshold.
Preferably, the motion posture data includes a triaxial acceleration and a triaxial angular velocity;
the posture change rate determining unit is used for determining the acceleration change rate and the angular velocity change rate corresponding to the ultrasonic probe from the triaxial acceleration and the triaxial angular velocity respectively;
the ineffective imaging section determining unit is used for determining that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section when either of the acceleration change rate and the angular velocity change rate is greater than a preset rate threshold;
the effective imaging section determining unit is used for determining that the probe working section corresponding to the ultrasonic probe is an effective imaging section when neither the acceleration change rate nor the angular velocity change rate is greater than the preset rate threshold.
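The classification rule above can be sketched as follows. The window length, the definition of "change rate" (mean sample-to-sample change magnitude), and the threshold values are illustrative assumptions; the patent only specifies "a preset rate threshold":

```python
import numpy as np

# Hypothetical thresholds; the patent leaves these as "preset" values.
ACCEL_RATE_THRESHOLD = 0.5   # m/s^2 per sample, assumed
GYRO_RATE_THRESHOLD = 0.2    # rad/s per sample, assumed

def change_rate(samples: np.ndarray) -> float:
    """Mean magnitude of the sample-to-sample change across a window.

    `samples` is an (N, 3) array of triaxial readings (acceleration or
    angular velocity) from the probe's posture sensor.
    """
    diffs = np.diff(samples, axis=0)                     # per-axis change per sample
    return float(np.linalg.norm(diffs, axis=1).mean())   # mean 3-vector change

def classify_working_section(accel: np.ndarray, gyro: np.ndarray) -> str:
    """Ineffective if EITHER change rate exceeds its threshold, else effective."""
    if (change_rate(accel) > ACCEL_RATE_THRESHOLD
            or change_rate(gyro) > GYRO_RATE_THRESHOLD):
        return "ineffective"
    return "effective"

# A probe held nearly still against the patient: an effective imaging section.
still_accel = np.full((50, 3), 9.8) + np.random.default_rng(0).normal(0, 0.01, (50, 3))
still_gyro = np.random.default_rng(1).normal(0, 0.01, (50, 3))
print(classify_working_section(still_accel, still_gyro))  # -> effective
```

A rapidly moved probe produces large consecutive differences in both signals and is classified as an ineffective imaging section by the same rule.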
Preferably, the imaging control module includes:
a frame rate control unit, used for judging the probe working section corresponding to the ultrasonic probe and performing imaging frame rate control accordingly.
An ultrasound imaging control method comprising the steps of:
during the working process of the ultrasonic probe, acquiring ultrasonic imaging data and motion posture data corresponding to the ultrasonic probe;
determining the probe working section corresponding to the ultrasonic probe according to the motion posture data;
generating an original ultrasound image according to the ultrasonic imaging data, and performing imaging frame rate control according to the probe working section corresponding to the ultrasonic probe.
Preferably, determining the probe working section corresponding to the ultrasonic probe according to the motion posture data includes:
determining the posture change rate corresponding to the ultrasonic probe according to the motion posture data;
if the posture change rate is greater than a preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section;
if the posture change rate is not greater than the preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an effective imaging section.
Preferably, the motion posture data includes a triaxial acceleration and a triaxial angular velocity;
determining the posture change rate corresponding to the ultrasonic probe according to the motion posture data includes the following steps:
determining the acceleration change rate and the angular velocity change rate corresponding to the ultrasonic probe from the triaxial acceleration and the triaxial angular velocity respectively;
if either of the acceleration change rate and the angular velocity change rate is greater than a preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section;
if neither the acceleration change rate nor the angular velocity change rate is greater than the preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an effective imaging section.
Preferably, performing imaging frame rate control according to the probe working section corresponding to the ultrasonic probe includes:
if the probe working section corresponding to the ultrasonic probe is an ineffective imaging section, reducing the imaging frame rate of the ultrasonic probe;
if the probe working section corresponding to the ultrasonic probe is an effective imaging section, determining the current working scene based on the current scene image, and adjusting the imaging frame rate of the ultrasonic probe based on the current working scene.
Preferably, adjusting the imaging frame rate of the ultrasonic probe based on the current working scene includes:
if the current working scene is a searching-for-diagnosis-position scene, reducing the imaging frame rate of the ultrasonic probe;
if the current working scene is a scanning-diagnosis-position scene, raising the imaging frame rate of the ultrasonic probe.
An imaging controller comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the ultrasound imaging control method described above when it executes the computer program.
A computer-readable storage medium storing a computer program which, when executed by a processor, implements the ultrasound imaging control method described above.
The ultrasound imaging system comprises an ultrasonic probe and ultrasonic equipment. In use the ultrasonic probe is in direct contact with the human body, and the doctor generally adjusts the contact position according to the examination being performed. An acoustic head sensor, used to convert between ultrasonic and electrical signals for transmission and reception, is integrated into the ultrasonic probe and acquires the ultrasonic imaging data. A posture sensor, implemented as a single sensor or a sensor array, is also integrated into the probe and acquires the motion posture data. A posture analysis module is built into at least one of the ultrasonic probe and the ultrasonic equipment; it is connected with the posture sensor and processes the motion posture data transmitted by the posture sensor, such as the raw posture or spatial position information of the probe, determining the probe working section corresponding to the ultrasonic probe from the motion posture data in combination with the working scene and characteristics of the ultrasonic equipment and the diagnosis position of the patient. In this example the posture analysis module may be implemented as a CPU algorithm module on a host computer, on a dedicated processor (such as a DSP or FPGA), or as a digital signal processing module integrated inside the sensor; the algorithm is simple to implement and conserves FPGA computing resources.
An imaging control module is built into at least one of the ultrasonic probe and the ultrasonic equipment; it is connected with the acoustic head sensor and the posture analysis module, generates an original ultrasound image according to the ultrasonic imaging data, and performs imaging frame rate control according to the probe working section corresponding to the ultrasonic probe, in particular raising or lowering the imaging frame rate of the probe according to the probe working section, the working scene and characteristics of the ultrasonic equipment, and the diagnosis position of the patient.
In this embodiment, motion posture data are collected by the posture sensor integrated into the ultrasonic probe, i.e. the real-time motion state and posture information of the probe is acquired; the posture analysis module and imaging control module fuse this information with the imaging control of the ultrasonic equipment, achieving intelligent control of the equipment's ultrasound scanning and processing stages. The posture analysis module and the posture sensor identify and judge the probe working section corresponding to the ultrasonic probe; on this basis the equipment's power consumption is controlled through real-time imaging scan control and signal-processing control of the ultrasonic equipment, and the approach can be further extended to various related controls under other working conditions of the probe.
When the ultrasound imaging system of this embodiment is used, an ultrasonic probe is first selected and mounted on the ultrasonic equipment; the system is switched to the required probe, the preset examination position corresponding to that probe is selected, and intelligent frame rate control is enabled, which configures the posture analysis module with its preset parameters. The processing results of the configured posture analysis module are fed to the imaging control module in the ultrasonic equipment for ultrasound imaging control and frame rate adjustment. Because a posture sensor is integrated into the ultrasonic probe, a traditional probe becomes an intelligent probe that can sense its own posture changes. By processing the motion posture data collected by the posture sensor, the probe working section of the imaging equipment can be judged in real time in combination with the product's application; an original ultrasound image is generated from the ultrasonic imaging data, and the scanning and processing frame rates of the equipment are controlled intelligently according to the probe working section. This improves the signal-processing capability of the equipment, improves its battery endurance, reduces energy consumption, and improves the experience and efficiency of the equipment's users; it is especially suitable for product designs with strict power-consumption requirements, such as wireless ultrasound equipment.
Drawings
FIG. 1 is a first block diagram of an ultrasound imaging system of the present invention;
FIG. 2 is a second block diagram of an ultrasound imaging system of the present invention;
FIG. 3 is a first flow chart of the ultrasound imaging control method of the present invention;
FIG. 4 is a second flowchart of the ultrasound imaging control method of the present invention;
fig. 5 is a third flowchart of the ultrasound imaging control method of the present invention.
1. acoustic head sensor; 2. posture sensor; 3. posture analysis module; 4. imaging control module; 5. image processing module; 6. first wireless communication unit; 7. second wireless communication unit.
Detailed Description
In order to make the technical problems solved by the invention, its technical solutions, and its beneficial effects clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In the description of the present invention, it should be understood that the terms "longitudinal," "radial," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships that are based on the orientation or positional relationships shown in the drawings, merely to facilitate describing the present invention and simplify the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. In the description of the present invention, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be either fixedly connected, detachably connected, or integrally connected, for example; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
An embodiment of the present invention provides an ultrasound imaging system, referring to fig. 1-2, comprising an ultrasonic probe and ultrasonic equipment. The ultrasonic probe is provided with an acoustic head sensor 1 and a posture sensor 2; the acoustic head sensor 1 is used for acquiring ultrasonic imaging data and the posture sensor 2 for acquiring motion posture data. A posture analysis module 3 is built into at least one of the ultrasonic probe and the ultrasonic equipment; the posture analysis module 3 is connected with the posture sensor 2 and determines the probe working section corresponding to the ultrasonic probe according to the motion posture data. An imaging control module 4 is built into at least one of the ultrasonic probe and the ultrasonic equipment; the imaging control module 4 is connected with the acoustic head sensor 1 and the posture analysis module 3, generates an original ultrasound image according to the ultrasonic imaging data, and controls the imaging frame rate according to the probe working section corresponding to the ultrasonic probe.
The ultrasonic imaging data are the data for generating an ultrasound image, acquired in real time by the acoustic head sensor 1 of the ultrasonic probe. As an example, the imaging control module 4 provided on the ultrasonic probe and/or the ultrasonic equipment may control the probe to emit ultrasonic waves into human tissue; the acoustic head sensor 1 receives the ultrasonic waves propagated in media such as human tissue, generates ultrasonic imaging data from the reflected and scattered waves, and transmits them to the imaging control module 4, which then generates the original ultrasound image from the ultrasonic imaging data. In this example the ultrasonic probe may consist of many elongated piezoelectric transducers of equal size arranged at equal intervals (each single transducer is called an array element), or the transducers may be arranged as a two-dimensional array, i.e. the array elements form a two-dimensional matrix. Each piezoelectric transducer converts the voltage pulse excitation applied to it into mechanical vibration, emitting ultrasonic waves outward; as the waves propagate in media such as human tissue, reflected and scattered echo analog signals are produced. Each transducer converts the echo analog signal into an echo electrical signal, amplifies it, performs analog-to-digital conversion, and transmits the resulting echo digital signals, as ultrasonic imaging data, to the imaging control module 4.
The imaging control module 4 may receive the ultrasonic imaging data (the echo digital signals) and perform imaging control operations such as beamforming, magnitude (envelope) detection, logarithmic compression, and spatial compounding on the echo digital signals of one or more channels to generate the original ultrasound image. The beamformed echo signals are arranged as a two-dimensional matrix of sampling points by scan lines, but the signal at each position is a complex signal; to image more intuitively, the magnitude of the complex signal is taken to obtain the signal energy, which represents the signal at that position. Logarithmic compression applies a logarithmic transformation to the magnitude-detected data, compressing the dynamic range so that the gray levels of the image become clearer. Afterwards, images of the same region transmitted and received at multiple angles are combined by spatial compounding, which weakens coherence effects, reduces speckle noise, and improves the resolution of the whole image.
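The magnitude-detection and logarithmic-compression steps described above can be sketched on a toy beamformed matrix. The 60 dB display dynamic range and the 8-bit output mapping are assumed, typical choices, not values from the patent:

```python
import numpy as np

def envelope_and_log_compress(iq_lines: np.ndarray, dynamic_range_db: float = 60.0) -> np.ndarray:
    """Magnitude detection and logarithmic compression of beamformed data.

    iq_lines: complex array of shape (samples, scan_lines), the beamformed
    echo matrix described in the text. Returns an 8-bit B-mode image.
    """
    envelope = np.abs(iq_lines)                  # magnitude of the complex signal
    envelope /= envelope.max() + 1e-12           # normalize to [0, 1]
    db = 20.0 * np.log10(envelope + 1e-12)       # logarithmic compression
    db = np.clip(db, -dynamic_range_db, 0.0)     # keep the display dynamic range
    return ((db + dynamic_range_db) / dynamic_range_db * 255).astype(np.uint8)

# A toy 4-sample x 2-scan-line beamformed matrix.
iq = np.array([[1 + 1j, 0.1 + 0j],
               [0.5 + 0j, 0.01 + 0j],
               [0.05 + 0.05j, 1j],
               [1.0 + 0j, 0.001 + 0j]])
img = envelope_and_log_compress(iq)
print(img.shape, img.dtype)  # -> (4, 2) uint8
```

Beamforming and spatial compounding are omitted here; the sketch covers only the two per-pixel steps the paragraph explains.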
The motion posture data are data reflecting the motion posture of the ultrasonic probe, detected in real time by the posture sensor 2 built into the probe while the probe is being moved. As an example, during movement of the probe the posture sensor 2 may collect motion posture data in real time and send them to the posture analysis module 3, which analyzes them to determine the probe working section corresponding to the ultrasonic probe. The probe working section is either an effective imaging section or an ineffective imaging section: when the ultrasonic probe is in a working scanning state, the corresponding working section is an effective imaging section, and when the probe is not actively scanning, it is an ineffective imaging section.
As an example, the ultrasound imaging system includes an ultrasonic probe and ultrasonic equipment; the probe is in direct contact with the human body during use, and the physician typically adjusts the contact position according to the examination. The acoustic head sensor 1 integrated into the probe can transmit and receive ultrasonic signals and convert them to and from electrical signals, and is used for acquiring the ultrasonic imaging data. The posture sensor 2, implemented as a single sensor or a sensor array, is also integrated into the probe and is used for acquiring the motion posture data. A posture analysis module 3 is built into at least one of the ultrasonic probe and the ultrasonic equipment; the posture analysis module 3 is connected with the posture sensor 2 and processes the motion posture data transmitted by the sensor, such as the probe's raw posture or spatial position information, determining the probe working section from the motion posture data in combination with the working scene and characteristics of the ultrasonic equipment and the diagnosis position of the patient. In this example the posture analysis module 3 may be implemented as a CPU algorithm module on a host computer, on a dedicated processor (such as a DSP or FPGA), or as a digital signal processing module integrated inside the sensor; the algorithm is simple to implement and conserves FPGA computing resources.
An imaging control module 4 is built into at least one of the ultrasonic probe and the ultrasonic equipment; the imaging control module 4 is connected with the acoustic head sensor 1 and the posture analysis module 3, generates an original ultrasound image according to the ultrasonic imaging data, and performs imaging frame rate control according to the probe working section corresponding to the ultrasonic probe; in particular, it can raise or lower the imaging frame rate of the probe according to the probe working section, the working scene and characteristics of the ultrasonic equipment, and the diagnosis position of the patient.
There are six arrangements of the ultrasound imaging system in this example:
the first is that the sound head sensor 1, the gesture sensor 2, the gesture analysis module 3 and the imaging control module 4 are integrated in the ultrasonic probe, at this time, the gesture analysis module 3 and the imaging control module 4 are not arranged in the ultrasonic equipment, namely, the ultrasonic imaging system only has one gesture analysis module 3 and one imaging control module 4;
the second is that the sound head sensor 1, the gesture sensor 2 and the gesture analysis module 3 are integrated in the ultrasonic probe, the imaging control module 4 is integrated in the ultrasonic equipment, at this time, the gesture analysis module 3 is not arranged in the ultrasonic equipment, namely the ultrasonic imaging system only has one gesture analysis module 3 and one imaging control module 4;
the third is that the sound head sensor 1 and the gesture sensor 2 are integrated in the ultrasonic probe, the gesture analysis module 3 and the imaging control module 4 are integrated in the ultrasonic equipment, at this time, the gesture analysis module 3 and the imaging control module 4 are not arranged in the ultrasonic probe, namely, the ultrasonic imaging system only has one gesture analysis module 3 and one imaging control module 4;
the fourth mode is that the sound head sensor 1, the gesture sensor 2 and the gesture analysis module 3 are integrated in the ultrasonic probe, the gesture analysis module 3 and the imaging control module 4 are integrated in the ultrasonic equipment, and at the moment, two gesture analysis modules 3 and one imaging control module 4 are arranged in the ultrasonic imaging system;
Fifthly, an acoustic head sensor 1, an attitude sensor 2, an attitude analysis module 3 and an imaging control module 4 are integrated in the ultrasonic probe, and the attitude analysis module 3 and the imaging control module 4 are integrated in the ultrasonic equipment, wherein at the moment, two attitude analysis modules 3 and two imaging control modules 4 are arranged in the ultrasonic imaging system;
the sixth is that the sound head sensor 1, the gesture sensor 2, the gesture analysis module 3 and the imaging control module 4 are integrated in the ultrasonic probe, and the imaging control module 4 is integrated in the ultrasonic equipment, at this time, the ultrasonic imaging system is internally provided with the gesture analysis module 3 and the two imaging control modules 4.
In this example, the arrangement may be selected according to actual requirements. When the analysis algorithm of the attitude analysis module 3 is complex and requires substantial computing resources, placing the module entirely in the ultrasonic probe would cause high probe power consumption, which these arrangements can avoid.
In this embodiment, motion attitude data is collected by the attitude sensor 2 integrated in the ultrasonic probe, that is, the real-time motion state and attitude information of the ultrasonic probe is obtained, and the attitude analysis module 3 and the imaging control module 4 fuse this information with the imaging control of the ultrasonic equipment, thereby achieving intelligent control of the scanning and processing part of the ultrasonic imaging device. The attitude analysis module 3 and the attitude sensor 2 identify and judge the probe working section corresponding to the ultrasonic probe; on this basis, the power consumption of the device is controlled through real-time imaging scanning control and signal processing control of the ultrasonic equipment, and the approach can be further extended to various related controls under other working conditions of the ultrasonic probe.
The attitude analysis module 3 in this example also has attitude training and learning functions: the raw attitude or spatial position information of the ultrasonic probe is converted into specific use scenarios of the probe and into motion attitude data of the probe under the working scenes of different users, so that the data can be processed. Typical judgment settings are provided at the factory, and an intelligent function subsequently refines this processing by training and learning on the user's operating habits and use scenarios. The attitude training and learning process includes: factory configuration, in which, for example, a large number of clinicians organized by the manufacturer train on the probe attitudes and motion information for different application positions and application scenarios of the ultrasonic equipment, chiefly identifying commonly used probe attitudes and motion patterns, which are shipped for the user's reference; and client configuration, in which the user may adopt the manufacturer's factory-trained default configuration or choose a custom mode to enter personalized settings. By combining the usage habits of different users, more complex control can be achieved through training and learning on the attitude of the ultrasonic probe.
When the ultrasonic imaging system of this embodiment is used, an ultrasonic probe is first selected and installed on the ultrasonic equipment, the equipment is switched to the required probe, the preset corresponding to that probe is selected, and intelligent frame rate control is enabled. The parameters preset in the attitude analysis module 3 are used for configuration, and the processing results of the configured attitude analysis module 3 are input to the imaging control module 4 in the ultrasonic equipment for ultrasonic imaging control and frame rate adjustment. Because the attitude sensor 2 is integrated in the ultrasonic probe, a conventional probe becomes an intelligent probe able to sense changes in its own attitude. By processing the motion attitude data acquired by the attitude sensor 2 in combination with the product application, the probe working section used for imaging can be judged in real time, an original ultrasonic image is generated from the ultrasonic imaging data, and the scanning and processing frame rate of the device is intelligently controlled according to the probe working section. This improves the signal processing capacity of the device, extends its battery endurance, reduces energy consumption, and improves the experience and efficiency of device users; it is particularly suitable for products with strict power consumption requirements, such as wireless ultrasonic equipment.
In an embodiment, referring to fig. 1-2, the attitude sensor 2 is a gyroscope for acquiring motion attitude data of the ultrasonic probe, the motion attitude data including triaxial acceleration and triaxial angular velocity.
As an example, the attitude sensor 2 serves as a sensor (or sensing array) for acquiring the position and/or motion of the ultrasonic probe. It typically uses one or more gyroscopes to monitor the acceleration and rotational angular velocity of each ultrasonic probe in real time, acquiring motion attitude data that includes tri-axial acceleration and tri-axial angular velocity, i.e., X-, Y- and Z-axis acceleration and X-, Y- and Z-axis angular velocity. In this example, the gyroscope provides the tri-axial acceleration and tri-axial angular velocity of the ultrasonic probe from its internal sensors and passes this six-dimensional information to the attitude analysis module 3, which can compute the relative motion attitude data of the probe, namely its attitude information and motion information, in real time using existing acceleration and angular velocity algorithms. The attitude analysis module 3 then determines the probe working section according to the current working scene of the ultrasonic equipment and the diagnostic site of the patient; the user may also input customized motion attitude data.
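The six-dimensional gyroscope output described above lends itself to a simple finite-difference computation of the probe's change rates. The sketch below is illustrative only: the sample layout, the function name, and the use of Euclidean norms are assumptions, since the embodiment does not specify the "existing acceleration and angular velocity algorithm".

```python
import numpy as np

def change_rates(prev_sample, curr_sample, dt):
    """Finite-difference change rates between two six-axis IMU samples.

    Each sample is (ax, ay, az, wx, wy, wz): tri-axial acceleration
    followed by tri-axial angular velocity. dt is the sampling
    interval in seconds. Returns the acceleration change rate and
    the angular velocity change rate as Euclidean norms per second.
    """
    prev = np.asarray(prev_sample, dtype=float)
    curr = np.asarray(curr_sample, dtype=float)
    accel_rate = np.linalg.norm(curr[:3] - prev[:3]) / dt
    angvel_rate = np.linalg.norm(curr[3:] - prev[3:]) / dt
    return accel_rate, angvel_rate
```

For instance, two samples 10 ms apart whose X-axis acceleration differs by 0.1 m/s² yield an acceleration change rate of 10 m/s³.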
In an embodiment, referring to fig. 1 and 2, an image processing module 5 is further provided on the ultrasound apparatus; the image processing module 5 is connected to the imaging control module 4 and the attitude analysis module 3 and is configured to perform image processing on the original ultrasound image to obtain a target ultrasound image.
As an example, the ultrasonic device is further provided with an image processing module 5, one end of which is connected to the imaging control module 4 and the other end to the attitude analysis module 3, so that the original ultrasonic image is processed to obtain a target ultrasonic image and ultrasonic image parameters can be extracted. By combining the information provided by the attitude analysis module 3 with the imaging frame rate adjustment of the imaging control module 4, the probe attitude training and learning data of different users (distinguished by user name) can be stored, and the attitude analysis module 3 is configured accordingly when the ultrasonic device starts operating.
In an embodiment, referring to fig. 1 and 2, a first data transmission interface is arranged on an ultrasonic probe, a second data transmission interface is arranged on an ultrasonic device, and the first data transmission interface is connected with the second data transmission interface through a cable; and/or the ultrasonic probe is provided with a first wireless communication unit 6, the ultrasonic equipment is provided with a second wireless communication unit 7, and the first wireless communication unit 6 is in communication connection with the second wireless communication unit 7.
As an example, the ultrasonic probe and the ultrasonic device form an ultrasonic imaging system, and data transmission and control between them may use either a wired connection or a wireless connection, as follows: the ultrasonic probe is provided with a first data transmission interface, the ultrasonic equipment is provided with a second data transmission interface, and the first data transmission interface is connected to the second data transmission interface through a cable to realize data transmission and control over a wired connection. In a wireless ultrasonic device, the ultrasonic probe is provided with a first wireless communication unit 6, the ultrasonic equipment is provided with a second wireless communication unit 7, and the first wireless communication unit 6 is communicatively connected to the second wireless communication unit 7 to realize data transmission and control over a wireless connection. The first wireless communication unit 6 and the second wireless communication unit 7 may be WiFi wireless communication units.
In an embodiment, referring to fig. 1 and 2, the attitude analysis module 3 includes: an attitude change rate determining unit, configured to determine the attitude change rate corresponding to the ultrasonic probe according to the motion attitude data; an ineffective imaging section determining unit, configured to determine that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section when the attitude change rate is greater than a preset rate threshold; and an effective imaging section determining unit, configured to determine that the probe working section corresponding to the ultrasonic probe is an effective imaging section when the attitude change rate is not greater than the preset rate threshold.
As an example, the preset rate threshold is a rate value set in advance according to the actual situation for judging the attitude change rate corresponding to the ultrasonic probe. The attitude analysis module 3 includes an attitude change rate determining unit, an ineffective imaging section determining unit, and an effective imaging section determining unit. When the ultrasonic probe works, the attitude sensor 2 collects its motion attitude data, and the attitude change rate determining unit determines the corresponding attitude change rate from that data. When the controller judges that the attitude change rate is greater than the preset rate threshold, the ineffective imaging section determining unit determines that the probe working section is an ineffective imaging section; when the attitude change rate is not greater than the preset rate threshold, the effective imaging section determining unit determines that the probe working section is an effective imaging section.
In one embodiment, referring to fig. 1 and 2, the motion attitude data includes tri-axial acceleration and tri-axial angular velocity; the attitude change rate determining unit is configured to determine the acceleration change rate and the angular velocity change rate corresponding to the ultrasonic probe from the tri-axial acceleration and the tri-axial angular velocity respectively; the ineffective imaging section determining unit is configured to determine that the probe working section is an ineffective imaging section when either the acceleration change rate or the angular velocity change rate is greater than a preset rate threshold; and the effective imaging section determining unit is configured to determine that the probe working section is an effective imaging section when neither the acceleration change rate nor the angular velocity change rate is greater than the preset rate threshold.
As an example, the motion attitude data includes tri-axial acceleration and tri-axial angular velocity, i.e., X-, Y- and Z-direction acceleration and X-, Y- and Z-direction angular velocity. The preset rate threshold is a rate value set according to the actual situation for judging the attitude change rate corresponding to the ultrasonic probe. During operation, the attitude sensor 2 collects the X-, Y- and Z-direction accelerations and the X-, Y- and Z-direction angular velocities of the ultrasonic probe, and the attitude change rate determining unit determines the corresponding acceleration change rate and angular velocity change rate from them. When the controller judges that either the acceleration change rate or the angular velocity change rate is greater than the preset rate threshold, the ineffective imaging section determining unit determines that the probe working section is an ineffective imaging section; when neither is greater than the preset rate threshold, the effective imaging section determining unit determines that the probe working section is an effective imaging section.
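The decision rule above (either change rate above the preset threshold makes the section ineffective) can be sketched as a small pure function. The function name, the single shared threshold, and the string labels are illustrative assumptions:

```python
def classify_working_section(accel_rate, angvel_rate, rate_threshold):
    """If either change rate exceeds the preset rate threshold, the
    probe is most likely being repositioned, so its working section
    is ineffective; otherwise the section is effective."""
    if accel_rate > rate_threshold or angvel_rate > rate_threshold:
        return "ineffective"
    return "effective"
```

A real implementation might use separate thresholds for acceleration and angular velocity; the embodiment describes a single preset rate threshold, which this sketch follows.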
In one embodiment, referring to fig. 1 and 2, the imaging control module 4 includes: a frame rate control unit, configured to judge the probe working section corresponding to the ultrasonic probe and to control the imaging frame rate according to that working section.
As an example, the imaging control module 4 includes a frame rate control unit; the frame rate control unit judges the probe working section corresponding to the ultrasonic probe and controls the imaging frame rate accordingly. When the probe working section is an ineffective imaging section, the imaging frame rate of the ultrasonic probe is reduced; when it is an effective imaging section, the imaging frame rate is adjusted according to the current working scene.
The embodiment of the invention further provides an ultrasonic imaging control method, which can be applied to the above ultrasonic imaging system, specifically to an imaging controller. The imaging controller may be a single controller integrating the attitude analysis module 3 and the imaging control module 4, or may comprise a first controller integrating the functions of the attitude analysis module 3 and a second controller integrating the functions of the imaging control module 4. Referring to fig. 3, the method comprises the following steps:
S1: during operation of the ultrasonic probe, acquiring ultrasonic imaging data and motion attitude data corresponding to the ultrasonic probe;
S2: determining a probe working section corresponding to the ultrasonic probe according to the motion attitude data;
S3: generating an original ultrasonic image according to the ultrasonic imaging data, and controlling the imaging frame rate according to the probe working section corresponding to the ultrasonic probe.
As an example, in step S1, during operation of the ultrasonic probe the imaging controller obtains the ultrasonic imaging data corresponding to the probe through the sound head sensor 1 integrated on the probe, and obtains the motion attitude data through the attitude sensor 2 integrated on the probe, i.e., the real-time motion information and attitude information of the ultrasonic probe. For example, as a probe sound head position or motion sensor (sensing array), the attitude sensor 2 typically uses one or more gyroscopes to monitor the acceleration and rotational angular velocity of each sound head direction in real time, acquiring motion attitude data that includes tri-axial acceleration and tri-axial angular velocity, namely X-, Y- and Z-direction acceleration and X-, Y- and Z-direction angular velocity.
As an example, in step S2, the imaging controller determines the probe working section corresponding to the ultrasonic probe according to the motion attitude data. Specifically, it analyzes the tri-axial acceleration, tri-axial angular velocity and other motion attitude data provided by the attitude sensor, determines the change in motion attitude between different moments from the data acquired at those moments, and thereby determines the probe working section from the attitude changes over time.
As an example, in step S3, the imaging controller generates an original ultrasound image from the ultrasound imaging data; specifically, it performs imaging operations such as beamforming, envelope detection (taking the modulus), logarithmic compression and spatial compounding on the received echo digital signals. It then controls the imaging frame rate according to the probe working section determined from the motion attitude data, so that the imaging control of the ultrasonic equipment is fused with the motion attitude data, achieving intelligent control of the scanning and processing part of the device. The attitude analysis module 3 integrated on the ultrasonic probe identifies the effective imaging section of the probe and inputs the recognition result to the associated imaging control module 4; the imaging control module 4 integrated on the ultrasound device then controls the imaging frame rate according to the identified probe working section in combination with the different working scenes. For example, when the attitude analysis module 3 judges that the ultrasonic probe is moving rapidly and that its probe working section is therefore an ineffective imaging section, the result is transmitted in real time to the imaging control module 4, which reduces the imaging frame rate of the ultrasonic equipment and thus the overall power consumption of the device.
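Two of the operations named above, envelope detection and logarithmic compression, can be illustrated with a toy back end. This sketch omits beamforming and spatial compounding; the function name and the 60 dB dynamic range are assumptions, not values from the embodiment:

```python
import numpy as np

def raw_image_from_rf(rf_frames, dynamic_range_db=60.0):
    """Toy B-mode back end: envelope detection via the signal
    modulus, then logarithmic compression into a fixed dynamic
    range. rf_frames is a 2-D array of demodulated (IQ) echo
    samples, one row per scan line. Returns an image in [0, 1]."""
    envelope = np.abs(rf_frames)
    envelope = np.maximum(envelope, 1e-12)           # avoid log(0)
    db = 20.0 * np.log10(envelope / envelope.max())  # 0 dB at the peak
    img = (db + dynamic_range_db) / dynamic_range_db
    return np.clip(img, 0.0, 1.0)
```

Echoes at the peak amplitude map to 1.0, and echoes 60 dB or more below the peak map to 0.0.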
In this embodiment, the motion attitude data from the attitude sensor 2 is combined with the specific use scenario of the ultrasonic device, in particular the spatial attitude of the ultrasonic probe, to determine the probe's position and state of use, thereby optimizing user operation. For example, when the attitude sensor 2 detects that the ultrasonic probe is pointing straight downward, an attitude in which no examination of a patient site would use the probe, the imaging control module 4 can lower the imaging frame rate or directly freeze the image through scanning control. The user can also define various desired control instructions by entering specific custom attitudes.
In an embodiment, referring to fig. 4, step S2, that is, determining a probe working section corresponding to the ultrasonic probe according to the motion attitude data, includes:
S211: determining the attitude change rate corresponding to the ultrasonic probe according to the motion attitude data;
S212: if the attitude change rate is greater than a preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section;
S213: if the attitude change rate is not greater than the preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an effective imaging section.
The preset rate threshold is a rate value set according to the actual situation for judging the attitude change rate corresponding to the ultrasonic probe.
As an example, in step S211, the imaging controller collects the motion attitude data of the ultrasonic probe through the integrated attitude sensor 2 and computes the corresponding attitude change rate from it: the difference between the motion attitude data at two moments is calculated and divided by the corresponding time difference to obtain the attitude change rate. For example, as a probe sound head position or motion sensor (sensing array), the attitude sensor 2 typically uses one or more gyroscopes to monitor the acceleration and rotational angular velocity of each sound head direction in real time, thereby acquiring the motion attitude data of the ultrasonic probe.
As an example, in step S212, the imaging controller compares the calculated attitude change rate with the preset rate threshold. When the attitude change rate is greater than the threshold, the probe is judged to be moving rapidly, which most likely means the user is searching for a diagnostic position rather than performing an ultrasound scan of one, so the probe working section is determined to be an ineffective imaging section.
As an example, in step S213, the imaging controller compares the calculated attitude change rate with the preset rate threshold. When the attitude change rate is not greater than the threshold, the probe is judged to be moving slowly, which most likely means an ultrasound scan of the diagnostic position is in progress, so the probe working section is determined to be an effective imaging section.
In this embodiment, the imaging controller calculates the attitude change rate of the ultrasonic probe in real time from the motion attitude data of the integrated attitude sensor 2, and judges whether the probe is moving rapidly by comparing the attitude change rate with the preset rate threshold, thereby determining whether the probe working section is an ineffective or an effective imaging section. The imaging rate of the probe is then controlled so that scanning occurs in the effective imaging section, avoiding scanning in the ineffective section and reducing power consumption.
In one embodiment, referring to FIG. 5, the motion attitude data includes tri-axial acceleration and tri-axial angular velocity;
Step S2, namely determining the probe working section corresponding to the ultrasonic probe according to the motion attitude data, includes:
S221: determining the acceleration change rate and the angular velocity change rate corresponding to the ultrasonic probe according to the tri-axial acceleration and the tri-axial angular velocity respectively;
S222: if either the acceleration change rate or the angular velocity change rate is greater than a preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section;
S223: if neither the acceleration change rate nor the angular velocity change rate is greater than the preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an effective imaging section.
Step S221 is an embodiment of step S211, step S222 is an embodiment of step S212, and step S223 is an embodiment of step S213.
The motion attitude data comprises tri-axial acceleration and tri-axial angular velocity, where the tri-axial acceleration comprises X-, Y- and Z-direction acceleration and the tri-axial angular velocity comprises X-, Y- and Z-direction angular velocity. The preset rate threshold is a rate value set in advance according to the actual situation for judging the attitude change rate corresponding to the ultrasonic probe.
As an example, in step S221, the imaging controller acquires the X-, Y- and Z-direction accelerations and the X-, Y- and Z-direction angular velocities of the ultrasonic probe at different moments through the integrated attitude sensor 2; it then determines the acceleration change rate from the accelerations at those moments and the angular velocity change rate from the angular velocities at those moments. For example, as a probe sound head position or motion sensor (sensing array), the attitude sensor 2 typically uses one or more gyroscopes to monitor the acceleration and rotational angular velocity of each sound head direction in real time, thereby acquiring the motion attitude data of the ultrasonic probe.
As an example, in step S222, the acceleration change rate and the angular velocity change rate corresponding to the ultrasonic probe both serve as variables for judging the working state of the probe: a change in either one reflects a change in the probe's motion and thus affects whether the probe working section is effective. When either the acceleration change rate or the angular velocity change rate is greater than the preset rate threshold, the imaging controller judges that the acceleration or the angular velocity of the probe is changing rapidly, which most likely means the user is searching for a diagnostic position rather than scanning one, so the probe working section is determined to be an ineffective imaging section.
As an example, in step S223, when neither the acceleration change rate nor the angular velocity change rate is greater than the preset rate threshold, the imaging controller judges that the acceleration and the angular velocity of the probe are changing slowly, which most likely means an ultrasound scan of the diagnostic position is in progress, so the probe working section is determined to be an effective imaging section.
In this embodiment, the imaging controller collects the tri-axial acceleration and tri-axial angular velocity of the ultrasonic probe through the integrated attitude sensor 2, determines the corresponding acceleration change rate and angular velocity change rate from the values at different moments, and, through the attitude analysis module 3 integrated in the imaging controller, compares both rates with the preset rate threshold to determine the probe working section from the comparison result.
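The real-time loop of steps S221 to S223 can be sketched as a small stateful detector that keeps the previous six-axis sample and classifies each new one. The class name, the six-axis sample layout, the single shared threshold, and the choice to report "effective" before any history exists are all assumptions:

```python
import numpy as np

class WorkingSectionDetector:
    """Stateful sketch of steps S221-S223: keep the previous
    six-axis sample, compute per-update change rates by finite
    differences, and classify the probe's working section."""

    def __init__(self, rate_threshold, dt):
        self.rate_threshold = rate_threshold  # preset rate threshold
        self.dt = dt                          # sampling interval, seconds
        self._prev = None

    def update(self, sample):
        sample = np.asarray(sample, dtype=float)  # (ax, ay, az, wx, wy, wz)
        if self._prev is None:
            self._prev = sample
            return "effective"                    # no history yet: assume effective
        accel_rate = np.linalg.norm(sample[:3] - self._prev[:3]) / self.dt
        angvel_rate = np.linalg.norm(sample[3:] - self._prev[3:]) / self.dt
        self._prev = sample
        if accel_rate > self.rate_threshold or angvel_rate > self.rate_threshold:
            return "ineffective"                  # rapid motion: repositioning
        return "effective"                        # slow motion: scanning
```

Each call to `update` corresponds to one sensor sample arriving at the attitude analysis module.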
In one embodiment, step S3, that is, performing imaging frame rate control according to a probe working section corresponding to the ultrasound probe, includes:
S31: if the probe working section corresponding to the ultrasonic probe is an ineffective imaging section, reducing the imaging frame rate of the ultrasonic probe;
S32: if the probe working section corresponding to the ultrasonic probe is an effective imaging section, determining the current working scene based on the current scene image, and adjusting the imaging frame rate of the ultrasonic probe based on the current working scene.
The current scene image refers to the scene image captured at the current moment, which may be acquired by a camera or other device while the user (i.e., the doctor) holds the working ultrasonic probe. The current working scene refers to the user's working scene at the current moment as identified from the current scene image, and may be either a searching-for-diagnostic-position scene or a scanning-the-diagnostic-position scene.
As an example, the imaging controller identifies the probe working section corresponding to the ultrasonic probe through the integrated imaging control module 4 and controls the imaging frame rate according to the result. When the probe working section is an ineffective imaging section, indicating that the probe is not currently scanning a diagnostic position on human tissue, the imaging frame rate of the ultrasonic probe is reduced to lower its power consumption. When the probe working section is an effective imaging section, the probe may or may not be scanning a diagnostic position on human tissue; a current scene image is then acquired by a camera or other device and recognized by a pre-trained neural network model to determine the current working scene. The current working scene reflects whether the user is searching for a diagnostic position or scanning one, so the imaging frame rate of the ultrasonic probe can be adjusted accordingly to reduce power consumption.
In this embodiment, the imaging controller uses the integrated attitude analysis module 3 to recognize the motion attitude data, determines the probe working section corresponding to the ultrasonic probe, and inputs it to the imaging control module 4; the imaging control module 4 then controls the imaging frame rate of the ultrasonic equipment according to the identified probe working section in combination with the different working scenes. Specifically, when the attitude analysis module 3 judges that the ultrasonic probe is moving rapidly and determines that it is in an ineffective imaging section, the result is transmitted in real time to the imaging control module 4, and the imaging controller lowers the imaging frame rate of the ultrasonic equipment, reducing the overall power consumption of the device. When the attitude analysis module 3 judges that the probe is moving slowly and determines that it is in an effective imaging section, the current working scene must be determined from the scene image acquired in real time, and the imaging frame rate is then controlled based on that scene to reduce power consumption.
In this example, through the gesture analysis module 3 integrated on the ultrasonic probe and the imaging control module 4 integrated on the ultrasonic equipment, the imaging controller fuses the real-time motion state and gesture information of the ultrasonic probe with the imaging control of the ultrasonic equipment, realizing intelligent control of the device's ultrasonic imaging scanning and processing.
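A minimal sketch of the working-section decision performed by the gesture analysis module might look as follows. It follows the per-quantity variant recited in claims 5 and 7 (separate acceleration and angular-velocity change rates, either exceeding the threshold marking the section ineffective); the specific change-rate estimator, the sampling interval, and all names are assumptions, since the patent only requires "a change rate" derived from the triaxial data:

```python
import math

def change_rate(samples, dt):
    """Average magnitude of the per-axis first difference divided by dt.

    `samples` is a list of (x, y, z) readings from the gesture sensor.
    This particular estimator is an assumption for illustration.
    """
    diffs = []
    for prev, cur in zip(samples, samples[1:]):
        d = [(c - p) / dt for p, c in zip(prev, cur)]
        diffs.append(math.sqrt(sum(v * v for v in d)))
    return sum(diffs) / len(diffs)

def working_section(accelerations, angular_velocities, dt, rate_threshold):
    """Ineffective imaging section when either change rate exceeds the threshold."""
    if (change_rate(accelerations, dt) > rate_threshold
            or change_rate(angular_velocities, dt) > rate_threshold):
        return "invalid"   # probe moving too fast to be scanning tissue
    return "valid"
```

A stationary probe (constant readings) yields a change rate of zero and is classified as an effective imaging section; rapid movement drives either rate above the threshold and marks the section ineffective.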
In an embodiment, step S32, that is, adjusting the imaging frame rate of the ultrasonic probe according to the current working scene, includes:
S321: if the current working scene is a searching-for-diagnosis-position scene, reducing the imaging frame rate of the ultrasonic probe;
S322: if the current working scene is a scanning-diagnosis-position scene, increasing the imaging frame rate of the ultrasonic probe.
As an example, a user selects an ultrasonic probe according to the diagnosis position and then determines the position to be diagnosed. A current scene image is acquired through a camera or other equipment, and a pre-trained neural network model identifies the user's moving path and gesture in the current scene image, so that the current working scene is determined from the change of the moving path and gesture. When the imaging controller observes that the moving path is rapid and disordered and the gesture changes continuously, the final diagnosis position has not yet been determined; the current working scene is therefore a searching-for-diagnosis-position scene, and the imaging frame rate of the ultrasonic probe is reduced to lower power consumption. When the imaging controller observes that the moving path is slow and orderly and the gesture remains unchanged, the diagnosis position has most probably been determined and is being scanned; the imaging frame rate of the ultrasonic probe is then increased to acquire more effective image features and guarantee the final imaging quality.
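The path-and-gesture heuristic in this example can be sketched as below. The patent leaves the neural-network model unspecified, so this substitutes a plain threshold rule over assumed features (`path_speed`, `path_disorder`, `gesture_changed`) and assumed threshold values, purely for illustration:

```python
def classify_working_scene(path_speed, path_disorder, gesture_changed,
                           speed_threshold=0.5, disorder_threshold=0.5):
    """Fast, disordered movement with a changing gesture -> still searching
    for the diagnosis position; slow, orderly movement with a stable
    gesture -> scanning it. Feature names and thresholds are assumed.
    """
    if (path_speed > speed_threshold
            or path_disorder > disorder_threshold
            or gesture_changed):
        return "searching"   # lower the imaging frame rate
    return "scanning"        # raise the imaging frame rate
```

In the described system these features would come from the neural network's analysis of the camera image stream, and the returned label would feed the frame-rate decision of step S32.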
In this embodiment, when the probe working section is determined to be an effective imaging section according to the motion gesture data, the current working scene must be further determined in combination with the current scene image before the imaging frame rate of the ultrasonic equipment is controlled, so that both the power consumption and the imaging quality of the ultrasonic imaging system are taken into account.
An embodiment of the invention provides an imaging controller comprising a memory, a processor, and a computer program stored in the memory and executable on the processor. When executing the computer program, the processor implements the ultrasonic imaging control method described above, for example S1-S3 shown in fig. 3; to avoid repetition, the description is not repeated here.
An embodiment of the invention provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the ultrasonic imaging control method described above, for example S1-S3 shown in fig. 3, is implemented; to avoid repetition, the description is not repeated here.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. The non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated; in practical application, the above functions may be allocated to different functional units and modules as needed, i.e., the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above.
The above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention and are intended to be included in the scope of the present invention.

Claims (9)

1. An ultrasonic imaging system, comprising an ultrasonic probe and an ultrasonic device;
the ultrasonic probe is provided with a sound head sensor and a gesture sensor, the sound head sensor is used for acquiring ultrasonic imaging data, the gesture sensor is used for acquiring motion gesture data, and the motion gesture data comprises triaxial acceleration and triaxial angular velocity;
At least one of the ultrasonic probe and the ultrasonic equipment is internally provided with a gesture analysis module which is connected with the gesture sensor and used for determining a gesture change rate corresponding to the ultrasonic probe according to the motion gesture data; if the gesture change rate is greater than a preset rate threshold, determining that a probe working section corresponding to the ultrasonic probe is an invalid imaging section; if the gesture change rate is not greater than a preset rate threshold, determining that a probe working section corresponding to the ultrasonic probe is an effective imaging section, wherein the probe working section comprises an effective imaging section of the ultrasonic probe in a working scanning state and an ineffective imaging section of the ultrasonic probe in an inactive scanning state;
an imaging control module is arranged in at least one of the ultrasonic probe and the ultrasonic equipment, and is connected with the sound head sensor and the gesture analysis module, and is used for generating an original ultrasonic image according to the ultrasonic imaging data, judging a probe working section corresponding to the ultrasonic probe, and reducing the imaging frame rate of the ultrasonic probe if the probe working section corresponding to the ultrasonic probe is an invalid imaging section; if the probe working section corresponding to the ultrasonic probe is an effective imaging section, based on the current scene image, a pre-trained neural network model is adopted to identify a user moving path and a gesture in the current scene image so as to determine the current working scene according to the change of the user moving path and the gesture; if the current working scene is a scene for searching a diagnosis position, reducing the imaging frame rate of the ultrasonic probe; and if the current working scene is a scanning diagnosis position scene, improving the imaging frame rate of the ultrasonic probe.
2. The ultrasound imaging system of claim 1, wherein the gesture sensor is a gyroscope configured to acquire the motion gesture data of the ultrasonic probe, the motion gesture data comprising triaxial acceleration and triaxial angular velocity.
3. The ultrasonic imaging system of claim 1, wherein the ultrasonic device is further provided with an image processing module, and the image processing module is connected with the imaging control module and the gesture analysis module, and is used for performing image processing on the original ultrasonic image to obtain a target ultrasonic image.
4. The ultrasonic imaging system of claim 1, wherein a first data transmission interface is arranged on the ultrasonic probe, a second data transmission interface is arranged on the ultrasonic equipment, and the first data transmission interface is connected with the second data transmission interface through a cable;
and/or the ultrasonic probe is provided with a first wireless communication unit, the ultrasonic equipment is provided with a second wireless communication unit, and the first wireless communication unit is in communication connection with the second wireless communication unit.
5. The ultrasound imaging system of claim 4, wherein the motion gesture data comprises triaxial acceleration and triaxial angular velocity, and the gesture analysis module comprises:
a gesture change rate determining unit, configured to determine an acceleration change rate and an angular velocity change rate corresponding to the ultrasonic probe according to the triaxial acceleration and the triaxial angular velocity, respectively;
an ineffective imaging section determining unit, configured to determine that the probe working section corresponding to the ultrasonic probe is an ineffective imaging section when either of the acceleration change rate and the angular velocity change rate is greater than a preset rate threshold;
and an effective imaging section determining unit, configured to determine that the probe working section corresponding to the ultrasonic probe is an effective imaging section when neither the acceleration change rate nor the angular velocity change rate is greater than the preset rate threshold.
6. An ultrasonic imaging control method, characterized by comprising the steps of:
in the working process of the ultrasonic probe, ultrasonic imaging data and motion gesture data corresponding to the ultrasonic probe are obtained, wherein the motion gesture data comprises triaxial acceleration and triaxial angular velocity;
determining the gesture change rate corresponding to the ultrasonic probe according to the motion gesture data; if the gesture change rate is greater than a preset rate threshold, determining that a probe working section corresponding to the ultrasonic probe is an invalid imaging section; if the gesture change rate is not greater than a preset rate threshold, determining that a probe working section corresponding to the ultrasonic probe is an effective imaging section, wherein the probe working section comprises an effective imaging section of the ultrasonic probe in a working scanning state and an ineffective imaging section of the ultrasonic probe in an inactive scanning state;
Generating an original ultrasonic image according to the ultrasonic imaging data, judging a probe working section corresponding to the ultrasonic probe, and reducing the imaging frame rate of the ultrasonic probe if the probe working section corresponding to the ultrasonic probe is an invalid imaging section; if the probe working section corresponding to the ultrasonic probe is an effective imaging section, based on the current scene image, a pre-trained neural network model is adopted to identify a user moving path and a gesture in the current scene image so as to determine the current working scene according to the change of the user moving path and the gesture; if the current working scene is a scene for searching a diagnosis position, reducing the imaging frame rate of the ultrasonic probe; and if the current working scene is a scanning diagnosis position scene, improving the imaging frame rate of the ultrasonic probe.
7. The ultrasonic imaging control method of claim 6, wherein the motion gesture data comprises a tri-axial acceleration and a tri-axial angular velocity;
determining the gesture change rate corresponding to the ultrasonic probe according to the motion gesture data comprises the following steps:
according to the triaxial acceleration and the triaxial angular velocity, respectively determining an acceleration change rate and an angular velocity change rate corresponding to the ultrasonic probe;
If any one of the acceleration change rate and the angular velocity change rate is larger than a preset rate threshold, determining that a probe working section corresponding to the ultrasonic probe is an invalid imaging section;
and if the acceleration change rate and the angular velocity change rate are not greater than a preset rate threshold, determining that the probe working section corresponding to the ultrasonic probe is an effective imaging section.
8. An imaging controller comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the ultrasound imaging control method of any of claims 6-7 when the computer program is executed.
9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the ultrasound imaging control method according to any one of claims 6 to 7.
CN202311319500.7A 2023-10-12 2023-10-12 Ultrasound imaging system, control method, imaging controller, and storage medium Active CN117045281B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311319500.7A CN117045281B (en) 2023-10-12 2023-10-12 Ultrasound imaging system, control method, imaging controller, and storage medium


Publications (2)

Publication Number Publication Date
CN117045281A CN117045281A (en) 2023-11-14
CN117045281B true CN117045281B (en) 2024-01-26

Family

ID=88661269

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311319500.7A Active CN117045281B (en) 2023-10-12 2023-10-12 Ultrasound imaging system, control method, imaging controller, and storage medium

Country Status (1)

Country Link
CN (1) CN117045281B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102885639A (en) * 2011-07-21 2013-01-23 索尼公司 Signal processing apparatus, control method, signal processing system, and signal processing method
CN102949209A (en) * 2011-08-22 2013-03-06 通用电气公司 Ultrasound imaging system, ultrasound probe, and method of reducing power consumption
CN108629170A (en) * 2018-04-20 2018-10-09 北京元心科技有限公司 Personal identification method and corresponding device, mobile terminal
CN109310394A (en) * 2016-04-26 2019-02-05 安科诺思公司 Ultrasonic adaptive power management system and method
CN109498064A (en) * 2018-12-29 2019-03-22 深圳开立生物医疗科技股份有限公司 Ultrasonic scanning control method and ultrasonic diagnostic equipment
CN110688910A (en) * 2019-09-05 2020-01-14 南京信息职业技术学院 Method for realizing wearable human body basic posture recognition
CN115153637A (en) * 2022-07-22 2022-10-11 郑州市中心医院 Method, apparatus, device, and medium for detecting posture of arm operating ultrasonic probe
CN116058867A (en) * 2023-01-06 2023-05-05 华东师范大学 Light-weight imaging ultrasonic system without fixed scanning probe and ultrasonic detection method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050228284A1 (en) * 2004-03-31 2005-10-13 Charles Edward Baumgartner System and method for power management in an ultrasound system
US20140187946A1 (en) * 2012-12-31 2014-07-03 General Electric Company Active ultrasound imaging for interventional procedures




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant