I am using ImFusion with the goal of selecting certain pixels from my US image and expressing their pixel coordinates as real-world coordinates.
So far I have successfully obtained the calibration matrix from the Calibration Wizard. I then apply this matrix to my UltrasoundStream data using the Edit Transform tool. Is there now a way to click on a point in the live image displayed from the UltrasoundStream in order to obtain both the pixel coordinates of that point (e.g. 50, 60) and its real-world coordinates, taking the provided calibration matrix into account?
Hi Marcos,
The calibration matrix defines the transformation between the probe and the tracker attached to it. To go from pixel coordinates (on the ultrasound frame) to world coordinates (with respect to the tracking system) you need both the calibration matrix and the matrix emitted by the tracking stream (see here for more details).
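To illustrate the chain of transforms described above, here is a minimal NumPy sketch. The matrices and the translation values are made up for illustration; in practice the tracker pose comes from the tracking stream and the calibration matrix from the Calibration Wizard:

```python
import numpy as np

# Hypothetical 4x4 homogeneous matrices (identity plus a made-up translation).
T_world_tracker = np.eye(4)                 # tracker pose from the tracking stream
T_tracker_image = np.eye(4)                 # calibration matrix (image -> tracker)
T_tracker_image[:3, 3] = [1.0, 2.0, 3.0]    # illustrative translation only

# A point on the ultrasound frame in image coordinates, homogeneous (z = 0
# because the point lies on the 2D frame).
p_image = np.array([10.0, 20.0, 0.0, 1.0])

# Chain the transforms: world <- tracker <- image
p_world = T_world_tracker @ T_tracker_image @ p_image
print(p_world[:3])  # -> [11. 22.  3.]
```

With real data, both matrices vary per frame, so the multiplication has to be repeated for each frame of interest.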
The “Ultrasound Stream” doesn’t contain any tracking information, therefore you cannot apply the calibration matrix to it directly. The correct way to apply the calibration matrix is to use the “Record Ultrasound Sweeps” algorithm. To open it, select both the “Ultrasound Stream” and the tracking stream, right click, Ultrasound => Record Ultrasound Sweeps.
When you open this algorithm you should see the Ultrasound frames in the 3D views together with a 3D model of the probe. Inside the controller of this algorithm, under “Advanced settings”, you can find the Calibration tab and you can set the calibration there.
Alternatively, if you have already recorded a sweep, you can edit its calibration by opening “Sweep Properties” (right click on the sweep, Ultrasound => Sweep Properties) and going to the Calibration tab.
Our UI currently doesn’t offer the possibility of getting the World coordinates of a point on an ultrasound frame. However, this would be very easy to do in code both from Python or C++. Let me know which language you prefer and I can provide you with a small example.
Thanks a lot for your valuable feedback and for answering so fast. Also, excuse me for replying this late; I was out of office for some time.
I have tested what you recommended, i.e. using the “Record Ultrasound Sweeps” algorithm, with the calibration matrix obtained from the “Calibration Wizard” entered in its Calibration tab, to record some images. However, I still do not see how I can take advantage of having introduced the calibration matrix into the recording. Since we are introducing the transformation from probe to tracker, I was expecting that I could interact with the pixels to see their coordinates w.r.t. the tracking frame, but I did not manage to find anything like that.
Regarding the example code, it would be great if you could provide it in Python.
After introducing the calibration matrix into “Record Ultrasound Sweeps” you can verify that it is correct and correctly applied by moving the ultrasound probe back and forth and up and down, and checking that the probe motion is consistent with what you see in the live 3D view.
Furthermore, with a correct calibration, once you record an ultrasound sweep you can visualize it in 3D (both in the 3D view and in the MPRs).
After recording a sweep you can export it as an ImFusion file (select the sweep, right click, Export, ImFusion file). After that, you can use this Python code to go from pixel to world coordinates:
import imfusion as imf
import numpy as np

def pixel_to_world(sweep, frame, x, y):
    # Combine the frame's tracking/calibration transform with the
    # pixel-to-image matrix of the sweep's image descriptor.
    M = sweep.matrix_to_world(frame) @ sweep.get().descriptor.pixel_to_image_matrix
    # Homogeneous pixel coordinates; z = 0 because the point lies on the frame.
    p = M @ np.asarray([x, y, 0, 1])
    return p[:3]

sweep = imf.load("/path/to/sweep.imf")[0]
print(pixel_to_world(sweep, 0, 10, 20))
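If you also need the opposite direction (from a world-space point back onto the frame), you can invert the combined matrix. The sketch below uses a made-up matrix M standing in for the tracking @ calibration @ pixel-to-image product from the snippet above:

```python
import numpy as np

# Made-up combined 4x4 matrix for illustration; in practice build it from the
# sweep's tracking, calibration, and pixel-to-image matrices as shown above.
M = np.eye(4)
M[:3, 3] = [5.0, -2.0, 7.0]

def world_to_pixel(M, p_world):
    """Map a world-space point back onto the ultrasound frame."""
    p = np.linalg.inv(M) @ np.append(p_world, 1.0)
    # p[2] should be ~0 if the world point actually lies on the frame plane.
    return p[:2]

print(world_to_pixel(M, np.array([15.0, 18.0, 7.0])))  # -> [10. 20.]
```

Note that this only gives a meaningful pixel coordinate for world points lying on (or very close to) the plane of that particular frame.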
Thanks a lot again for the information and for the code!
May I also ask one more question: where exactly does the US image coordinate system lie? As far as I have seen so far it is at the middle point of the US image, but I would still like to verify it with someone from the support.
Hi Marcos,
Sorry for the delayed response!
The image coordinate system is indeed placed at the center of the image. This is done to increase numerical stability (e.g. in optimization problems such as image registration). The pixel coordinate system, on the other hand, has its origin at the upper-left corner.
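As a small illustration of the two conventions, here is a sketch converting upper-left pixel coordinates to centered image coordinates. The image size and spacing are made up, and the exact center convention ((size - 1) / 2 here) is an assumption; please check it against the SDK if sub-pixel accuracy matters:

```python
# Made-up frame geometry for illustration only.
width, height = 512, 512   # pixels
spacing = 0.1              # mm per pixel (assumed isotropic)

def pixel_to_image(x, y):
    """Shift pixel coordinates (origin: upper-left corner) to the centered
    image coordinate system (origin: image center), in mm."""
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    return (x - cx) * spacing, (y - cy) * spacing

print(pixel_to_image(255.5, 255.5))  # -> (0.0, 0.0), the image center
```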
Hope this helps!