Import tracking stream through ROS

Hello everyone,

We are working with an NDI Aurora and an ultrasound scanner to do ultrasound calibration. We are able to connect the Aurora to the ImFusion Suite using ROS, and we can display the tracking stream correctly in ImFusion. However, when we run the ultrasound calibration wizard it seems that the sensor is no longer being tracked, and any movement or sweep made with the sensor probe generates an error (such as ‘The probe was not moved on a straight line’). Do you have any suggestions on this problem? Would it be better to connect the Aurora with an IGT connection?

Thank you in advance,

Riccardo.

Hi Riccardo,

On the first page of the wizard, there is a tracking stream selection. Could you verify that the correct tracking stream is selected? In the wizard, do you see the sphere moving?

In theory, it should not matter whether the connection is via ROS or via an IGT connection. Both should generate a TrackingStream that the calibration wizard can consume.

Hi Martin,

Thanks for your answer!
In the calibration wizard I can see the correct tracking stream, but I cannot see the sphere moving, even though I can see the tracking data in the tracking visualization. I haven’t tried the IGT connection for the NDI Aurora in the calibration wizard yet. I tried it with an NDI Polaris and it worked. Have you ever encountered this problem before?

Could you record some of the data and send it over?

  1. Record a sample of the ROS tracking data that fails
  2. Record a sample of the NDI Polaris data that worked

We have not encountered this specific issue before. Investigating the tracking sequences might give some insight into the problem.
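
If it helps, here is a minimal sketch of one way to dump the ROS tracking data to a file for sharing. This assumes the Aurora poses are published as geometry_msgs/PoseStamped on a topic such as /aurora/pose; the topic name and output file name are placeholders for your setup:

import csv

import rospy
from geometry_msgs.msg import PoseStamped

OUTPUT_FILE = "aurora_tracking_sample.csv"  # placeholder output file name


def main():
    rospy.init_node("tracking_recorder", anonymous=True)
    with open(OUTPUT_FILE, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["stamp", "x", "y", "z", "qx", "qy", "qz", "qw"])

        def callback(msg):
            # One CSV row per received pose: timestamp, position, orientation quaternion.
            p, q = msg.pose.position, msg.pose.orientation
            writer.writerow([msg.header.stamp.to_sec(),
                             p.x, p.y, p.z, q.x, q.y, q.z, q.w])

        rospy.Subscriber("/aurora/pose", PoseStamped, callback)  # adjust topic/message type
        rospy.spin()  # record until the node is shut down (Ctrl+C)


if __name__ == "__main__":
    main()

Alternatively, running "rosbag record" on the tracking topic captures the raw messages directly.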

Hi Martin,

Sorry for the delay in my reply, but I had limited access to the equipment and could not repeat the test last week. I don’t know what happened last time, but the tracking data stream through ROS is working now. I think the problem was related to a faulty EM sensor.
By the way, I still have a couple of questions on how to perform an EM calibration correctly (let me know if it’s better to open a new thread, but in the meantime I’ll post my questions here).
In my case I want to perform a US calibration using the Aurora EM sensor and a GE scanner with a transesophageal probe.
My first question concerns the image data stream. So far I have been using a frame grabber to stream the US scanner screen into the ImFusion software. I have noticed that the quality of the stream is very low and I am concerned that this may affect the calibration result. Is there a better way to stream the images? I saw in the documentation that ImFusion is supposed to use the GE API for the image stream, but I could not find any more information on how to use it and whether it is exposed in the user interface.
The second question concerns the calibration phantom. So far we have used the calibration phantom provided here: GitHub - ImFusionGmbH/PRO-TIP-Automatic-Ultrasound-Calibration. We have seen that it is different from the one used in the video tutorial. Can the shape of the phantom affect the calibration result? And if so, do you have any suggestions on what a good phantom shape would be for a proper calibration?
My last question concerns the use of the transesophageal probe. Your calibration algorithm is designed to be used with external US probes, but in my opinion it can be adapted to a transesophageal probe. I have seen in a couple of previous discussions that the calibration result may depend on how the sweeps are performed (painting style). Since performing this movement could be complicated with a transesophageal probe, we are thinking of 3D printing a holder to attach to the probe so that the sweeps are easier to perform. Do you think this is a viable option?

Thanks in advance,
Riccardo.

  1. About the image stream: are you referring to the Doppler image, or simply the image stream framerate?
    If it’s the image stream framerate: the GEAppAPI is not included in Academic packages. Also, the GE machine has to support it and the option needs to be turned on, which is only available on some systems. It could also be the configuration of the grabber, or simply that the receiving system is not powerful enough. What is the current framerate?
  2. The phantom used in the video is a 3D printed phantom; we can make the phantom model available. If the cone calibration already works, there is no need to change the calibration setup. Of course, for an image-based calibration the shape of the phantom will affect how well determined the calibration matrix is. Rotational symmetries, for example, can lead to erroneous calibrations. If the user does not cover enough degrees of freedom of the tracking while sweeping, the matrix will be ill-defined. Typically the issue is not enough rotation, which does not constrain the Z axis enough (see the sketch after this list for a rough way to check the rotational coverage of a sweep).
  3. As for printing a holder, that makes sense, whatever makes the handling of the probe easier. Of course the important part is that the EM tracker does not move with respect to the US transducer array when the handle is attached/removed.
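
For what it’s worth, here is a rough sketch of how the rotational coverage of a recorded sweep could be checked from its tracking matrices (plain numpy/scipy; the variable matrices stands for the per-frame 4x4 tracking matrices of the sweep, however you obtain them, and the 20 degree threshold is just an illustrative number):

import numpy as np
from scipy.spatial.transform import Rotation

def rotation_coverage_deg(matrices):
    # Rotation of every frame relative to the first frame, so the result does
    # not depend on how the tracking coordinate system happens to be oriented.
    rotations = Rotation.from_matrix(np.asarray([m[:3, :3] for m in matrices]))
    relative = rotations[0].inv() * rotations
    euler = relative.as_euler("xyz", degrees=True)
    # Spread (max - min) of the rotation around each axis over the whole sweep.
    return euler.max(axis=0) - euler.min(axis=0)

# Hypothetical usage:
# coverage = rotation_coverage_deg(matrices)
# print("rotation range (deg) around x/y/z:", coverage)
# if coverage[:2].max() < 20:   # illustrative threshold, not a hard rule
#     print("Very little tilting - the calibration may be poorly constrained")
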
  1. I am referring to the framerate of the image stream. We potentially have the licence to activate the image stream from the GE machine, but unfortunately we have an academic licence. We are using a Magewell grabber to capture the stream.

  2. The 3D printed phantom worked, but the calibration error is still high. We tried to compare the EM positions of the cone tips with the positions of the cone tips acquired from the image and then brought back into EM space through the calibration matrix. To do this, we acquired some sweeps of the cones and manually selected the tip position in the image. We then retrieve the position using the Python imfusion library in the following way:

import numpy as np

def pixel_to_world(sweep, frame, x, y):
    # Map pixel coordinates (x, y) of the given frame into world (EM) space:
    # pixel -> image coordinates, then image -> world via the frame's tracking matrix.
    M = sweep.matrix_to_world(frame) @ sweep.get().descriptor.pixel_to_image_matrix
    p = M @ np.asarray([x, y, 0, 1])
    return p[:3]
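
For completeness, the comparison itself is roughly the following (plain numpy; em_tips and image_tips are placeholder names for the Nx3 arrays of corresponding tip positions, both in EM space):

import numpy as np

def mean_tip_error(em_tips, image_tips):
    # Per-tip Euclidean distance between the EM-measured tip positions and the
    # image-derived positions mapped into EM space with pixel_to_world().
    errors = np.linalg.norm(np.asarray(em_tips) - np.asarray(image_tips), axis=1)
    return errors.mean(), errors

# mean_err, per_tip = mean_tip_error(em_tips, image_tips)
# print(f"mean error: {mean_err:.1f} mm")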

I attach a 3D image of the calibration results. There seems to be a big error along the vertical direction.

Figure_1

Do you have any suggestions on how to improve the calibration result? So far we have got an average Euclidean distance of about 24 mm.

Hi Riccardo,

In general, if there is a large error in the Z-axis translation, one should “add more X or Y axis rotation” to give the optimization more data on where the Z translation should be. This means tilting the probe more during the sweeping motion. Imagine the opposite case where we don’t tilt and just translate the probe horizontally: then any Z translation would be a valid calibration to match the images from both sweeps.
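
To illustrate the geometry with a toy example (plain numpy, not ImFusion code): if every frame has the same probe orientation, changing the Z translation of the calibration shifts all reconstructed points by one and the same world-space vector, so the reconstruction stays self-consistent and nothing pins the Z translation down. Tilting the probe makes that shift frame-dependent and therefore observable:

import numpy as np

def make_pose(rotation, translation):
    # Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector translation.
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Tracking poses of a sweep that only translates the probe (same orientation everywhere).
poses = [make_pose(np.eye(3), [x, 0.0, 0.0]) for x in np.linspace(0, 50, 6)]

# Two calibration guesses that differ only in their Z translation.
calib_a = make_pose(np.eye(3), [0.0, 0.0, 0.0])
calib_b = make_pose(np.eye(3), [0.0, 0.0, 10.0])

point = np.array([5.0, 3.0, 0.0, 1.0])  # some point in image coordinates (mm)

world_a = np.array([(T @ calib_a @ point)[:3] for T in poses])
world_b = np.array([(T @ calib_b @ point)[:3] for T in poses])

# Every frame is shifted by exactly the same vector, so the two calibrations
# are indistinguishable from such a sweep alone.
print(world_b - world_a)   # every row is [0, 0, 10]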

As for the image stream being of low quality, what are the settings of the Magewell grabber?