No images after start acquisition

We are using Aravis with a Mv4 camera from PhotonFocus. We have a small patch to Aravis to detect the camera and apply the required features to enable acquisition (once it is working properly, we will upstream the patch). We use a long-lived stream object and start/stop acquisition from time to time. Our application needs to capture images for short periods of time (100 ms to 1 s) at a frame rate of around 200 Hz. Afterwards there is a short break, and then the cycle starts again.
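For context, the cycle looks roughly like the following sketch (assuming the Aravis 0.8 C API; the frame rate, buffer count and the PhotonFocus-specific feature setup are placeholders rather than our actual code):

    #include <arv.h>

    /* Sketch of our pattern: long-lived camera and stream objects,
     * acquisition started and stopped once per measurement cycle.
     * Error handling is mostly omitted for brevity. */
    static void
    run_cycles (guint n_cycles)
    {
        GError *error = NULL;
        ArvCamera *camera;
        ArvStream *stream;
        guint payload, i;

        camera = arv_camera_new (NULL, &error);           /* first camera found */
        if (camera == NULL) {
            g_clear_error (&error);
            return;
        }

        arv_camera_set_frame_rate (camera, 200.0, NULL);  /* ~200 Hz (placeholder) */

        stream = arv_camera_create_stream (camera, NULL, NULL, &error);
        if (stream == NULL) {
            g_clear_error (&error);
            g_object_unref (camera);
            return;
        }

        /* Pre-allocate a pool of buffers for the stream (count is arbitrary here). */
        payload = arv_camera_get_payload (camera, NULL);
        for (i = 0; i < 50; i++)
            arv_stream_push_buffer (stream, arv_buffer_new (payload, NULL));

        for (i = 0; i < n_cycles; i++) {
            arv_camera_start_acquisition (camera, NULL);

            /* ... pop buffers, process images, push them back, for 100 ms to 1 s ... */

            arv_camera_stop_acquisition (camera, NULL);

            /* ... short break before the next cycle ... */
        }

        g_object_unref (stream);
        g_object_unref (camera);
    }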

This works well for quite some time, but after about 900 such cycles we no longer get any images from the camera. All error checks succeed: from the software and API side everything looks fine, there are just no more images coming in. This does not improve until we restart the software. Unfortunately we do not see a pattern; the number of images captured (about 130000) and the number of cycles vary each time the problem occurs.

Do you have any idea what could trigger this problem and how we could debug it further?

Hi,

If I have understood correctly, the video stream stops working, but if you just stop and restart your software, it works again?

What would help is a small test application that exhibits this issue. Please open an issue on GitHub.

Regarding the small patch that makes aravis work with your camera, don't hesitate to create a merge request, even if it is still a work in progress. That way we can comment on the patch sooner, and most importantly your work will not be lost if for some reason you cannot finish it.

I’m not sure whether restarting the software would be sufficient. We actually restart the whole system, which might change the situation even more.

I’ll try to develop a test application which shows the problem.

I tried a simplified test application and had it running from Friday to Monday without any problems…

We have now started testing the complete system again and were able to trigger the problem repeatedly after as little as 10 minutes. Interestingly, we can see that the camera is still triggering images: the external lighting still receives the trigger signal.

We have now slightly changed the setup and haven't hit the issue for an hour.

In the code which shows the problem we used to do:

    arv_camera_stop_acquisition (m_arvCamera, &error);
    arv_stream_set_emit_signals (m_arvStream, FALSE);

and:

    arv_stream_set_emit_signals (m_arvStream, TRUE);
    arv_camera_start_acquisition (m_arvCamera, &error);

We have now changed that to call set_emit_signals with TRUE only once, right after creating the stream. Does it make sense that calling these functions all the time could cause problems, especially when there is only a short time (< 100 ms) between stopping and starting acquisition?
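So the flow is now roughly this (a sketch using the same member variables as above, error handling omitted):

    /* Once, right after creating the stream: */
    arv_stream_set_emit_signals (m_arvStream, TRUE);

    /* Per cycle, only acquisition is toggled: */
    arv_camera_start_acquisition (m_arvCamera, &error);
    /* ... capture for 100 ms to 1 s ... */
    arv_camera_stop_acquisition (m_arvCamera, &error);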

Normally, it should not matter. set_emit_signals just sets a boolean protected by a mutex. It does not really make sense to call this function at every start/stop, though. It only stops the signal emission; it does not prevent the stream receiving thread from pushing a buffer to the incoming queue. The next time the signal is re-enabled, you may find more than one buffer in the queue.
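If those leftover buffers are a problem for your processing, one option (just a sketch using the public ArvStream API) is to drain the incoming queue and hand the buffers back to the stream before restarting acquisition:

    /* Return any frames that arrived while signals were disabled
     * straight to the stream's buffer pool, without processing them. */
    ArvBuffer *buffer;

    while ((buffer = arv_stream_try_pop_buffer (stream)) != NULL)
        arv_stream_push_buffer (stream, buffer);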

We received new firmware from the vendor, and since then we haven't seen the issue any more (a few weeks of testing). So I assume this was a firmware issue and not an issue with aravis.

Hi,

Nice to hear. Thanks for the follow-up.

Cheers.