I am new to Aravis and trying to understand more about it: where to start, how to start, what data rates Aravis can handle, and how it compares with standard GigE Vision cores implemented in firmware.
Please point me in the right direction.
The maximum bandwidth achieved using Aravis depends on the hardware running the library, and also on your network setup.
With a camera connected directly to the receiving machine, there should be no issue receiving a full 1 Gb/s GigE stream on an Intel platform, even without system tuning.
Higher throughput can be achieved using the packet socket interface and Ethernet jumbo frames.
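For reference, here is a minimal sketch of the host-side tuning mentioned above. This is not from the Aravis documentation, just common Linux receive-side tuning; `eth0` is a placeholder for your camera-facing interface, and both your NIC and any switch in between must support jumbo frames.

```shell
# Enable Ethernet jumbo frames on the camera-facing interface
# ("eth0" is a placeholder -- substitute your actual interface name).
# Requires root privileges.
ip link set eth0 mtu 9000

# Enlarge the kernel's maximum and default socket receive buffers so the
# stream thread can absorb packet bursts without drops (32 MiB here is
# an illustrative value, not an Aravis recommendation).
sysctl -w net.core.rmem_max=33554432
sysctl -w net.core.rmem_default=33554432
```

After raising the MTU you would also want the camera's GigE Vision packet size to match, e.g. via `arv_camera_gv_set_packet_size()` or the auto-negotiation Aravis performs at stream creation.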
Thank you very much Emmanuel. Appreciate the efforts for building such a project.
Has anyone used it to build actual camera hardware, as far as you know?
I am planning on running it in Linux on an FPGA board. Do you have any input or pointers?
Aravis is not really designed to be used for an image source implementation, but it can serve as a source of inspiration.
If you don’t plan to publish your source code, you should work from the GigE Vision or USB3 Vision standard documentation instead.
I don’t have a pointer to an FPGA implementation, but you may have a look at João’s implementation of a USB3 Vision source on an ARM platform: https://aravis-project.discourse.group/t/aravis-on-ultrascale-using-usb3/124/5