I've measured the decoding time with a loopback flow graph (without a USRP; examples/wifi_loopback.grc).
The result is that decoding a signal of 9,000 samples (each sample is either +1 or -1) takes 5,000 to 30,000 us, i.e. 5 to 30 ms.
* Test environment: Ubuntu 14.04 on VMware, 2 CPUs and 4 GB RAM allocated
* Host environment: Windows 7 on an i7-3770 @ 3.7 GHz
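
To isolate the decoder itself from the rest of the flow graph, this is roughly how I would time IT++'s soft-decision Viterbi decoder on an input of comparable size. This is only a sketch: I assume the standard 802.11 rate-1/2, constraint-length-7 code with octal generators 0133/0171, and IT++'s Convolutional_Code and BPSK classes.

    #include <itpp/itcomm.h>
    #include <chrono>
    #include <iostream>

    int main() {
        // 802.11 convolutional code: rate 1/2, constraint length 7
        itpp::Convolutional_Code code;
        itpp::ivec generator(2);
        generator(0) = 0133;  // octal literals, as in the IT++ docs
        generator(1) = 0171;
        code.set_generator_polynomials(generator, 7);

        // 4,500 info bits -> ~9,000 coded samples at rate 1/2
        itpp::bvec tx_bits = itpp::randb(4500);
        itpp::bvec coded = code.encode_tail(tx_bits);

        // Map bits to +/-1 soft values, like the loopback input
        itpp::BPSK bpsk;
        itpp::vec soft = bpsk.modulate_bits(coded);

        itpp::bvec decoded;
        auto t0 = std::chrono::steady_clock::now();
        code.decode_tail(soft, decoded);
        auto t1 = std::chrono::steady_clock::now();

        std::cout << "decode_tail: "
                  << std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count()
                  << " us, correct = " << (decoded == tx_bits) << "\n";
        return 0;
    }

(Build with g++ and link against IT++, e.g. using the flags from `itpp-config --libs`.)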
Since I am not familiar with error-correcting codes, I have no idea what order of magnitude of decoding time to expect. But I think Viterbi is one of the most efficient decoding algorithms, and IT++ presumably uses it.
If so, I can deduce that convolutional code (CC) decoding takes quite a long time even though the algorithm (Viterbi) is very efficient. For instance, if I assume the 20 MHz sampling rate of 802.11a/g, 9,000 samples correspond to only 450 us on the air, so a 5-30 ms decode is more than ten times slower than real time. Is this a natural limitation of software decoding and SDR?
Another question: commercial off-the-shelf (COTS) Wi-Fi devices achieve very high throughput, which must rely on much faster CC decoding. Is that because COTS devices use heavily optimized FPGAs and dedicated decoding chips?
Regards,
Jeon.