Has anyone had issues with how the GNU Radio UDP source interprets shorts when they are transmitted from a big-endian platform? In my situation I am transmitting UDP packets composed of 16-bit samples from an AVR32 (big endian) running the lwIP open-source TCP/IP stack.

On the GNU Radio destination PC (Intel Pentium D CPU, little endian) I construct a simple flowgraph (using GRC/GNU Radio v3.2.1): a UDP source set to interpret the incoming data as shorts, into the short-to-float block, and then straight into the GUI scope. I'm receiving the UDP packets OK, which I assume means that all the protocol header info is being interpreted with the correct endianness, but the waveform displayed in the scope is corrupt until, that is, I manually swap the LSB and MSB of each transmitted sample at the AVR32 end.

Normally the lower-level network code would take care of byte reordering as required to match network byte order to the relevant host byte order, but this doesn't appear to be happening on the GNU Radio side. I must be missing something simple here; can anyone shed some light?
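For illustration, here is a minimal Python sketch (outside GNU Radio) of the mismatch I believe I'm seeing. The assumption that the UDP source simply copies the raw payload into host-order shorts is mine; the snippet just shows what happens when a big-endian 16-bit sample is reinterpreted on a little-endian host without swapping:

```python
import struct

# A 16-bit sample as sent by the big-endian AVR32 (network byte order).
sample = 0x1234
wire_bytes = struct.pack('>h', sample)  # bytes on the wire: b'\x12\x34'

# What a little-endian host sees if it reinterprets the payload without
# swapping -- my assumption of what the UDP source effectively does.
(misread,) = struct.unpack('<h', wire_bytes)   # 0x3412, garbage sample

# Explicitly unpacking as big-endian recovers the original value,
# which matches what I observe after swapping bytes at the AVR32 end.
(correct,) = struct.unpack('>h', wire_bytes)

print(hex(misread), hex(correct))  # 0x3412 0x1234
```

If the payload really does arrive unswapped, a possible PC-side workaround would be something like `numpy.frombuffer(payload, dtype='>i2')` to declare the samples big-endian on receipt, rather than reordering at the AVR32 end.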