Re: [Discuss-gnuradio] Introducing noise/ considerable BER

From: Marcus D. Leech
Subject: Re: [Discuss-gnuradio] Introducing noise/ considerable BER
Date: Tue, 09 Aug 2011 20:22:16 -0400
User-agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv: Gecko/20110621 Fedora/3.1.11-1.fc14 Thunderbird/3.1.11

Keep in mind the old information theorist's adage: if you don't have
bit errors, you're using too much power! (ok, I don't know how old
that is; fred harris always quotes it, but he credits someone else
with it, probably Tony Constantinides).

In other words, we normally design our systems around having bit
errors, and indeed we recognize that they are unavoidable except under
extreme SNR conditions. To compensate, you really want to use some
kind of channel coding. The way things are in our benchmark code, a
single bit error means that an entire packet is lost.
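The "one bit error loses the whole packet" point is easy to quantify. A hedged sketch, assuming independent bit errors and no FEC (an illustrative model, not the benchmark code itself):

```python
# Packet error rate when a single bit error loses the whole packet
# (no FEC), under an independent-bit-error channel model.

def packet_error_rate(ber, packet_bits):
    """P(at least one bit error in the packet) = 1 - (1 - BER)**n."""
    return 1.0 - (1.0 - ber) ** packet_bits

# A 1500-byte packet at a seemingly modest BER of 1e-5:
per = packet_error_rate(1e-5, 1500 * 8)
print(f"{per:.3f}")  # roughly 0.11 -- about one packet in nine is lost
```

Even a BER that looks harmless on paper translates into a painful packet loss rate once packets are thousands of bits long.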


In radio systems, I agree--bit errors are a fact of life, and you can cope with them either with protocol design or "frame design". The trend over the last couple of decades for radio systems has been to incorporate some sort of FEC to reduce the impact of channel distortions--the receiver can simply reconstruct the damaged bits from the FEC data, or force a re-transmit. Systems that use FEC almost always assume there's a higher-layer protocol mechanism in place for dealing with packets that were too damaged to decode and thus must be re-transmitted.
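To make the FEC idea concrete, here's a toy (3,1) repetition code with majority-vote decoding -- a sketch of the simplest possible FEC, purely for illustration; no real radio system would use anything this weak:

```python
# Toy (3,1) repetition code: each bit is sent three times, and the
# receiver takes a majority vote. Any single bit error per triple
# is corrected without a re-transmit.

def fec_encode(bits):
    return [b for b in bits for _ in range(3)]

def fec_decode(coded):
    decoded = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        decoded.append(1 if sum(triple) >= 2 else 0)
    return decoded

data = [1, 0, 1, 1]
coded = fec_encode(data)
coded[1] ^= 1  # flip one channel bit in transit
assert fec_decode(coded) == data  # the single error is corrected
```

The cost is the usual FEC trade-off: here you spend 3x the channel bits to buy tolerance of one error per triple. Real codes (convolutional, LDPC, turbo) get far better protection for far less overhead.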

On the other hand, there are plenty of extant *wired* communications systems in which bit errors are exceedingly rare. The various Ethernet standards, for example, assume that bit errors aren't common, and there's no FEC (at least at the 100Mbit and 10Mbit levels--I'm not sure about 1000Mbit and 10000Mbit).
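What Ethernet does have is error *detection*: the frame check sequence is a CRC-32, which lets the receiver notice corruption and drop the frame, but not repair it. A quick sketch using Python's zlib (whose crc32 uses the same IEEE 802.3 polynomial as Ethernet's FCS):

```python
import zlib

# Ethernet-style error detection: a CRC-32 over the frame. A mismatch
# at the receiver means the frame is silently dropped -- recovery is
# left to higher layers, not to the link.

frame = b"example payload"
fcs = zlib.crc32(frame)  # transmitter appends this to the frame

corrupted = bytearray(frame)
corrupted[0] ^= 0x01  # flip a single bit in transit

assert zlib.crc32(frame) == fcs                    # clean frame passes
assert zlib.crc32(bytes(corrupted)) != fcs         # damaged frame is caught
```

Detection-only is a perfectly sensible design when, as on good wired links, a damaged frame is a rare event.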

The problem is that many communications/networking engineers who are new to radio don't really understand, on a visceral level, that the radio channel differs from a wired one not just in the degree but in the kind of channel distortions. And further, their experience with a wireless channel model may include only simulations, rather than the "real world".

My very earliest internet connection at home, back in the mid-to-late 1980s, was wireless: an amateur-radio 56kbps link using a "split repeater" on 220 MHz and 432 MHz. It wasn't a very nice environment: various RFI issues, hidden-terminal issues, collisions, multi-path, receiver de-sense, and a complete lack of any FEC. Given all of that, I'm stunned that LTE and WiFi and all their modern friends work at all :-)

Marcus Leech
Principal Investigator
Shirleys Bay Radio Astronomy Consortium
