
Re: [avr-chat] Ascii to BCD conversion


From: Zane D. Purvis
Subject: Re: [avr-chat] Ascii to BCD conversion
Date: Wed, 17 Aug 2005 23:12:43 -0400



On Aug 17, 2005, at 12:23 PM, Kingston Co. wrote:

Does anyone have any suggestions for converting ASCII input from the UART into BCD to send to a peripheral I2C clock chip?  My C is not as good as it should be.


To convert a character from ASCII to its binary value, take a look at a copy of the ASCII table:  
    '0' is 0x30  (decimal 48)
    '1' is 0x31  (decimal 49)
    ...
    '9' is 0x39  (decimal 57)
The table was designed like that intentionally.  Just subtract 0x30 (decimal 48) from the ASCII character to get the binary representation of that digit:

    #include <inttypes.h>
    ...
    uint8_t ascii;   /* ASCII digit character, e.g. '7' (0x37) */
    uint8_t bcd;     /* numeric value of that digit, 0-9 */
    bcd = ascii - 0x30;   /* equivalently: ascii - '0' */

Alternatively, you can use (ascii & 0x0F) to mask off the upper bits, since for a digit character the value you want is already in the least significant 4 bits.
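Since I2C clock chips generally expect packed BCD (tens digit in the high nibble, ones digit in the low nibble), a minimal sketch of combining two ASCII digits into one register byte might look like the following.  The helper name ascii_pair_to_bcd and the two-character input are just assumptions for illustration, and it assumes both characters have already been checked to be '0'..'9':

    #include <inttypes.h>

    /* Hypothetical helper: pack two ASCII digits, e.g. '4' and '5',
       into one packed-BCD byte (0x45).  Assumes both characters are
       valid digits '0'..'9'. */
    uint8_t ascii_pair_to_bcd(uint8_t tens, uint8_t ones)
    {
        return (uint8_t)(((tens - '0') << 4) | (ones - '0'));
    }

    /* Example: the UART delivers "45" for 45 seconds. */
    uint8_t seconds = ascii_pair_to_bcd('4', '5');   /* 0x45 */

The resulting byte can then be written as-is to the chip's seconds/minutes/hours register over I2C.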
