An Introduction to BCD

If you work with real-time clock (RTC) peripherals, you may run into a very strange data encoding format: binary-coded decimal, or BCD. In this post, I’ll describe how BCD works, but I can’t give much of a history or an explanation of why it’s still around. It’s a mystery.


An RTC is a chip that keeps track of the real time (wall clock time). It’s becoming more common for embedded microprocessors to include an RTC, and it’s very common for these RTCs to store the time in BCD format, but the manufacturers may not tell you how BCD works.

Not all RTCs exclusively use BCD; some are configurable for binary, and others just accumulate milliseconds or seconds. But I find that the common chips in embedded systems use BCD.

Since RTCs keep track of wall clock time, the most basic data items are the time and the date. The time, say 23:40:13, would be kept in 3 bytes, one containing the number 23, one containing 40, and one containing 13, 14, 15, 16 (I type slowly). It keeps changing; it’s a clock after all.

But these bytes don’t hold plain decimal values. The hour byte, say 23, actually contains the BCD value 0x23 (decimal 35). And since we’re dealing with time and hours, the next value after 0x23 is 0x00, because we’ve just gone over a day. That seems reasonable.

BCD, as used in RTCs, stores each decimal digit in a separate nibble.  So the decimal number 42 is stored with 4 in one nibble and 2 in another. Of course decimal 42 should be hex 0x2A, but in BCD 42 is stored as 0x42.
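This packing can be sketched in C with a pair of round-trip helpers (the function names are mine, not from any particular vendor library):

```c
#include <stdint.h>

/* Pack a two-digit decimal value (0-99) into BCD: 42 -> 0x42.
 * The tens digit goes in the high nibble, the units in the low. */
uint8_t dec_to_bcd(uint8_t dec)
{
    return (uint8_t)(((dec / 10) << 4) | (dec % 10));
}

/* Unpack a BCD byte back to decimal: 0x42 -> 42. */
uint8_t bcd_to_dec(uint8_t bcd)
{
    return (uint8_t)(((bcd >> 4) * 10) + (bcd & 0x0F));
}
```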

Debugging RTC code is quite confusing. If you display the time register in your debugger, it gets shown in hex and the values superficially look correct. But your code will read seemingly random numbers until you figure out that the values have to be converted from this weird hex-ish format to decimal.

[Image: bit layout of the time register (RTC_TR), showing the HT/HU, MNT/MNU, and ST/SU fields]

This is the time register from the STMicroelectronics STM32F407’s RTC peripheral. Notice that the tens digits for hours (HT), minutes (MNT), and seconds (ST) are less than 4 bits wide: they don’t have to span the whole range of 0 through 9, and some RTCs save bits by allocating only as many as are absolutely necessary.
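Unpacking those fields is just shifts and masks. Here is a sketch for the STM32F4; the bit positions are taken from my reading of the STM32F4 reference manual, so double-check them against your own datasheet before relying on them:

```c
#include <stdint.h>

typedef struct { uint8_t hours, minutes, seconds; } rtc_time_t;

/* Decode hours/minutes/seconds from a raw STM32F4 RTC_TR word.
 * Assumed field positions: HT [21:20], HU [19:16], MNT [14:12],
 * MNU [11:8], ST [6:4], SU [3:0] -- verify against your datasheet. */
rtc_time_t decode_rtc_tr(uint32_t tr)
{
    rtc_time_t t;
    t.hours   = (uint8_t)(((tr >> 20) & 0x3) * 10 + ((tr >> 16) & 0xF));
    t.minutes = (uint8_t)(((tr >> 12) & 0x7) * 10 + ((tr >>  8) & 0xF));
    t.seconds = (uint8_t)(((tr >>  4) & 0x7) * 10 + ( tr        & 0xF));
    return t;
}
```

Note that a BCD register read in hex is pleasantly self-describing: the raw word 0x234013 decodes to 23:40:13.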

Now let’s look at the minutes. A minute value of 40 is stored as BCD 0x40 and it will count all of the way up to BCD 0x49, then roll over to BCD 0x50. But if you forget to convert the values from BCD, you’ll get a very weird pattern in decimal. Like so:

BCD   Decimal
0x39  57
0x40  64
0x41  65
0x42  66
...
0x48  72
0x49  73
0x50  80
0x51  81

The RTC takes care of seconds and minutes having the range 0x00-0x59, hours in the range 0x00-0x23 or 0x01-0x12 depending on how they are configured, as well as months, years, leap years, centuries, and days of the week, all in BCD format.

Time Out

Converting BCD to binary is reasonably simple:

(High nibble portion * 10) + low nibble.

But getting at the digits is a pain, especially since the high digit can be 1, 2, 3, or 4 bits wide.

ST, in their HAL for the STM32, uses this conversion a lot. Every time you get the time from the RTC, you convert it from BCD; every time you set the time, you convert it into BCD. So they provide helper functions like:

 /**
  * @brief  Converts from 2 digit BCD to Binary.
  * @param  Value BCD value to be converted
  * @retval Converted word
  */
 uint8_t RTC_Bcd2ToByte(uint8_t Value)
 {
   uint32_t tmp = 0U;
   tmp = ((uint8_t)(Value & (uint8_t)0xF0) >> (uint8_t)0x4) * 10U;
   return (uint8_t)(tmp + (Value & (uint8_t)0x0F));
 }
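The HAL also provides the reverse helper for writing BCD registers. In the versions I’ve seen it looks roughly like this, but check your own HAL source for the exact code:

```c
#include <stdint.h>

/* Convert a binary value (0-99) into 2-digit BCD: 42 -> 0x42.
 * Modeled on the ST HAL's RTC_ByteToBcd2(); verify against the
 * HAL version you are actually using. */
uint8_t RTC_ByteToBcd2(uint8_t Value)
{
  uint32_t bcdhigh = 0U;

  while (Value >= 10U)   /* count the tens into the high nibble */
  {
    bcdhigh++;
    Value -= 10U;
  }

  return (uint8_t)((uint8_t)(bcdhigh << 4U) | Value);
}
```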

Check the datasheet for your RTC for the specific layout of the date, time, and alarm registers; your format will vary. Also, look in your manufacturer’s libraries to see if they provide BCD helper functions.

Converting to ASCII is even easier: just take each BCD nibble and add 0x30.
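As a sketch (the helper name is mine), the ASCII trick looks like this:

```c
#include <stdint.h>

/* Render a BCD byte as two ASCII digits: 0x42 -> '4', '2'.
 * Adding 0x30 ('0') to a digit value yields its ASCII character. */
void bcd_to_ascii(uint8_t bcd, char out[2])
{
    out[0] = (char)(((bcd >> 4) & 0x0F) + 0x30);  /* tens digit  */
    out[1] = (char)((bcd & 0x0F) + 0x30);         /* units digit */
}
```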

Why BCD?

Why do RTCs use BCD? At this point it seems to be a historic holdover. If you are starved for computing power and are making a clock, you can take the BCD digits and write them into a 7446 BCD-to-7-segment display decoder chip to drive your display. The processor doesn’t have to do any computation, just move the digits to the output ports.

Now, your projects are probably going to be a little more complex than a clock, but you have enough cycles to decode the BCD and make the timestamps in the format that you need.

Why does this persist in new designs? Why not just use binary? I don’t know. I can only assume that the chip manufacturers take a standard RTC circuit layout description and use it again and again, because the RTC isn’t a sexy peripheral and changing the design adds little value.

Tick Tick Tick

So when you get the RTC going on your embedded system, check your data sheet. You may have to convert BCD values into something more useful.

This post is part of a series. Please see the other posts here.

9 thirty something

 Music to work by: Okay, it's edje-a-ma-cation time. This video is a bit over the top, but has a lot of things to say about the current state of music. Why Modern Music is Awful.