C: representing an int in base 2
Solution 1
The standard describes no way to create "binary literals". However, the latest versions of GCC and Clang support this feature, using a syntax similar to the hex syntax except with b instead of x:
int foo = 0b00100101;
As stated, using such binary literals locks you out of Visual Studio's C and C++ compilers, so you may want to take care where and how you use them.
C++14 (I know, I know, this isn't C) standardizes support for binary literals.
Solution 2
You can't do this in C.
But you said you are learning C++. In C++ you can use BOOST_BINARY until C++0x arrives, which will allow user-defined literals.
Please note, however, that it is very easy to become comfortable translating hex to binary and back.
For a given binary number, just group the digits in groups of four and learn that
0000 <-> 0
0001 <-> 1
0010 <-> 2
0011 <-> 3
0100 <-> 4
0101 <-> 5
0110 <-> 6
0111 <-> 7
1000 <-> 8
1001 <-> 9
1010 <-> A
1011 <-> B
1100 <-> C
1101 <-> D
1110 <-> E
1111 <-> F
After a few attempts at doing this translation in your head, you will become very comfortable with it. (Of course, you could do the same with octal, but hex is even more compact than octal.)
For your specific example:
1000000000 -> 10 0000 0000 -> 0010 0000 0000 -> 0x200
Solution 3
In C (and, I believe, C++) there is no binary representation for numbers. However, for the simple number you show, you can use a shortcut:
int i = 1 << 9; /* binary 1 followed by 9 binary 0's */
Solution 4
You can't do this in standard C—the language does not cater for specifying integer literals in binary.
dan.dev.01
Updated on January 06, 2020

Comments
- dan.dev.01, over 4 years ago:
Possible Duplicate: Can I use a binary literal in C or C++?

I am learning C and I recently found out that we can represent integers in different ways. (Assuming i has a "human-readable" value of 512.) Here are the representations:

Decimal:
int i = 512;
Octal:
int i = 01000;
Hexadecimal:
int i = 0x200;
In base 2 (binary representation) 512 is 1000000000. How do I write this in C? Something like int i = 1000000000b? This is funny, but unfortunately no C compiler accepts that value.