I was looking at some of the solutions in Google Code Jam and some people used these things that I had never seen before. For example, what do `2LL` and `1LL` mean?
Their includes also looked different from what I'm used to.
`LL` makes the integer literal of type `long long`. So `2LL` is a `2` of type `long long`. Without the `LL`, the literal would only be of type `int`.
This matters when you're doing stuff like `1 << 40` versus `1LL << 40`. With just the literal `1`, and assuming `int` is 32 bits, you shift beyond the width of the integer type, which is undefined behavior. With `1LL`, the operand is of type `long long` beforehand, and the expression properly yields 2^40.