What is the behaviour of the sizeof() operator with a class having one or more inner classes? Let's consider the following code:
#include <cstdint>
#include <iostream>
class A
{
    class B
    {
        uint64_t i64;
    };
    B vec[ 10 ];
};
int main()
{
    std::cout << sizeof( A ) << std::endl;
    //...
    return( 0 );
}
So far so good: it prints "80".
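As a quick sanity check of that arithmetic (assuming sizeof( uint64_t ) == 8 on this platform), a static_assert can be added inside main() of the example above:

    static_assert( sizeof( A ) == 10 * sizeof( uint64_t ), "expected 10 * 8 = 80 bytes" );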
But now consider the next example:
#include <cstdint>
#include <iostream>
class A
{
    class B
    {
        uint64_t i64;
    };
    B vec[ 10 ];
    uint8_t arr[ 10 ];
};
int main()
{
    std::cout << sizeof( A ) << std::endl;
    //..
    return( 0 );
}
Instead of "90" (80 bytes for vec + 10 bytes for arr) it prints 96, it counts 6 "extra" bytes.
Please not that not having vec B array, sizeof returns 10.
Having a single uint8_t instead an array of ten it will print 88 instead of 81.
.... so.... why? Is all this due data alignment?
All this happens on MSVC12 (Visual studio 2013), compiling in both debug and release mode.
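For reference, here is a minimal check of the alignment hypothesis; it assumes a compiler that supports the C++11 alignof operator (on MSVC12 the non-standard __alignof can be used instead) and that uint64_t has an alignment of 8:

#include <cstdint>
#include <iostream>
class A
{
    class B
    {
        uint64_t i64;
    };
    B vec[ 10 ];
    uint8_t arr[ 10 ];
};
int main()
{
    // If alignof( A ) is 8, sizeof( A ) must be a multiple of 8,
    // so the 90 bytes of members would be padded up to 96.
    std::cout << "alignof( A ) = " << alignof( A ) << std::endl;
    std::cout << "sizeof( A )  = " << sizeof( A ) << std::endl;
    return( 0 );
}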