IonicWind Software

Aurora Compiler => General Discussion => Topic started by: kryton9 on September 11, 2006, 05:30:37 PM

Title: BOOL why int?
Post by: kryton9 on September 11, 2006, 05:30:37 PM
Why are bools defined as an int and not as a byte?
Title: Re: BOOL why int?
Post by: Kale on September 11, 2006, 05:47:21 PM
Good question! I think somebody at Microsoft thought it was a good idea to store in 32 bits what really only needs 1 bit, as part of their ongoing bloat policy. ;)

To be honest I think it's a standard C++ type that uses 32 bits for future compatibility.
Title: Re: BOOL why int?
Post by: Ionic Wind Support Team on September 11, 2006, 05:47:51 PM
Speed.

A 32-bit processor can natively move a 32-bit integer in one instruction, while a byte has to be extended to 32 bits before being compared.  The same goes for memory access: accessing odd memory locations requires extra clock cycles, because internally the processor has to move the entire 32-bit dword and extract the byte at the odd location.
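
To put the point in code, here is a rough C++ sketch.  The typedef and function names are just illustrative, not anything taken from the Aurora or Windows headers:

typedef int           BOOL32;   // what BOOL really is: a full 32-bit integer
typedef unsigned char BOOL8;    // a byte-sized flag: saves 3 bytes of storage, not time

BOOL32 flag32 = 1;
BOOL8  flag8  = 1;

// Both compile to a load and a test.  The 32-bit flag is already the
// processor's natural word size, while the byte has to be widened to a
// full register whenever it is used as an int (passed to a function,
// mixed into arithmetic, and so on).
int Check32() { return flag32 != 0; }
int Check8()  { return flag8  != 0; }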

Most languages define BOOL as an integer for that very reason.  In current C++ you effectively have two boolean types, BOOL and bool, with the latter being a compiler-defined built-in type and the former a plain typedef to an int.  The built-in version can only hold a 'true' or 'false' quantity, and converts any nonzero value to 1.

So in C++ if you were to type:

bool b;
b = 37;

b would actually be 1.
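
For anyone who wants to try it, here is the same thing as a complete little program; the values noted in the comments are what a conforming compiler should print:

#include <cstdio>

int main()
{
    bool b;
    b = 37;                          // any nonzero value converts to true
    printf("%d\n", (int)b);          // prints 1
    printf("%d\n", (int)sizeof(b));  // prints 1 on current compilers
    return 0;
}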

As an interesting note, the size of the native bool type is compiler specific.  Some C++ compilers use 8 bits while others use 32 bits, which causes problems when converting code between compilers.  I ran into this with the Microsoft compilers:

Quote
In Visual C++ 4.2, the Standard C++ header files contained a typedef that equated bool with int. In Visual C++ 5.0 and later, bool is implemented as a built-in type with a size of 1 byte. That means that for Visual C++ 4.2, a call of sizeof(bool) yields 4, while in Visual C++ 5.0 and later, the same call yields 1. This can cause memory corruption problems if you have defined structure members of type bool in Visual C++ 4.2 and are mixing object files (OBJ) and/or DLLs built with the 4.2 and 5.0 or later compilers.

The __BOOL_DEFINED macro can be used to wrap code that is dependent on whether or not bool is supported.
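
To make the hazard concrete, here is a small sketch.  The Settings struct is purely hypothetical, and BOOL is written out by hand rather than pulled from the Windows headers, but it is the same typedef:

#include <cstdio>

typedef int BOOL;          // the Windows-style typedef: always a 32-bit int

struct Settings            // hypothetical struct, just for illustration
{
    bool visible;          // 1 byte with VC++ 5.0 and later, 4 bytes with VC++ 4.2
    bool enabled;
    bool readOnly;
};

int main()
{
    // With a 1-byte bool this struct is 3 bytes; with a 4-byte bool it is 12.
    // Two modules built with the two compilers disagree about its size and
    // member offsets -- exactly the memory corruption the note above warns
    // about when mixing OBJs and DLLs.
    printf("sizeof(bool)     = %d\n", (int)sizeof(bool));
    printf("sizeof(BOOL)     = %d\n", (int)sizeof(BOOL));
    printf("sizeof(Settings) = %d\n", (int)sizeof(Settings));
    return 0;
}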

In Aurora there is no built-in BOOL type.  There is a typedef in acommon.inc that defines BOOL as an INT, and it will remain that way for the life of the language, however long that is ;)

Paul.
Title: Re: BOOL why int?
Post by: kryton9 on September 11, 2006, 06:10:05 PM
Very interesting and I am glad I didn't have to worry about going through that transition. INT it is then :)
Title: Re: BOOL why int?
Post by: Parker on September 11, 2006, 06:48:44 PM
When I was first starting programming, I wondered why they didn't use a single bit to represent boolean values. If you know anything about the processor, that's hardly convenient/possible, but that's what I thought then.
Title: Re: BOOL why int?
Post by: Zen on September 12, 2006, 02:27:33 AM
Quote from: Paul Turley on September 11, 2006, 05:47:51 PM
it will remain that way for the life of the language, however long that is ;)

Maybe it's just the way I interpreted it, but that sounds like something you would say in a concerned tone of voice.

Lewis
Title: Re: BOOL why int?
Post by: Kale on September 12, 2006, 02:35:02 AM
Quote from: Paul Turley on September 11, 2006, 05:47:51 PM
...for the life of the language, however long that is

:o Is there something we should know? How long is the projected lifespan of Aurora?
Title: Re: BOOL why int?
Post by: Zen on September 12, 2006, 02:43:28 AM
Forever I hope :'(

Lewis
Title: Re: BOOL why int?
Post by: Kale on September 12, 2006, 04:31:20 AM
Quote from: Zen on September 12, 2006, 02:43:28 AM
Forever I hope :'(

Lewis

Me too.  :'(
Title: Re: BOOL why int?
Post by: Ionic Wind Support Team on September 12, 2006, 05:20:31 AM
Well we can all hope it will be forever, but personally I won't live that long ;)

I am sure the designers of COBOL didn't think it would still be used by the year 2000 either.  So you never know.