A type determines what the information stored (as patterns of 0s and 1s) means.
As an example, a weather forecast can have one of the four values:
Sunny ☀️ : Stored as [ 0 0 ]
Rainy 🌧 : Stored as [ 0 1 ]
Snowy ❄️ : Stored as [ 1 0 ]
Bad (Others) ⛈ : Stored as [ 1 1 ]
They all belong to the same type: Weather.
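In Ada, such a type could be declared as an enumeration. This is a sketch; pinning each value to the two-bit codes from the table above is optional:

```ada
--  The four forecasts as one enumeration type.
type Weather is (Sunny, Rainy, Snowy, Bad);

--  Optionally pin each value to the two-bit codes from the table above.
for Weather use (Sunny => 0, Rainy => 1, Snowy => 2, Bad => 3);
```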
This simple encoding system allows for faster communication than writing the words "Sunny", "Rainy", and so on, simply because the message is shorter.
To send all the letters of the alphabet we would need to encode them into binary (the computer's language).
For that reason alone, every letter would take... 🤔round_up(log2(number_of_symbols))🤔 binary digits... With a full character set of 256 symbols, that's 8 zeros or ones per letter. So 24~40 bits per transmission. What a nightmare!!!
We only needed 2.
Types allow us to think about abstract data entities as real-life (or mathematical) objects. This permits thinking about a solution at the same level as the problem, rather than at the machine level of how to implement it.
type Color is (Red, Violet, Yellow, Green);
Streetlight : Color;
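A minimal sketch of how the Color type might be used; the messages, and the meaning given to Violet, are assumptions:

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Crossing is
   type Color is (Red, Violet, Yellow, Green);
   Streetlight : Color := Red;
begin
   --  The compiler makes us handle every Color, so no value is forgotten.
   case Streetlight is
      when Red    => Put_Line ("Stop");
      when Violet => Put_Line ("Out of service");  --  assumed meaning
      when Yellow => Put_Line ("Slow down");
      when Green  => Put_Line ("Go");
   end case;
end Crossing;
```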
Once you have created a type, you can build new types using a previous type as a part or as a base. Therefore, all the processes of that type can easily be reused as heritage.
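For instance, this sketch builds two new types out of Color (the names Wire_Color and Pixel are made up for illustration):

```ada
procedure Building_Types is
   type Color is (Red, Violet, Yellow, Green);

   --  A derived type: a new, distinct type that inherits
   --  Color's literals and operations (its "heritage").
   type Wire_Color is new Color;

   --  A composite type: Color used as a part of a bigger type.
   type Pixel is record
      Paint : Color;
      X, Y  : Natural;
   end record;

   P : Pixel := (Paint => Green, X => 0, Y => 0);
begin
   null;
end Building_Types;
```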
Using strong types allows us to control every outcome and to discern every possibility before it happens. The system cannot work outside the given parameters, so hackers have no room to get in.
Input is easier to check and validate.
Errors are detected earlier.
Maintenance is easier, as your understanding of the code is only related to your understanding of the subject at hand (not the subject and the underlying architecture).
Ada has saved the United States millions of dollars with this feature alone.
If you know what options there are you can prepare for every outcome. For example, you can set an automated emergency system if the weather is Bad.
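As a sketch, that check could look like this (the emergency action is a made-up placeholder):

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Forecast_Check is
   type Weather is (Sunny, Rainy, Snowy, Bad);
   Today : Weather := Bad;
begin
   if Today = Bad then
      Put_Line ("Activating automated emergency system");  --  placeholder action
   end if;
end Forecast_Check;
```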
But what should your program do if another coder just writes "Hurricane" using your program? That sounds like an emergency, but your code doesn't know what to do with it!!!
Also, if the next programmer writes "bad" instead of "Bad", I'm sorry to say it will not run the expected processes!
This is not a problem with Ada types, but it remains a problem in C-family languages, where case matters everywhere, not just in literals.
This problem has been baptized as "Legacy code malfunction", because it is always easier to blame the previous guy.
To avoid this, always striving for the safest option, Ada does not differentiate uppercase from lowercase except in literals.
You can recognize literals because they come in "double quotes" (for text strings: "Hello") or 'single quotes' (for characters: 'e'). Numbers are also literals, for our purposes at the moment.
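A small sketch of the rule: the identifiers below name the same things no matter their case, while the string literal keeps its exact spelling:

```ada
with Ada.Text_IO; use Ada.Text_IO;

procedure Case_Demo is
   type Weather is (Sunny, Rainy, Snowy, Bad);
   Today : Weather := bad;       --  same as Bad: identifiers ignore case
begin
   if TODAY = BAD then           --  still the same Today and Bad
      Put_Line ("Bad weather");  --  but "Bad weather" /= "bad weather"
   end if;
end Case_Demo;
```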
The best part of rockets is that they can take you to the Moon!
What good are things if you cannot do anything with them?
Part of what defines a type is what it can do. We call the actions that a type, or a combination of types, can do a process.
We'll learn about processes in the next chapter.