So you write a timer app that counts milliseconds in an int variable, and it works on your machine, where int happens to be 32 bits. Then I compile it for an Arduino, where int is only 16 bits, and I can't set a timer more than about 32 seconds in the future. Isn't that a problem? You shouldn't have used int: you should have declared the maximum time value you need to represent (say, 99:59:59) and asked for however many bits are needed to store it.
In what situation is plain int sufficient? If your values always fit in 16 bits it works, but what's special about 16? A value that only ranges from 0 to 100 needs just 7 bits, so int isn't efficient there either; you should have asked for 7 bits, which would round up to 8 on most platforms. And Arduino is an 8-bit platform that needs several instructions just to add one 16-bit value to another, so int isn't the most efficient type there in any case.