How is the size of an integer decided?
- Is it based on the processor, the compiler, or the OS?
Answer Posted / lanchai
Different hardware systems may use different sizes for an integer, and you
might get a different number on different OSes because the hardware those
OSes run on differs to begin with. Also, sizeof is an operator, not a
command; its result is (except for variable-length arrays) computed at
compile time by the compiler (see Wikipedia on sizeof).

So strictly speaking, it is the compiler (the implementation) that decides
the size of an integer; the C standard only fixes minimum ranges (an int
must be at least 16 bits). In practice, compilers choose a size that is
natural for the target processor and its ABI, which is why the hardware
effectively drives the choice.
What is the general description of a loop statement, and what loop types are available in C?
How can you determine the maximum value that a numeric variable can hold?
What is a union in C?
Why is %d used in C?
What is return in C programming?
Explain the array representation of a binary tree in C.
What are the translation phases used in the C language?
Explain zero based addressing.
What is the difference between a function and a method in C?
In this problem you are to write a program that will cut some number of prime numbers from the list of prime numbers between 1 and N. Your program will read in a number N; determine the list of prime numbers between 1 and N; and print the C*2 prime numbers from the center of the list if there is an even number of prime numbers, or the (C*2)-1 prime numbers from the center of the list if there is an odd number of prime numbers in the list.
What is the C value paradox? Explain.
How is a pointer variable declared?
What is the use of gets and puts?
What is the difference between break and continue?
Can you pass an entire structure to functions?