difference between %d and %i

Difference between %d and %i in C

In C, both %d and %i are format specifiers used with the printf function to print integers. With printf they are interchangeable and produce identical output. The difference between them only shows up with the scanf family of functions.

With printf, %d and %i behave identically: both print an int as a signed decimal number. With scanf, however, the two differ. %d reads the input as decimal only, while %i interprets the input using C's integer-constant rules: a 0x or 0X prefix is read as hexadecimal, a leading 0 as octal, and anything else as decimal.

Here's a printf example showing that the two specifiers produce the same output:

#include <stdio.h>

int main(void) {
    int num = -10;
    printf("%d\n", num);  // Output: -10
    printf("%i\n", num);  // Output: -10

    unsigned int uNum = 10;
    printf("%u\n", uNum);  // Output: 10 (unsigned values need %u, not %d or %i)

    return 0;
}

Here %d and %i print the signed integer -10 identically. The unsigned value is printed with %u, since %d and %i both expect a signed int.
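Where the two specifiers actually diverge is in the scanf family. The following minimal sketch uses sscanf (the string variant of scanf) to read the same input with both specifiers, making the different parsing rules visible:

#include <stdio.h>

int main(void) {
    int dec = 0, oct = 0, hex = 0;
    sscanf("010", "%d", &dec);   // %d reads decimal: dec == 10
    sscanf("010", "%i", &oct);   // %i treats a leading 0 as octal: oct == 8
    sscanf("0x1A", "%i", &hex);  // %i accepts hex with a 0x prefix: hex == 26
    printf("%d %d %d\n", dec, oct, hex);  // Output: 10 8 26
    return 0;
}

Note how the same string "010" yields 10 with %d but 8 with %i.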

In short: with printf the choice between %d and %i makes no difference at all, since they are defined to behave identically. With scanf the choice can change the value you read, so %d is usually the safer default unless you specifically want octal and hexadecimal input to be accepted.
