Base conversion is a fundamental part of computer science.
For the most part, computers operate in base two.
Base sixteen (hexadecimal) and, to some extent, base eight (octal)
are important because they provide a much more compact way to write (print)
what are essentially base-two values.
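As an illustration, the same value can be printed in decimal, octal, and hexadecimal with the standard printf conversions; the binary form is produced by a small helper here, since a %b conversion is only standardized in C23. The value 48813 is an arbitrary example.

    #include <stdio.h>
    #include <inttypes.h>

    /* Print a 32-bit value as binary digits, most significant bit first. */
    static void print_binary(uint32_t v) {
        for (int bit = 31; bit >= 0; bit--)
            putchar(((v >> bit) & 1u) ? '1' : '0');
        putchar('\n');
    }

    int main(void) {
        uint32_t v = 48813;                       /* arbitrary example value */
        printf("decimal:     %" PRIu32 "\n", v);  /* 48813  */
        printf("octal:       %" PRIo32 "\n", v);  /* 137255 */
        printf("hexadecimal: %" PRIx32 "\n", v);  /* bead   */
        printf("binary:      ");
        print_binary(v);                          /* 00000000000000001011111010101101 */
        return 0;
    }

Note how the hexadecimal form is the shortest: each hexadecimal digit stands for exactly four binary digits, and each octal digit for exactly three.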
Note the consequence of memory allocation in units of 1, 2, 4, or 8 bytes
on the limits of the integer values that can be represented on the computer:
each size fixes a range, so a signed 4-byte integer, for example, can hold
only -2,147,483,648 through 2,147,483,647.
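One way to see these limits is to print the constants that the standard header <stdint.h> (included here through <inttypes.h>) defines for each fixed-width signed integer type; this minimal sketch just reports them.

    #include <stdio.h>
    #include <inttypes.h>

    int main(void) {
        /* Range of signed integers stored in 1, 2, 4, and 8 bytes. */
        printf("1 byte:  %" PRId8  " to %" PRId8  "\n", INT8_MIN,  INT8_MAX);
        printf("2 bytes: %" PRId16 " to %" PRId16 "\n", INT16_MIN, INT16_MAX);
        printf("4 bytes: %" PRId32 " to %" PRId32 "\n", INT32_MIN, INT32_MAX);
        printf("8 bytes: %" PRId64 " to %" PRId64 "\n", INT64_MIN, INT64_MAX);
        return 0;
    }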
Note the consequence of memory allocation in units of 4 or 8 bytes on the
number of significant digits and the limits on exponents for rational
(floating-point) values: single precision carries roughly 6-7 significant
decimal digits with decimal exponents up to about 38, while double precision
carries 15-16 digits with exponents up to about 308. Also note the difference
between scientific notation as written in mathematics (for example,
6.022 × 10^23) and the computer's representation of such values, which is
printed in E notation (6.022e+23) and stored as a binary sign, exponent,
and fraction.
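The corresponding limits for floating-point types are exposed by the standard header <float.h>, and printing with %e shows the computer's E-notation form of a value we would write scientifically as 6.022 × 10^23. A minimal sketch:

    #include <stdio.h>
    #include <float.h>

    int main(void) {
        /* Precision and decimal exponent limits for 4- and 8-byte floating point. */
        printf("float  (4 bytes): %d significant decimal digits, exponents %d to %d\n",
               FLT_DIG, FLT_MIN_10_EXP, FLT_MAX_10_EXP);
        printf("double (8 bytes): %d significant decimal digits, exponents %d to %d\n",
               DBL_DIG, DBL_MIN_10_EXP, DBL_MAX_10_EXP);

        /* Avogadro's number in the computer's E notation. */
        double avogadro = 6.022e23;
        printf("Avogadro's number: %e\n", avogadro);   /* prints 6.022000e+23 */
        return 0;
    }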
We should take note that unless you are working in a very specialized,
very time- and space-critical field, most programming does not involve
explicitly performing base conversions.
However, hexadecimal, binary, and dotted-decimal notation see heavy use
in networking.
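For instance, an IPv4 address is simply a 32-bit value: dotted-decimal notation prints each of its four bytes in decimal, while the same value can be viewed all at once in hexadecimal. A minimal sketch, using an arbitrary private address:

    #include <stdio.h>
    #include <inttypes.h>

    int main(void) {
        /* Four octets of an IPv4 address (arbitrary private address). */
        int octets[4] = {192, 168, 1, 25};

        /* Pack the octets into one 32-bit value, high byte first. */
        uint32_t addr = ((uint32_t)octets[0] << 24) |
                        ((uint32_t)octets[1] << 16) |
                        ((uint32_t)octets[2] <<  8) |
                         (uint32_t)octets[3];

        /* The same address in dotted decimal and in hexadecimal. */
        printf("dotted decimal: %d.%d.%d.%d\n",
               octets[0], octets[1], octets[2], octets[3]);
        printf("hexadecimal:    0x%08" PRIX32 "\n", addr);   /* 0xC0A80119 */
        return 0;
    }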
Binary-coded decimal (BCD) sees great use in financial applications, and
essentially no use in scientific programming.
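In packed BCD, each decimal digit occupies its own 4-bit nibble, so decimal amounts are stored exactly rather than approximated in binary floating point. A minimal sketch of packing a small value, with 1995 (say, $19.95 in cents) as an arbitrary example:

    #include <stdio.h>
    #include <inttypes.h>

    /* Pack an unsigned decimal value into packed BCD: two decimal digits
       per byte, one digit per 4-bit nibble. Handles values of up to
       eight decimal digits. */
    static uint32_t to_packed_bcd(uint32_t value) {
        uint32_t bcd = 0;
        int shift = 0;
        while (value > 0) {
            bcd |= (value % 10) << shift;   /* store the low decimal digit */
            value /= 10;
            shift += 4;                     /* move to the next nibble     */
        }
        return bcd;
    }

    int main(void) {
        uint32_t cents = 1995;   /* e.g. $19.95 stored as whole cents */
        /* Each hexadecimal digit of the result is one decimal digit:
           1995 becomes 0x1995. */
        printf("decimal %" PRIu32 " -> packed BCD 0x%04" PRIX32 "\n",
               cents, to_packed_bcd(cents));
        return 0;
    }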