We rely on random numbers and sequences in many places. Scientific simulations, for example, use them heavily. Public lotteries are based on them. And every time we encrypt something on a computer, we rely in one way or another on a random number.

Every encryption method uses some kind of key or secret that encrypts the message and decrypts it again at the other end. Modern encryption algorithms usually use a random bit sequence as the key. The strength of the encryption is usually measured in bits, which refers to the length of that key. A good cryptographic algorithm provides an encryption that can only be cracked by trying all possible keys. If my key has eight bits, an attacker has 256 possible keys to deal with, which means that on average they will succeed after 128 attempts. This only holds, however, when all keys are equally likely. Above I mentioned that most people select a two-digit number when asked to think of a random number. For an attacker it therefore makes sense to restrict the search for the key to two-digit numbers (or at least to start with those). That reduces the number of keys to try from 256 down to the 90 numbers between 10 and 99.
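The arithmetic above can be checked with a small simulation. The sketch below (my own illustration, not from any particular library) brute-forces uniformly chosen 8-bit keys and averages the number of attempts needed; the expected value is (1 + 256) / 2 = 128.5.

```python
import secrets

KEY_BITS = 8
KEYSPACE = 2 ** KEY_BITS  # 256 possible keys

def attempts_to_find(key):
    # Try every candidate key in order until the right one is found,
    # counting how many guesses that took.
    for attempt, candidate in enumerate(range(KEYSPACE), start=1):
        if candidate == key:
            return attempt

# Average over many uniformly chosen keys: expect about KEYSPACE / 2.
trials = 10_000
total = sum(attempts_to_find(secrets.randbelow(KEYSPACE)) for _ in range(trials))
print(total / trials)  # roughly 128.5
```

If the keys were instead drawn only from a 90-element subset (like the two-digit numbers) and the attacker knew that, the same experiment over that subset would average about 45 attempts.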

Modern cryptographic algorithms rely on keys with significantly more bits. AES, for example, uses 128 or even 256 bits. 256 bits allow for 115792089237316195423570985008687907853269984665640564039457584007913129639936 possible keys (115 quattuorvigintillion ...). That means an attacker has to do a lot of work to brute-force that encryption. However, this requires that every key is as likely to be selected as any other. Otherwise, an attacker can drastically reduce the number of keys they have to try.
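To get a feel for that number, here is a quick back-of-the-envelope calculation. The guessing rate of 10^18 keys per second is an assumption chosen purely for illustration, wildly generous to the attacker:

```python
# Number of possible 256-bit keys.
keyspace = 2 ** 256
print(keyspace)

# Assume a (wildly optimistic) 10**18 guesses per second. On average the
# attacker finds the key after searching half the keyspace.
seconds_per_year = 60 * 60 * 24 * 365
years = keyspace // 2 // 10**18 // seconds_per_year
print(f"{years:.2e}")  # on the order of 10**51 years
```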

So, what does all this have to do with random numbers? In a cryptographic context, for a key to be a good key, it needs to be as likely as any other key. That, however, is also a description of a random number: a number is random if it is as likely to occur as any other number.

Let me put that a little more formally: given a set of possible numbers, such as all integers with up to 256 bits, a selected member of that set can be considered to have been selected at random if it is as likely for that member to have been selected as for any other member.

Usually, however, when generating e.g. a 256-bit key, we do not select a number out of the roughly 10^77 possibilities directly; rather, we generate a sequence of 256 random bits. If every single bit, independently of all others, is as likely to be 1 as it is to be 0, the result is again a truly random number.
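In Python, for instance, the standard-library `secrets` module does exactly this, drawing bits from the operating system's cryptographically strong source:

```python
import secrets

# Draw 256 cryptographically strong random bits (32 bytes). Each bit is
# independently 0 or 1 with equal probability, so the resulting value is
# a uniformly distributed 256-bit number.
key_bytes = secrets.token_bytes(32)
key_int = int.from_bytes(key_bytes, "big")

print(key_bytes.hex())                 # the key as 64 hex digits
print(0 <= key_int < 2 ** 256)         # True: it lies in the 256-bit range
```

Note that `secrets` is meant for keys and tokens, whereas the `random` module uses a predictable generator and must never be used for cryptographic purposes.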

If a number is truly random, the main consequence is that you cannot predict which number will be chosen. This is what cryptographic strength relies on.

A number is random if it is as likely to occur as any other number within a set of possible numbers. For cryptographic use, it is paramount that keys or secrets are selected truly at random, as the cryptographic strength depends directly on that randomness.