Generating huge quantities of random data

Before encrypting a disk, it is recommended to fill it with random data to make prediction of encrypted blocks more difficult.

In practice this has become an issue as disk capacities increase, be it an SSD with 120 GB or a hard disk with 3 TB: where do you get such an amount of random data in a reasonable time?

The usually proposed solutions involve /dev/random and /dev/urandom, but both only drip out a few kB/s. So far I had resorted to copying miscellaneous data (like movies, DVD images, …) directly onto the disk with dd, repeating the step with a different offset until the whole disk was filled, as sketched below.
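A minimal sketch of that approach, with a placeholder input file movie.mkv and a placeholder target disk /dev/sdX; seek= moves each pass forward by the given number of 1 MiB blocks, so the second run continues behind the data written by the first:

# first pass: write the file to the start of the disk
dd if=movie.mkv of=/dev/sdX bs=1M
# second pass: continue 4096 MiB further in (adjust to the file's actual size)
dd if=movie.mkv of=/dev/sdX bs=1M seek=4096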

Now I came across this page, which suggests using OpenSSL to encrypt zeroes (from /dev/zero) and taking the resulting stream as a random bitstream. As this is encryption, its performance depends on the CPU and no longer on the entropy pool, so on modern CPUs it should be possible to reach the disk’s write speed limit.

# encrypt a stream of zeroes under a random throwaway key; the ciphertext serves as the random data
openssl enc -aes-256-ctr -pass pass:"$(dd if=/dev/urandom bs=128 count=1 2>/dev/null | base64)" -nosalt < /dev/zero > randomfile.bin
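To fill a whole disk directly instead of creating a file, the same stream can be redirected at the block device; /dev/sdX below is a placeholder for the target disk, and piping through pv (if installed) shows the live throughput. The pipeline ends with a write error once the disk is full, which is expected:

openssl enc -aes-256-ctr -pass pass:"$(dd if=/dev/urandom bs=128 count=1 2>/dev/null | base64)" -nosalt < /dev/zero | pv > /dev/sdX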

Another solution I’ve found is frandom, but I didn’t evaluate it.
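For completeness, frandom is a kernel module that, once built and loaded, exposes a fast pseudo-random character device; assuming it provides /dev/frandom as its documentation describes, usage would presumably look like this (with /dev/sdX again a placeholder for the target disk):

# load the module, then read from its device straight onto the disk
modprobe frandom
dd if=/dev/frandom of=/dev/sdX bs=1M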