Entropy from erasing computer memory

Reference: Daniel V. Schroeder, An Introduction to Thermal Physics, (Addison-Wesley, 2000) – Problem 3.16.

If we regard a single bit of computer memory as a ‘particle’ that can be in 2 different states, then a collection of ${N}$ bits has ${2^{N}}$ possible states, so by Boltzmann's formula ${S=k\ln\Omega}$ its entropy is

$\displaystyle S=Nk\ln2 \ \ \ \ \ (1)$

If we start with a gigabyte (${2^{30}}$ bytes = ${2^{33}}$ bits) that stores some definite information and then erase it without backing it up, we have effectively lost ${2^{33}}$ bits of information. Does this entail an increase in entropy?

This is a rather unusual problem, since I would think the answer depends on what we mean by ‘erase’. If we replace the original information with a known pattern (say, all zeroes), then it would seem to me that the entropy (at least the entropy contained in the information in the memory; clearly the electrical processes required to perform the erasing would generate entropy) hasn’t changed, since we’ve merely gone from one definite state to another, and we know the bit patterns in both cases.

However, if by ‘erase’ we mean ‘replace the original information with a random pattern’, then presumably we do increase the entropy, since the final state can now be in any of ${2^{N}}$ states. Going on that assumption, the amount of entropy generated by randomizing a gigabyte is

$\displaystyle S=2^{33}k\ln2=8.2\times10^{-14}\mbox{ J K}^{-1} \ \ \ \ \ (2)$

At room temperature, this is equivalent to an amount ${Q}$ of heat

$\displaystyle Q=T\Delta S=298\times\left(8.2\times10^{-14}\right)=2.4\times10^{-11}\mbox{ J} \ \ \ \ \ (3)$
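As a quick numerical sanity check, equations (2) and (3) can be reproduced in a few lines of Python (using the CODATA value of Boltzmann's constant and ${T=298\mbox{ K}}$ for room temperature, as above):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
N = 2**33          # bits in a gigabyte (2^30 bytes x 8 bits/byte)
T = 298.0          # room temperature, K

S = N * k * math.log(2)   # entropy of randomizing N bits, Eq. (2)
Q = T * S                 # equivalent heat at temperature T, Eq. (3)

print(f"S = {S:.2e} J/K")  # ~8.2e-14 J/K
print(f"Q = {Q:.2e} J")    # ~2.4e-11 J
```

Both values agree with the rounded results quoted above.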

Clearly the heat generated by a computer doesn’t arise mostly from erasing the information in memory!