The storage and retrieval of information in networks of biological neurons can be modeled by certain types of content addressable memories (CAMs). We demonstrate numerically that the amount of information that can be stored in such CAMs is substantially increased by an unlearning algorithm. Mechanisms for the increase in capacity are identified and illustrated in terms of an energy function that describes the convergence properties of the network.
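To make the setting concrete, the sketch below is a minimal Hopfield-type CAM with Hebbian (outer-product) storage, the usual energy function E = -1/2 Σ T_ij V_i V_j, and a simple anti-Hebbian "unlearning" pass that weakens attractors reached from random initial states. The network size, unlearning rate, and trial counts are illustrative assumptions, not the exact procedure or parameters reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100      # number of +/-1 binary neurons (illustrative size)
P = 15       # number of random patterns to store
EPS = 0.01   # unlearning rate (illustrative value)

# Random patterns stored with the standard Hebbian outer-product rule
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)

def energy(W, s):
    """Energy function E = -1/2 s^T W s; it decreases monotonically
    under asynchronous updates, so relaxation converges to a local minimum."""
    return -0.5 * s @ W @ s

def relax(W, s, sweeps=20):
    """Asynchronous threshold updates until the state settles into an attractor."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

def unlearn(W, trials=500, eps=EPS):
    """Relax from random states and apply a small anti-Hebbian update to
    whatever attractor is found -- the 'unlearning' step."""
    W = W.copy()
    for _ in range(trials):
        s = relax(W, rng.choice([-1, 1], size=N))
        W -= eps * np.outer(s, s) / N
        np.fill_diagonal(W, 0.0)
    return W

def recall_accuracy(W, patterns, flip=5):
    """Fraction of stored patterns recovered exactly from noisy cues."""
    hits = 0
    for p in patterns:
        cue = p.copy()
        cue[rng.choice(N, size=flip, replace=False)] *= -1
        hits += np.array_equal(relax(W, cue), p)
    return hits / len(patterns)

print("recall before unlearning:", recall_accuracy(W, patterns))
print("recall after  unlearning:", recall_accuracy(unlearn(W), patterns))
```

In this toy setup, comparing recall before and after the unlearning pass gives a rough sense of the effect described above; the paper's numerical demonstration and capacity analysis are carried out in terms of the energy landscape, not this particular script.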