Many optimization problems designed to yield sparse solutions employ the ℓ1 or ℓ0 penalty functions. Consequently, several algorithms for compressive sensing or sparse representations make use of soft or hard thresholding, both of which are examples of shrinkage mappings. Their usefulness comes from the fact that they are the proximal mappings of the ℓ1 and ℓ0 penalty functions, meaning that they provide the solution to the corresponding penalized least-squares problem. In this paper, we both generalize and reverse this process: we show that one can begin with any of a wide class of shrinkage mappings, and be guaranteed that it will be the proximal mapping of a penalty function with several desirable properties. Such a shrinkage-mapping/penalty-function pair comes ready-made for use in efficient algorithms. We give an example of such a shrinkage mapping, and use it to advance the state of the art in compressive sensing.
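As a concrete illustration of the thresholding/proximal-mapping relationship described above, the following sketch (not from the paper; the function names and test values are our own) implements the standard soft- and hard-thresholding formulas and numerically verifies that soft thresholding minimizes the scalar penalized least-squares objective (1/2)(x − y)² + λ|x|:

```python
import numpy as np

def soft_threshold(y, lam):
    # Proximal mapping of lam*||x||_1: shrink each entry toward zero by lam.
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def hard_threshold(y, lam):
    # Proximal mapping of lam*||x||_0: zero out entries with |y| <= sqrt(2*lam),
    # keep the rest unchanged.
    return np.where(np.abs(y) > np.sqrt(2.0 * lam), y, 0.0)

# Check that soft_threshold solves argmin_x 0.5*(x - y)**2 + lam*|x|
# by comparing against a fine grid search over x.
y, lam = 1.7, 0.5
grid = np.linspace(-3.0, 3.0, 600001)          # step size 1e-5
objective = 0.5 * (grid - y) ** 2 + lam * np.abs(grid)
x_grid = grid[np.argmin(objective)]
assert abs(x_grid - soft_threshold(y, lam)) < 1e-4
```

Note the qualitative difference: soft thresholding shrinks every surviving entry by λ (a biased estimate), while hard thresholding leaves surviving entries untouched but is discontinuous; the paper's generalized shrinkage mappings occupy the space between such choices.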