Comment by Eliezer Yudkowsky - I may never actually use this in a story, but in another universe I had thought of having a character mention that... call it the forces of magic with normative dimension... had evaluated one pedophile who knew his desires were harmful to innocents and never acted upon them, while living a life of above-average virtue; and another pedophile who had acted on those desires, at harm to others. So the said forces of normatively dimensioned magic transformed the second pedophile's body into that of a little girl and delivered it to the first pedophile, along with the equivalent of an explanatory placard. Problem solved. And indeed the 'problem' as I had perceived it was, "What if a virtuous person deserving our aid wishes to retain their current sexual desires and not be frustrated thereby?"
(As always, pedophilia is not the same as ephebophilia.)
I also remark that the human equivalent of a utility function (not that we actually have one) often revolves around desires whose frustration produces pain. A vanilla rational agent (Bayesian probabilities, expected utility maximization) would see no need to change its utility function even if one of its components seemed highly probable, though not certain, to be eternally frustrated, since it would suffer no pain thereby.
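As a toy illustration of that contrast (made-up numbers, my own sketch, and of course not a real model of human motivation): a vanilla expected-utility maximizer attaches utility only to outcomes, so a nearly-hopeless desire is a small positive expected value at zero cost, while a human-like agent also pays an ongoing pain cost for every desire that goes unmet, and so is the only one of the two with any reason to want the desire gone.

```python
# Toy sketch (hypothetical numbers): a "vanilla" expected-utility maximizer
# versus a human-like agent that suffers pain from frustrated desires.

P_DESIRE_SATISFIED = 0.001   # the desire is almost certainly frustrated forever
U_SATISFIED = 5.0            # utility if the desire is somehow met
PAIN_IF_FRUSTRATED = 3.0     # pain a human-like agent feels while the desire goes unmet


def eu_vanilla(keep_desire: bool) -> float:
    """Vanilla agent: utility attaches only to outcomes, never to frustration itself."""
    if not keep_desire:
        return 0.0                           # component deleted: nothing gained, nothing lost
    return P_DESIRE_SATISFIED * U_SATISFIED  # small positive expected value, no downside


def eu_humanlike(keep_desire: bool) -> float:
    """Human-like agent: an unmet desire also produces ongoing pain."""
    if not keep_desire:
        return 0.0
    return (P_DESIRE_SATISFIED * U_SATISFIED
            - (1 - P_DESIRE_SATISFIED) * PAIN_IF_FRUSTRATED)


# The vanilla agent keeps the desire (tiny upside, zero cost); the human-like
# agent prefers to be rid of it, since the frustration itself hurts.
print("vanilla:    keep" if eu_vanilla(True) >= eu_vanilla(False) else "vanilla:    drop")
print("human-like: keep" if eu_humanlike(True) >= eu_humanlike(False) else "human-like: drop")
```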