So, background for this rant: there's this phenomenon where the more women join a profession, the less "social value" it's seen to have (and the less money those working in it are paid!). Like teaching, nursing, psychiatry, social work, or the arts: these careers used to be entirely male-dominated, and were seen as respectable, difficult, and important professions. Now that these careers skew female...