Steelman · slot B
The case against outsourcing desire
A skeptic of algorithmic intimacy would argue: an AI that asks me five questions — height, job, religion, one adjective, kids y/n — and then writes a paragraph pitching me to strangers is not a matchmaker; it's a Mad Libs template with a smiling avatar. The Amata user who told it she needed a Jewish partner was sent on a date with a non-Jewish man. The reporter who said shared faith was merely "nice to have" got pitched a parade of devout churchgoers. These aren't edge-case bugs; they're the predictable result of a system that has no way to verify what users say, no theory of attraction beyond stated preferences, and no grandmother's intuition about who's lying about their height.

Chemistry lives in the gap between what we claim to want and what actually moves us. A model trained on self-reports can't see into that gap, and pretending otherwise asks people to hand over their most intimate data for a service that performs worse than a friend with good instincts.