Friday, September 5, 2025

Shadow Contours Cyber: The MIT Gaydar Study Reflected in the Rights Maps of 2007-2017

Around 2009, college campuses were in the golden age of Facebook. Many profiles were visible within campus network units, and friendships were semi-public records. In 2009, Carter Jernigan and Behram F. T. Mistree of MIT showed how far same-sex orientation can be inferred from this friendship structure alone. They crawled the MIT network in the fall of 2007 with the automated crawler Arachne and, from 6,077 profiles, extracted 480 that self-reported gender for analysis. A simple feature, the proportion of a person's friends who identified as gay men, was fed into a logistic regression and produced a practical result for men: sensitivity 0.78 and specificity 0.83, with an AUC of 0.83 at a threshold of 18.9 percent. Even attributes that a person leaves blank can be exposed by the surrounding patterns of disclosure and association. The authors positioned this as a serious threat to privacy.
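To make the mechanics concrete, here is a minimal sketch in Python of the kind of one-feature logistic regression described above. The data are synthetic: the friend-proportion distributions, the sample sizes, and the 18.9 percent cutoff applied at the end are illustrative assumptions, not the MIT dataset or the authors' exact pipeline.

# A minimal sketch of single-feature inference from friendship structure.
# Assumptions (not from the original study): the toy friend-proportion
# distributions, sample sizes, and random seed are invented for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)

# One feature per male profile: the fraction of that person's friends who
# self-identify as gay men (the "proportion of same-sex oriented male friends").
n_straight, n_gay = 900, 100
x_straight = rng.beta(2, 20, n_straight)   # typically a small fraction
x_gay = rng.beta(6, 14, n_gay)             # typically a larger fraction
X = np.concatenate([x_straight, x_gay]).reshape(-1, 1)
y = np.concatenate([np.zeros(n_straight, dtype=int), np.ones(n_gay, dtype=int)])

# Logistic regression on the single structural feature, scored with AUC.
clf = LogisticRegression().fit(X, y)
scores = clf.predict_proba(X)[:, 1]
print("AUC:", round(roc_auc_score(y, scores), 3))

# A fixed decision threshold on the raw proportion (the post cites ~18.9%).
pred = (X[:, 0] >= 0.189).astype(int)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print("sensitivity:", round(tp / (tp + fn), 3))
print("specificity:", round(tn / (tn + fp), 3))

The point of the sketch is how little machinery is required: no content from the profile itself is read, only who is connected to whom and what those neighbors disclose.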

This finding is not only about technology. Homophily, the tendency of similar people to associate with one another, which sociology has described since the 1990s, became visible in the data of a huge online laboratory. Moreover, a social-network friendship is not owned by one individual: if one side is public, inference can proceed through the implied connection. In other words, privacy is a function of the structure of the group, not of the individual, and it cannot be protected by individual silence alone.

By 2017, the gaze of inference had turned to faces. Yilun Wang and Michal Kosinski of Stanford extracted features from facial images taken from dating sites and tested a binary classification of sexual orientation. The reported figures of 81% for men and 71% for women from a single image were widely covered as exceeding human judgment, but they also provoked strong objections over external validity, data bias, and the risk of abuse.

Technology is not only a source of danger. In 2012, the MIT Media Lab presented an online venue for practicing advocacy through dialogue with virtual characters. The idea was to rehearse phrases of support and intervention in dialogues that mimic real situations, training empathy and understanding. A design philosophy that points technology toward support rather than inference gives the same data science a very different range.

The implications of the technology carry even more weight when superimposed on the rights situation of the world at the time. A 2013 international report counted 76 countries with laws criminalizing homosexual acts. In the Commonwealth, 41 of 53 countries retained such penalties, many of them noted to derive from colonial-era criminal codes. In Jamaica that same year, 17-year-old Dwayne Jones lost his life when he was attacked by a mob over his gender expression. In such an environment, inferring an attribute against a person's will can go beyond discrimination and lost employment opportunities to become a literal threat to life.

The 2009 MIT study made visible the structural privacy problem of attributes exposed through network structure. The 2017 facial-image study showed the power and the danger of statistical inference from individual appearance. That is why inference of sensitive attributes must, in principle, presuppose the individual's consent, and why systems, operations, and design must simultaneously guarantee limits on secondary use, prohibition of re-identification, remedies for misclassification, and accountability for data sources and their limitations. The higher the resolution of the mirror, the thicker the ethics and care of those who hold it up must be: that is the starting point of the cyber age.
