Intercultural relations skills, AI in healthcare, Adidas

February 6, 2019

A Duke professor was recently asked to step down from her administrative post after sending students a threatening email that strongly discouraged them from speaking Chinese in student areas. In the email, the professor wrote that some of her colleagues had approached her, saying that “they were disappointed that these students were not taking the opportunity to improve their English and were being so impolite as to have a conversation that not everyone on the floor could understand.” She said her colleagues wanted to keep a list of the students’ names “so they could remember them if the students ever interviewed for an internship or asked to work with them for a master’s project.” The professor also acknowledged how difficult it must be for the students to study in their non-native language, adding, “I encourage you to commit to using English 100 percent of the time.” The university has since released a statement of apology, calling the email “demeaning, disrespectful, and wholly discriminatory” as well as “extraordinarily xenophobic.” Elizabeth Redden of Inside Higher Ed says the incident makes a strong case for developing intercultural relations skills among faculty, especially at a school like Duke, where the student body is increasingly international.

Telling Students Not to Speak Chinese, by Elizabeth Redden for Inside Higher Ed

Dr. Dhruv Khullar warns that artificial intelligence could introduce new biases into medicine, in addition to codifying existing biases we already observe in the field (like routinely inferior care, faulty diagnoses, and medical inattention). AI needs large, real-world data sets to learn how to diagnose diseases and make decisions. The problem is that those data sets often lack sufficient data on marginalized and underrepresented groups. They also mirror societal trends, even when those trends are shaped by bias and discrimination. That means decisions made by AI may be unreliable or even outright dangerous. For example, if the data shows that, in general, low income levels are linked to worse recovery after an organ transplant, a machine learning algorithm may recommend against transplants for low-income patients. Dr. Khullar says that medical professionals must not over-rely on new technologies or automatically accept machine-made decisions in lieu of their own “clinical and moral intuition.” He likens this to our over-reliance on an everyday technology: “As automation becomes pervasive, will we catch that spell-check autocorrected ‘they’re’ to ‘there’ when we meant ‘their?’” It’s frightening to consider the medical equivalent of this error, like a failure to diagnose a life-threatening condition. Dr. Khullar urges medical professionals to remember that while AI will undoubtedly revolutionize medicine, “humans, not machines, are still responsible for caring for patients.”

A.I. Could Worsen Health Disparities, by Dr. Dhruv Khullar for The New York Times Opinion Section
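
The dynamic Dr. Khullar describes, a model faithfully reproducing a disparity baked into its training data, can be shown in a few lines of code. Below is a minimal sketch using synthetic data; the features, numbers, and coefficients are all invented for illustration and do not come from the article or any real clinical data set.

```python
# Minimal synthetic illustration (not from the article): a model trained on
# outcomes shaped by unequal access to care learns to penalize low-income
# patients, even when their clinical picture is identical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical features: income bracket (0 = low, 1 = high) and a generic
# clinical risk score. Both are invented for this sketch.
income = rng.integers(0, 2, size=n)
clinical_score = rng.normal(0.0, 1.0, size=n)

# Simulate historical outcomes in which low-income patients recover less
# often, reflecting unequal follow-up care rather than biology.
p_recover = 1.0 / (1.0 + np.exp(-(clinical_score + 1.0 * income - 0.5)))
recovered = rng.random(n) < p_recover

X = np.column_stack([income, clinical_score])
model = LogisticRegression().fit(X, recovered)

# Two patients with the same clinical score but different incomes.
patients = np.array([[0, 0.5], [1, 0.5]])  # [income, clinical_score]
print(model.predict_proba(patients)[:, 1])
# The low-income patient receives a lower predicted recovery probability,
# so an allocation rule built on this score would deprioritize them.
```

The point is not the particular model but the pattern: the algorithm has no notion of fairness, so it treats a disparity produced by unequal access to care as a legitimate predictive signal, which is precisely the failure mode the article warns about.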

Just a few days into Black History Month, Adidas has come under fire from consumers for a pair of shoes released as part of its Harlem Renaissance-inspired collection. The shoes, which were all white right down to the grommets, were added to the Ultraboost Uncaged line. While the line existed prior to February, consumers were quick to point out that, viewed through a historical lens, cages carry “a dark meaning, one linked to slavery” among the African American community. Of the shoes, journalist Elizabeth Segran says that “if you were looking to create a metaphor for how literal whiteness has usurped a sneaker meant to commemorate Black culture, you couldn’t come up with a more apt example than this sneaker.” Adidas has since removed the shoe from its Black History Month collection, leaving consumers wondering how it was greenlit in the first place.

Adidas drops an all-white shoe called “Uncaged” for Black History Month, by Elizabeth Segran for Fast Company