When the podcast Getting Curious with Jonathan Van Ness first launched in 2015, its debut episode was titled "What's the Difference Between Sunni and Shia Muslims and Why Don't They Love Each Other?" In it, the Gay of Thrones star spoke with a UCLA professor to untangle a centuries-old debate. The episode didn't attract much notice, but three years later Van Ness was snapped up by Netflix to star as one of Queer Eye's new Fab Five and found himself beloved by millions of new fans.
Since then, Van Ness has written a memoir, a children's book, and an essay collection; been nominated for several Emmys; spent time in DC advocating for LGBTQ+ rights; come out as both nonbinary and HIV-positive; toured a live show that combines comedy and gymnastics; and even launched his own hair care line. He has also turned the podcast into a Netflix show.
Through it all, Van Ness has still found time for the podcast, with Getting Curious dropping its 300th episode last week. Topics over the years have ranged from fatphobia to The Great British Bake Off; Van Ness maintains that he will cover anything, as long as he's curious about it. "I feel like I've grown up with the show," Van Ness says, "and a lot of what I know about life I've learned doing this podcast."
With an eye toward sharing some of those lessons, Van Ness has pared the considerable Getting Curious library down to his nine favorite episodes, walking WIRED through the selections he hopes will turn more listeners on to the show.
Van Ness: [Data journalist] Meredith Broussard's work is something I refer to a lot. Techno-chauvinism is the belief that machines do things better than humans. She used the example of motorized window blinds: it's nice to press a button and make them go up, but when they break, you can't fix them. With manual blinds, you just pull the cord to raise and lower them, and they'd be easy to fix.
The most important example she talks about is her work on bias, such as how a police body camera or facial recognition system will misidentify or fail to recognize certain people. Most of these algorithms are a reflection of the people who create them, and in most cases, the people who create these algorithms are men. The teams developing them aren't very diverse, so these concerns often don't get raised in the room, and the biases get baked in unquestioned.
Thus, techno-chauvinism is baked into systems that affect our daily lives in very real ways. If you're going through a TSA scanner, for example, you might get pulled out of line because you're registered as a man but you're wearing a long skirt, so the machine flags you, because those algorithms don't know how to recognize what the human eye can.
Van Ness: Tina Lasisi is an evolutionary scientist, and she studies the diversity of human hair, but she also studies how we got here, such as the evolution of human hair and the scalp. Most of what I learned in hair school about why coily hair is coily, why curly hair is curly, why straight hair is straight... it's all false. It's not true at all. In hair school, we're told that if your hair is coily, the strand's cross-section is shaped like a kidney bean; curly hair looks like an oval; and straight hair looks like a perfect circle. But in her work in the lab, they found every type of hair in all of those shapes.
What's really scary is that all this fake science was used to investigate crimes in the '80s and '90s, like, "This hair was found at the scene, and because its cross-section looks like a kidney bean, we know it was a Black person." It's nonsense.