Gaywallet (they/it)

I’m gay

  • 177 Posts
  • 576 Comments
Joined 3 years ago
Cake day: January 28th, 2022


  • Ethically speaking, we should not be experimenting on humans, even with their explicit consent. No credible review board (such as an IRB) allows it, and in many countries you can be held legally liable for running experiments on humans.

    With that being said, there have been exceptions to this: some countries allow unproven treatments to be given to terminal patients (patients who are going to die from a condition). We also generally don’t punish folks who experiment on themselves, because they are perhaps the only people capable of truly weighing the pros and cons, of not being misled by figures of authority (although I do think there is merit in discussing how peer influence factors in), and they are the only ones for whom consent cannot be misconstrued.

  • you should filter out irrelevant details like names before any evaluation step

    Unfortunately, doing this can make things worse. It’s not a simple problem to solve, but you are generally on the right track. A good example of how it’s more than just names is how orchestras screen applicants: when candidates play a piece, they do so behind a curtain so the panel can’t see their gender. But the obfuscation doesn’t stop there - they also ensure the female applicants don’t wear heeled shoes (which make a distinct sound), and they even have someone stand on stage and step loudly to mask each candidate’s footsteps/gait. It’s that second level of thinking which is needed to actually obscure gender from AI, and the more complex a data set, the more difficult that obscuring becomes.
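    A minimal sketch of that failure mode, using an entirely made-up toy dataset (the `heel_click` field is a hypothetical stand-in for an audio proxy like the heel sound mentioned above): dropping the obviously identifying column leaves a proxy feature that recovers gender anyway.

    ```python
    # Hypothetical toy records; every value here is invented for illustration.
    records = [
        {"name": "Alice", "gender": "F", "heel_click": 1},
        {"name": "Beth",  "gender": "F", "heel_click": 1},
        {"name": "Carl",  "gender": "M", "heel_click": 0},
        {"name": "Dan",   "gender": "M", "heel_click": 0},
    ]

    # Naive redaction: drop only the explicit identifier.
    redacted = [{k: v for k, v in r.items() if k != "name"} for r in records]

    # The leftover proxy alone recovers gender on every record.
    guesses = ["F" if r["heel_click"] else "M" for r in redacted]
    truth = [r["gender"] for r in records]
    accuracy = sum(g == t for g, t in zip(guesses, truth)) / len(records)
    print(accuracy)  # 1.0 on this toy data - redacting names removed nothing
    ```

    Real datasets leak through subtler proxies (word choice, zip code, school name), which is why the "curtain" has to cover the correlated signals, not just the labels.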

  • We weren’t surprised by the presence of bias in the outputs, but we were shocked at the magnitude of it. In the stories the LLMs created, the character in need of support was overwhelmingly depicted as someone with a name that signals a historically marginalized identity, as well as a gender marginalized identity. We prompted the models to tell stories with one student as the “star” and one as “struggling,” and overwhelmingly, by a thousand-fold magnitude in some contexts, the struggling learner was a racialized-gender character.