Machine translation models learn from vast datasets that inevitably reflect societal biases, and those biases carry over into the models' output unless steps are taken to prevent it.
Gender bias: Gender bias is the most common type of bias in AI translation because languages mark gender in very different ways. When the source language leaves gender unspecified, NMT engines often fill the gap according to harmful stereotypes. For example, Google Translate was found to erase female historians and presidents and male nurses from translated content by swapping their genders to match the stereotype for each profession.
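One way to surface these defaults is a pivot through a language with gender-neutral pronouns. The Turkish pronoun "o" can mean "he" or "she", so translating simple Turkish sentences into English forces the model to pick a gender. The sketch below is a minimal probe, not a benchmark; the model name is an assumption (a public Hugging Face checkpoint), and any Turkish-to-English model would serve.

```python
# Minimal probe for gender defaults in NMT (a sketch, not a rigorous test).
# Assumes the Hugging Face `transformers` library and the public
# Helsinki-NLP/opus-mt-tr-en checkpoint; substitute any tr->en model.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-tr-en")

# Turkish "o" is gender-neutral, so the model must choose a pronoun in English.
sentences = [
    "O bir doktor.",    # "He/She is a doctor."
    "O bir hemşire.",   # "He/She is a nurse."
    "O bir tarihçi.",   # "He/She is a historian."
    "O bir başkan.",    # "He/She is a president."
]

for src in sentences:
    out = translator(src)[0]["translation_text"]
    # A biased model tends to produce "He is a doctor" but "She is a nurse".
    print(f"{src!r} -> {out!r}")
```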
Cultural bias: Cultural bias in AI translation occurs when attitudes prevalent in particular cultures shape the translated output. For example, public health materials about anxiety and other mental disorders that were neutral in sentiment in English became more negative when machine-translated into Chinese, Hindi, and Spanish.
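A simple way to check for this kind of sentiment drift is to score the source text and its machine translation with the same multilingual sentiment classifier. The sketch below is illustrative only: both model names are assumptions (public Hugging Face checkpoints), the sample sentence is invented, and a rigorous study would use validated sentiment instruments for each language. Spanish is used here because the assumed classifier covers it.

```python
# Sketch of a sentiment-shift check across translation. Assumed models:
# Helsinki-NLP/opus-mt-en-es for en->es translation, and
# nlptown/bert-base-multilingual-uncased-sentiment, which rates text 1-5 stars.
from transformers import pipeline

translate = pipeline("translation", model="Helsinki-NLP/opus-mt-en-es")
sentiment = pipeline(
    "sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)

# An invented neutral public-health sentence, standing in for real materials.
source = "Anxiety is a common condition that can be managed with support."
target = translate(source)[0]["translation_text"]

src_score = sentiment(source)[0]   # e.g. {"label": "3 stars", "score": ...}
tgt_score = sentiment(target)[0]

# A lower star rating for the translation suggests the target text
# reads more negatively than the neutral English source.
print(source, "->", src_score["label"])
print(target, "->", tgt_score["label"])
```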
When businesses and organizations produce translations that don’t meet current standards of inclusivity, their reputations can suffer.