No, this article was not written with AI. You know how you can tell? Because it’s got a bit of personality (mine), and even though it’s about artificial intelligence (arguably one of the most boring topics on the planet, in my opinion), this doesn’t read like a computer generated it. (Just me, standing at my very-expensive standing desk, writing away on my laptop!)

Which gets us to the reason for this article: a new study on AI. Researchers from Cornell University looked at how Western-centric AI models provide writing suggestions to users from different cultural backgrounds. The study, titled “AI Suggestions Homogenize Writing Toward Western Styles and Diminish Cultural Nuances,” included 118 participants from India and the United States. And it found that when Indians and Americans used AI writing assistance, the benefits often came at the expense of the Indians in the group.

Why, you ask? Even though the tools helped both groups write faster, the Indian writers had to keep correcting the AI’s suggestions, resulting in a smaller productivity boost. One reason is that AI tools like ChatGPT, primarily developed by American tech companies, are powered by large language models that don’t capture the linguistic nuances of the 85% of the world’s population who live in the Global South, many of whom are using AI-writing tools. (The Global South is generally defined as the countries of Africa, Asia, and Latin America, largely in the Southern Hemisphere and often considered developing or less developed than their northern counterparts.)

The researchers had the two groups write about cultural topics like food and holidays. Half used an AI writing assistant that gave autocomplete suggestions. The writing samples showed that the Indian participants kept 25% of the suggestions while the Americans kept only 19%, but the Indian writers also made significantly more modifications to those suggestions, rendering them less helpful.
For example, when some of the Indian participants wrote about food, a common suggestion was pizza. Or when they wrote about holidays, the AI tool suggested Christmas. In short, this study shows AI isn’t all it’s cracked up to be, and benefits some users more than others.

“This is one of the first studies, if not the first, to show that the use of AI in writing could lead to cultural stereotyping and language homogenization,” according to one of the study’s authors, Aditya Vashistha, an assistant professor of information science. “People start writing similarly to others, and that’s not what we want. One of the beautiful things about the world is the diversity that we have.”

The study’s main author, Dhruv Agarwal, a doctoral student in the field of information science, said that although the technology brings a lot of value into people’s lives, “for that value to be equitable and for these products to do well in these markets, tech companies need to focus on cultural aspects, rather than just language aspects.”