
   Technology
Facebook's unglamorous mistakes
Date: 26-04-2024

(Illustration: Irene Suosalo/The New York Times)

In a Facebook group for gardeners, the social network’s automated systems sometimes flagged discussions about a common backyard tool as inappropriate sexual talk.

Facebook froze the accounts of some Native Americans years ago because its computers mistakenly believed that names like Lance Browneyes were fake.

The company repeatedly rejected ads from businesses that sell clothing for people with disabilities, mostly in a mix-up that confused the products for medical promotions, which are against its rules.

Facebook, which has renamed itself Meta, and other social networks must make tricky judgment calls, balancing support for free expression with keeping out unwanted material like imagery of child sexual abuse, violent incitement and financial scams. But that's not what happened in the examples above. Those were mistakes made by a computer that couldn't handle nuance.
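Mistakes like the gardening-group example tend to come from context-free pattern matching. The toy sketch below is not Meta's actual moderation pipeline; the flagged terms and posts are invented for illustration. It shows how a filter that only looks for words, not their sense, inevitably produces false positives on innocent text:

```python
import re

# Deliberately naive keyword flagger -- an illustrative assumption,
# NOT Meta's real system. It checks whether any flagged term appears,
# with no awareness of the surrounding context.
FLAGGED_TERMS = {"hoe", "weed"}  # words with both innocent and slang senses

def flag_post(text: str) -> bool:
    """Return True if any flagged term appears, ignoring context."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return not FLAGGED_TERMS.isdisjoint(words)

# A gardening post trips the filter even though it is clearly innocent.
print(flag_post("Which hoe works best for raised beds?"))  # True: false positive
print(flag_post("My tomato seedlings arrived today."))     # False
```

Real systems use machine-learned classifiers rather than word lists, but the underlying failure mode the article describes is the same: without understanding context, benign posts get swept up.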

Social networks are essential public spaces that are too big and fast-moving for anyone to effectively manage. Wrong calls happen.

These unglamorous mistakes aren’t as momentous as deciding whether Facebook should kick the former US president off its website. But ordinary people, businesses and groups serving the public interest like news organisations suffer when social networks cut off their accounts and they can’t find help or figure out what they did wrong.

This doesn’t happen often, but a small percentage of mistakes at Facebook’s size add up. The Wall Street Journal calculated that Facebook might make roughly 200,000 wrong calls a day.
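The arithmetic behind that point is simple: at Facebook's volume, even an excellent accuracy rate produces an enormous absolute number of errors. The figures below are illustrative assumptions, not numbers from the article or from Meta; they are just one combination that lands on the Journal's order of magnitude:

```python
# Back-of-envelope: why a tiny error rate adds up at Facebook's scale.
# Both inputs are assumed for illustration only.
daily_moderation_decisions = 2_000_000_000  # assumed decisions per day
error_rate = 0.0001                         # assumed 0.01% wrong calls

wrong_calls_per_day = daily_moderation_decisions * error_rate
print(f"{wrong_calls_per_day:,.0f} wrong calls per day")  # 200,000 wrong calls per day
```

In other words, a system that is 99.99% accurate under these assumed volumes would still wrongly affect hundreds of thousands of people every day.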


People who research social networks told me that Facebook — and its peers, although I’ll focus on Facebook here — could do far more to make fewer mistakes and mitigate the harm when it does mess up.

The errors also raise a bigger question: Are we OK with companies being so essential that when they don’t fix mistakes, there’s not much we can do?

The company’s critics and the semi-independent Facebook Oversight Board have repeatedly said that Facebook needs to make it easier for users whose posts were deleted or accounts were disabled to understand what rules they broke and appeal judgment calls. Facebook has done some of this, but not enough.

Researchers also want to dig into Facebook’s data to analyse its decision making and how often it messes up. The company tends to oppose that idea as an intrusion on its users’ privacy.

Facebook has said that it’s working to be more transparent, and that it spends billions of dollars on computer systems and people to oversee communications in its apps. People will disagree with its decisions on posts no matter what.

But its critics again say it hasn’t done enough.


“These are legitimately hard problems, and I wouldn’t want to make these trade-offs and decisions,” said Evelyn Douek, a senior research fellow at the Knight First Amendment Institute at Columbia University. “But I don’t think they’ve tried everything yet or invested enough resources to say that we have the optimal number of errors.”

Most companies that make mistakes face serious consequences. Facebook rarely does. Ryan Calo, a professor at the University of Washington law school, drew a comparison between Facebook and building demolition.

When companies tear down buildings, debris or vibrations might damage property or even injure people. Calo told me that because of the inherent risks, US laws hold demolition companies to a high standard of accountability. The firms must take safety precautions and possibly cover any damages. Those potential consequences ideally make them more careful.

But Calo said that laws that govern responsibility on the internet didn’t do enough to likewise hold companies accountable for the harm that information, or restricting it, can cause.

“It’s time to stop pretending like this is so different from other types of societal harms,” Calo said.

 

© 2022 The New York Times Company



  
Chief Advisor: Md. Tajul Islam,
Editor & Publisher Fatima Islam Tania and Printed from Bismillah Printing Press,
219, Fakirapul, Dhaka-1000.
Editorial Office: 167 Eden Complex, Motijheel, Dhaka-1000.
Phone: 02-224401310, Mobile: 01720090514, E-mail: muslimtimes19@gmail.com