
Feminism: Sex and gender discussions

Interesting article in Engineering and Technology Magazine

13 replies

JellySlice · 01/06/2019 09:40

We talk here about the self-imposed echo-chambers created by blocking lists and the NoDebate attitude. It hadn't occurred to me that these echo-chambers could be imposed by Google, too. How do personal or corporate politics influence the writing of algorithms?

"It is an algorithm's basic function to learn and discriminate. But in such a feedback situation, unbounded machine learning will reinforce the very biases that societies are fighting against."

"Operant conditioning is how you brainwash mammals, and more and more algorithms are programmed to do that."

"There is no algo rule-list of dos and don'ts, no Asimov 'laws of robotics' to protect us from programmes that are designed to prey on us..."

OP posts:
TheInebriati · 01/06/2019 09:48

We've also seen that society is not actually fighting against bias, it's just evolving new ways to present bias as socially acceptable.

arranbubonicplague · 01/06/2019 10:22

There are several books that address the topic of what's known as algorithmic injustice in the fields of crime & justice, healthcare, education, finance - you get the idea, pretty much every area of your life.

Discriminating or abusing by default is already happening. That looks like a good overview of the issue, thanks for flagging the article.

JellySlice · 01/06/2019 10:57

Isn't algorithmic injustice usually about how humans apply the algorithm, i.e. acting without considering consequences because of a lack of agency or oversight?

OP posts:
JellySlice · 01/06/2019 10:58

Here's the article BTW (phone had a hissy fit earlier.)

Interesting article in Engineering and Technology Magazine
OP posts:
TheInebriati · 01/06/2019 13:10

I can read that if I open it in Paint and increase it to 200%. It makes some good points.
I'm getting increasingly annoyed about agencies/companies/organisations that just ignore the law and do their own thing, and this looks like yet another example.

JellySlice · 01/06/2019 13:46

agencies/companies/organisations that just ignore the law and do their own thing

It would appear that there isn't any relevant legislation.

OP posts:
Goosefoot · 02/06/2019 03:05

Isn't algorithmic injustice usually about how humans apply the algorithm, i.e. acting without considering consequences because of a lack of agency or oversight?

I'd not say so, though trying to find the problems can always help. But I think it comes down to what an algorithm is: its basic function is to discriminate, so it can't be unbiased.

dianebrewster · 02/06/2019 07:05

Algorithms are human constructs. We write algorithms to perform certain tasks. They are a set of rules used to sort data. We say something like "I want to pick out all the candidates predicted to go on to reach the top of this profession, to put on a fast-track programme", so we might do that by asking the system to analyse those already at the top and extract common characteristics so we can spot them early. Oh look, wealthy white males. Who would have thought. 🤨

Every time we write code we are potentially embedding our own prejudices into it and an algorithm, in this context, is a bit of code.
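As a toy sketch of that "extract common characteristics" step (Python, with entirely invented records and a made-up threshold), here is how the bias creeps in without anyone writing a discriminatory rule:

```python
# Toy illustration: learning "leader traits" from historical data.
# All records, names and the 0.6 threshold are hypothetical.
from collections import Counter

# Past promotion decisions, not intrinsic merit, define the label we learn from.
past_leaders = [
    {"gender": "male", "school": "private", "degree": "engineering"},
    {"gender": "male", "school": "private", "degree": "engineering"},
    {"gender": "male", "school": "state",   "degree": "engineering"},
]

def common_traits(records, threshold=0.6):
    """Return the (attribute, value) pairs shared by at least
    `threshold` of the historical leaders."""
    counts = Counter()
    for record in records:
        for item in record.items():
            counts[item] += 1
    n = len(records)
    return {trait for trait, c in counts.items() if c / n >= threshold}

def score(candidate, traits):
    """Score a candidate by how many 'leader traits' they match."""
    return sum(1 for item in candidate.items() if item in traits)

traits = common_traits(past_leaders)

# Two candidates with identical qualifications: the woman scores lower
# purely because ('gender', 'male') was a common trait of past leaders.
alice = {"gender": "female", "school": "private", "degree": "engineering"}
bob   = {"gender": "male",   "school": "private", "degree": "engineering"}
```

Nobody asked the system to prefer men; it simply learned that "male" was a common characteristic of those already at the top.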

TeiTetua · 02/06/2019 19:29

One thing I've seen in the news recently is the tendency of algorithms used by dating sites to amplify users' preferences, or prejudices. On sites that present people with matches based on what they've liked before, or what people similar to them have liked, the result is that some profiles will end up with very few matches. It's not that the algorithm was designed to exclude anyone, but that's the way it's worked in practice. The good news is that some of the sites are changing their software to be more fair.
www.huffpost.com/entry/dating-apps-may-reinforce-sexual-racism-study_n_5bf3056ae4b0376c9e67abe1
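That rich-get-richer dynamic can be sketched in a few lines (a deliberately crude toy with invented numbers, not any real site's matcher): if the recommender always surfaces the currently most-liked profile, an early head start compounds forever.

```python
# Toy feedback loop: the matcher recommends whichever profile already has
# the most likes, and we assume each recommendation converts into a like.
def run_matcher(likes, rounds):
    """Simulate `rounds` recommendations, mutating `likes` in place."""
    for _ in range(rounds):
        top = max(likes, key=likes.get)  # most-liked profile wins exposure
        likes[top] += 1                  # ...and so gets still more liked
    return likes

# Profile A starts one like ahead of B; B is never surfaced again.
likes = {"A": 10, "B": 9}
run_matcher(likes, 100)
```

The algorithm was never designed to exclude B; the exclusion is an emergent property of amplifying past preferences.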

nickymanchester · 03/06/2019 13:45

Isn't algorithmic injustice usually about how humans apply the algorithm, i.e. acting without considering consequences because of a lack of agency or oversight?

No, because these injustices usually stem from the biases of the people writing the algorithm or the data that are used to develop the algorithm.

Just a couple of examples: in the US, the courts use an automated system that rates black defendants as much more likely to reoffend than white defendants, even after allowing for age, gender, previous crimes etc.

Then, of course, there was the infamous time that Amazon tried to automate its hiring process. Because in the past it had hired largely white men, the algorithm learned that any reference in a person's CV to being in a sorority, or to female-oriented sports or activities, meant they were likely to be less successful in the application process.

So, guess what happened, the algorithm just automated those earlier biases. Here's an article from The Guardian about it:-

Amazon ditched AI recruiting tool that favored men for technical jobs
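A toy sketch of that proxy effect (invented CVs and outcomes, nothing like Amazon's actual model): the screener never sees gender at all, yet words correlated with it in the biased historical outcomes end up with negative weights.

```python
# Toy proxy-bias demo: weight each CV word by how much the historical
# hire rate for CVs containing it differs from the overall hire rate.
# The CVs and labels below are entirely invented.
from collections import defaultdict

# (words on CV, hired?) pairs reflecting a biased past process:
history = [
    (["captain", "chess"],   1),
    (["captain", "rowing"],  1),
    (["sorority", "chess"],  0),
    (["sorority", "rowing"], 0),
]

def word_weights(history):
    """Weight = hire rate among CVs containing the word, minus base rate."""
    base = sum(label for _, label in history) / len(history)
    totals, hires = defaultdict(int), defaultdict(int)
    for words, label in history:
        for w in set(words):
            totals[w] += 1
            hires[w] += label
    return {w: hires[w] / totals[w] - base for w in totals}

weights = word_weights(history)
# 'sorority' gets a negative weight purely because the people who listed
# it were rejected in the past; 'chess' and 'rowing' stay neutral.
```

So removing the "gender" column does nothing: the bias re-enters through whatever correlates with it.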

TheInebriati · 03/06/2019 13:52

The effect is disproportionate to what one individual or company could normally achieve. Usually, to have that level of influence, someone would need to be massively charismatic and influential.

arranbubonicplague · 04/06/2019 01:53

As another unintended consequence of algorithms:

YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found.

YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation

twitter.com/Max_Fisher/status/1135529605530411008

Links to NYT piece but your access might be limited.

SpeakUpXXWomen · 04/06/2019 16:57

This just reinforces the importance of searching for information NOT through google and using a wider range of search engines. There will always be an algorithm direction taken but the chances are it will be slightly less engineered.

Search feminism news on duckduckgo and you will get very different results to the same search on Google for example. Don't log in to view things, use private windows and containers to block trackers. Turn off microphones.

If nothing else do random searches to mess with the algorithms, I know people who run their entire social media like that - just purposefully feeding it big fat fibs to mess with the marketing Grin

Obviously never ever search transgender trend on google. Especially the education downloads page, don't go there and hang around for 10 minutes or so whatever you do.
