Feminist Pub XX - may the summer rains wash the patriarchy down the plughole

NoTechnologicalBreakdown · 07/08/2015 08:17

Ooh ooh! Do I get to start it?

Wine and cake all round. And a celebratory burst on the patriarchy-blasting cannon!

Old pub here

kickassangel · 19/08/2015 04:18

Scrub that, just went on the thread about it. Now I need to change all my passwords, which is long overdue, but we're on vacation for another 10 days.

HoVis2001 · 19/08/2015 10:40

Oooh, a new pub!

Just came across this: www.theguardian.com/education/2015/aug/18/female-composers-a-level-music-syllabus-petition

And am now trying, very, very gently, to suggest that a friend of a friend check his male privilege, as he's busy saying he never saw the need for people to have role models of the same gender. Hmm. Might it be because he's never had the problem of only ever being presented with role models of the opposite gender?!

And back to work... Blush

INickedAName · 19/08/2015 11:28

I should have read your second post, kickass - I've just sent you a long pm. Ignore it :)

HoVis2001 · 19/08/2015 12:02

INicked

I haven't read it, but at 99p I'm quite tempted to buy it for some train reading on my tablet!

Online sexism really interests me. I've often wondered if the internet in fact makes the world seem/feel more misogynistic, as you often get the extreme views shouted the loudest, if that makes any sense? I've certainly felt as if attitudes relating to gender have gone downhill during my lifetime, but I can't decide if that is just a perception issue on my part...

ChunkyPickle · 19/08/2015 12:10

HoVis - when younger and more naive I didn't get it either (along with a lot of things that this board has opened my eyes and ears about), but the way it was explained to me is that you're not trying to represent what's already there, you're trying to showcase what's available.

If Apple made a rainbow of iPods, but the only ones they put out in the shop were the red ones because that's what most people had historically bought, how would you even know that it was possible to get a green one?

If girls never even hear of female composers, why would they even think of it as a possibility (apart from a few very determined exceptions who've fought all the way to get there)?

ChunkyPickle · 19/08/2015 12:12

And of course there's the extra bit: if you don't know you can get a green iPod, then no-one asks for them, so your sales team might decide that's because no-one wants green iPods, when really the conclusion should be that people didn't think they could get green iPods.

HoVis2001 · 19/08/2015 13:27

Chunky

I love that way of putting it! Especially the part about sales teams deciding no one wants green iPods because no one asked for them.

ChunkyPickle · 19/08/2015 14:09

I can't take credit - I think it came from someone else in the pub ages back...

INickedAName · 19/08/2015 15:28

I liked the phrase "you can't be what you can't see". I don't know where I heard it, or who said it, but it's made a lot of things click for me.

MsUnicorn · 19/08/2015 16:08

I really like those ways of describing the problem. Dd1 has announced this week that she wants to be a particle physicist Grin I know she's never met a female physicist - mind you, she's never met a male physicist either - but I feel like I've successfully shown her that ANY career is possible for her; she's not in the slightest bit bothered that it's quite male dominated. Grin Obviously I'd be proud of her whatever she decides to do, but I quite fancy having a particle physicist in the family.

EBearhug · 19/08/2015 16:48

I watched a programme on BBC4 last night about Marie Curie, and I wondered where the idea that women aren't so good at maths/science ever came from. (Obviously I do know, and the point was made in the programme, too.) Still the only person to have been awarded Nobel Prizes in two different sciences.

Meanwhile, someone at work was talking about office heating/aircon. She didn't want to be sexist, but women feel the cold more, so the temperature should suit them, as they are the majority (she's in the US - the business model and staff demographics there are somewhat different, with a lot more non-techy staff). Anyway, I don't agree, but I'm mostly feeling miffed that I had to change my response to say gender rather than sex. Sex appears to be a banned word. On some days, I'd feel like finding the admins to ask them to reconsider, but I really can't be bothered.

INickedAName · 19/08/2015 23:54

Just read there's another DDoS attack on MN. He's also leaked all of Mumsnet's partner company emails.

Found the published list of usernames if anyone wants it. I've not checked all of it yet, but I recognise a lot of the names.

INickedAName · 20/08/2015 13:53

Link to this article popped up on Twitter today. Not read any of the comments yet.

Sexual Harassment

JeanneDeMontbaston · 21/08/2015 11:59

Is anyone watching Great British Menu? I'm sitting here doing some proofreading with it in the background and I have just found myself saying 'fuck off' at the man who thinks he's 'celebrating' women by preparing them food that looks like cosmetics with a little note thanking women for 'having time to spare'.

Hmm

YonicScrewdriver · 21/08/2015 12:02

Haven't seen it, Jeanne, but sounds shite!

JeanneDeMontbaston · 21/08/2015 12:06

It's really irritating me (I know, I know, I should stop watching).

They always have loads of male chefs, but I always watch it rooting for the women and they've had more and more women on there since they started. It just makes me sad that the BBC can show something that's so utterly patronising.

EBearhug · 21/08/2015 13:03

Found this article interesting (in a depressing way) - www.wsj.com/articles/computers-are-showing-their-biases-and-tech-firms-are-concerned-1440102894

MsMermaid · 21/08/2015 13:13

I can't read that ebear, not without subscribing. Can you summarise it for me? It sounds interesting.

Jeanne, that sounds incredibly irritating and patronising. I'm glad I don't watch it.

EBearhug · 21/08/2015 13:48

Hmm, I googled it, and it brings up a non-subscription preview version. (I went through my phone, as we may well have a subscription through work, I don't know.)

Basically, auto-tagging algorithms and so on mean that ad targeting is discriminatory - which is important when it comes to things like job searches.

(I am not surprised. Most programmers are men.)

EBearhug · 21/08/2015 13:50

This may get zapped because of copyright...

Social Bias Creeps Into New Web Technology
By Elizabeth Dwoskin
The Wall Street Journal
Fri, 21 Aug 2015
Copyright © 2015, Dow Jones & Company, Inc.

Software tags our photos, recommends products to buy online and serves up ads based on our interests. Increasingly, it also plays a role in more consequential decisions, such as who gets a job or loan, or who pays surge pricing in a ride-sharing app.

But computer programs that crunch immense amounts of data to render decisions or predictions can go embarrassingly, sometimes troublingly wrong.

In May, Flickr, a division of Yahoo Inc., rolled out software that recognized objects in photos uploaded by users and tagged them accordingly: car, boat, cat, dog. But what was supposed to be a snazzy feature became a public-relations nightmare when the online photo-sharing service tagged a photo of a black man with the word "ape" and a picture of a concentration camp as a "jungle gym." Google Inc. ran into the same problem in June, when the company's auto-tagging feature mislabeled a photo of a black man with the word "gorilla."

Such errors can go beyond insensitivity and insult to arbitrarily limit people's opportunities. Carnegie Mellon University researchers examining Google's ad-targeting system recently found that male Web users were six times more likely than female users to be shown ads for high-paying jobs. Veterans have complained that they were automatically disqualified for civilian jobs because human-resources software used by the employers didn't recognize the skills they learned in the military.

While automation is often thought to eliminate flaws in human judgment, bias (the tendency to favor one outcome over another in potentially unfair ways) can creep into complex computer code. Programmers may embed biases without realizing it, and they can be difficult to spot and root out. The results can alienate customers and expose companies to legal risk. Computer scientists are just starting to study the problem and devise ways to guard against it.

"Computers aren't magically less biased than people, and people don't know their blind spots," said Vivienne Ming, a data scientist and entrepreneur who advises venture capitalists on artificial intelligence technology.

Many data scientists believe that the benefits of such technology outweigh the risks. As chief scientist at Gild Inc., a technology startup that makes software to help identify promising job candidates, Ms. Ming devised programs that she says helped recruiters consider a broader set of candidates than they would otherwise. Recruiters often base their decisions on sharply limited criteria -- for instance, disqualifying talented candidates who didn't attend a top school. Software that weighs more variables allowed recruiters to cast a wider net, she said.

Yet unintended negative outcomes are "definitely a risk," said Adeyemi Ajao, vice president of technology strategy and a data scientist at Workday, a software company using complex statistical formulas for human-resources management. "I don't think it's possible to eliminate it 100%."

One common error is endemic to a popular software technique called machine learning, said Andrew Selbst, co-author of "Big Data's Disparate Impact," a paper to be published next year by the California Law Review. Programs that are designed to "learn" begin with a limited set of training data and then refine what they've learned based on data they encounter in the real world, such as on the Internet. Machine-learning software adopts and often amplifies biases in either data set.

In the type of image-tagging programs used by Google and others, software learns to distinguish people in photos by finding common patterns in millions of images of people. If the training data predominantly depicts white people, the software won't learn to recognize people who look different.
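
As a rough illustration of that training-data problem - a toy sketch with invented data, assuming Python with NumPy and scikit-learn, not Google's or anyone's actual code. Each group's labels depend on a different feature, standing in for the different visual patterns in the face examples, and the model sees a hundred times more of one group than the other:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    def make_group(n, label_col, group_id):
        # Labels for each group depend on a *different* feature column.
        X = rng.normal(size=(n, 10))
        y = (X[:, label_col] > 0).astype(int)
        return np.hstack([X, np.full((n, 1), group_id)]), y

    # Training set: 5,000 examples from group A, only 50 from group B.
    Xa, ya = make_group(5000, label_col=0, group_id=0)
    Xb, yb = make_group(50, label_col=1, group_id=1)
    model = RandomForestClassifier(random_state=0).fit(
        np.vstack([Xa, Xb]), np.hstack([ya, yb]))

    # Balanced held-out sets: group A scores well; the under-represented
    # group B lags far behind, close to chance.
    for name, (X, y) in {"A": make_group(2000, 0, 0), "B": make_group(2000, 1, 1)}.items():
        print(name, round(model.score(X, y), 3))

The model isn't malicious; it has simply never seen enough of group B to learn the pattern that applies to them.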

Google acknowledged the error in its image-tagging software and said it was working to fix the problem, but declined to comment further.

Paul Viola, a former Massachusetts Institute of Technology engineer who helped pioneer such techniques, said he encountered similar problems 15 years ago, and that they're hard to tackle. Back then, he built a software program that would comb through images online and try to detect objects in them. The program could easily recognize white faces, but it had trouble detecting faces of Asians and blacks. Mr. Viola eventually traced the error back to the source: In his original data set of about 5,000 images, whites predominated.

The problem got worse as the program processed images it found on the Internet, he said, because the Internet, too, had more images of whites than blacks. The software's familiarity with a larger set of pictures sharpened its knowledge of faces, but it also solidified the program's limited understanding of human differences.

To fix the problem, Mr. Viola added more images of diverse faces into his training data, he said.

Mr. Viola's ability to trace the problem back to the source was unusual, said Mr. Selbst. More often than not, the culprit is hard to pinpoint. Two common reasons: The software is proprietary and not available for examination, and the formula, or algorithm, used by the computer is extremely complex. An image-detection algorithm, for instance, may use hundreds of thousands of variables.

Take recent research from Carnegie Mellon that found male Web users were far more likely than female users to be shown Google ads for high-paying jobs. The researchers couldn't say whether this outcome was the fault of advertisers who may have chosen to target ads for higher-paying jobs to male users or of Google algorithms, which tend to display similar ads to similar people. If Google's software notices men gravitating toward ads for high-paying jobs, the company's algorithm will automatically show that type of ad to men, the researchers said.
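
That feedback loop is easy to reproduce in miniature. A toy simulation (my own construction in plain Python, with invented click rates - not the researchers' code): the server always shows the ad to whichever group has the higher observed click-through rate, and a tiny difference in real interest snowballs into a huge disparity in who sees the ad at all.

    import random

    random.seed(1)
    true_ctr = {"men": 0.11, "women": 0.10}   # nearly identical real interest
    shown = {"men": 1, "women": 1}            # impression counts (smoothed)
    clicks = {"men": 1, "women": 1}

    for _ in range(100_000):
        # Greedily serve the high-paying-job ad to the "better" group.
        group = max(shown, key=lambda g: clicks[g] / shown[g])
        shown[group] += 1
        if random.random() < true_ctr[group]:
            clicks[group] += 1

    print(shown)  # impressions end up overwhelmingly on one group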

Google declined to comment.

Regardless of their source, the only way to detect subtle flaws in such complex software is to test it with a large number of users, said Markus Spiering, a former Yahoo Inc. product manager who oversaw the company's Flickr division. To improve an image-detection program quickly enough to be competitive, he said, it is necessary to let the public use it, and that means running the risk of making public mistakes.

Yahoo didn't respond to requests for comment.

Data scientists say software bias can be minimized by what amounts to building affirmative action into a complex statistical model, such as Mr. Viola introducing more diverse faces.

"You can plan for diversity," said T.M. Ravi, co-founder and director of the Hive, an incubator for data-analytics startups.

Mr. Selbst, along with the Carnegie Mellon technologists, and others are among the pioneers of an emerging discipline known as algorithmic accountability. These academics, who hail from computer science, law and sociology, try to pinpoint what causes software to produce these types of flaws, and find ways to mitigate them.

Researchers at Princeton University's Web Transparency and Accountability Project, for example, have created software robots that surf the Web in patterns designed to make them appear to be human users who are rich or poor, male or female, or suffering from mental-health issues. The researchers are trying to determine whether search results, ads, job postings and the like differ depending on these classifications.
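
A rough sketch of how such an audit can work (my own construction, not the Princeton project's actual code; fetch_ads_for() is a hypothetical stand-in for driving a real browser, and SciPy is assumed for the statistics): run many simulated personas, record which ads each is served, and test whether the distributions differ more than chance allows.

    import random
    from collections import Counter
    from scipy.stats import chi2_contingency

    def fetch_ads_for(persona):
        # Hypothetical placeholder: a real audit would browse the web with
        # a scripted profile and scrape the ads actually served.
        weights = {"male": [0.5, 0.5], "female": [0.8, 0.2]}[persona]
        return random.choices(["low_pay_ad", "high_pay_ad"], weights=weights)[0]

    random.seed(0)
    personas = ("male", "female")
    counts = {p: Counter(fetch_ads_for(p) for _ in range(500)) for p in personas}
    table = [[counts[p][ad] for ad in ("low_pay_ad", "high_pay_ad")] for p in personas]
    chi2, pvalue, _, _ = chi2_contingency(table)
    print(counts, pvalue)  # a small p-value flags a systematic difference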

One of the biggest challenges, they say, is that it isn't always clear whether the powerful correlations revealed by data-mining are biased. Xerox Corp., for example, quit looking at job applicants' commuting time even though software showed that customer-service employees with the shortest commutes were likely to keep their jobs at Xerox longer. Xerox managers ultimately decided that the information could put applicants from minority neighborhoods at a disadvantage in the hiring process.

"Algorithms reproduce old patterns of discrimination," Mr. Selbst said, "And create new challenges."

FinglesMcStingles · 21/08/2015 14:01

That was too long for my tired brain to contemplate. Pint of tea to wash the name-change down, please.

YonicScrewdriver · 21/08/2015 14:15

That was interesting.

MsMermaid · 21/08/2015 14:48

That was interesting, thank you ebear. It makes sense that when computers are trying to target people for things, they perpetuate the current reality, complete with bias, prejudice and stereotypes. I'm glad it's being highlighted now, so programmers can think more carefully about these things when they're writing their software.

UptoapointLordCopper · 22/08/2015 09:47

Been away. Will catch up.

Am thinking about name change. Need new name...

iamaboveandBeyond · 23/08/2015 08:08

Hi everyone Wine

Just marking my place, I keep losing track of the pub but I am reading :)