5 algorithms that are already making decisions about your life, and that you may not have known about


A video showing a passenger being dragged off a United Airlines plane at O’Hare Airport in Chicago went viral last April.

The video was bad publicity for the US airline, whose employees wanted the passenger, a doctor named David Dao, to give up his seat to a company pilot who needed to reach the flight’s destination, Louisville, for a crew change.

However, only a handful of the many criticisms that rained down on the company addressed a crucial question: how was it determined that Dao should be the one to give up his seat?

The decision to remove Dao was made by a machine. More specifically, by a piece of software that had probably scored the doctor long before he set foot in the airport.

This is just one example of how algorithms are, invisibly and behind the scenes, making decisions that affect our lives.

And we are not talking about the algorithms of Google, Facebook or Netflix, which filter what we see or make suggestions based on our previous choices: unlike those, the algorithms below are not directly driven by our own actions.

1. Artificial intelligence decides if you are going to have a job or not

Resumes and CVs are now more likely than ever to be discarded without ever passing through human hands or before human eyes.

[Image: a robot and a human shaking hands (Getty Images). Caption: In the not-so-distant future…]

That is because, every day, more recruitment companies are adopting applicant tracking systems (ATS): programs that manage hiring processes and, in particular, screen the hundreds (or thousands) of initial applications.

In the United States, it is estimated that 70% of job applications are filtered out before ever being reviewed by a human.
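To make the idea concrete, here is a minimal sketch, in Python, of the kind of keyword screening such a system can perform; the keywords, weights and cutoff are invented for illustration and do not come from any real vendor.

```python
# Minimal sketch of ATS-style keyword screening.
# The keywords, weights and cutoff below are illustrative
# assumptions, not any real vendor's scoring rules.

REQUIRED_KEYWORDS = {"python": 3, "sql": 2, "etl": 1}  # hypothetical job posting
CUTOFF = 4  # hypothetical minimum score to reach a human reviewer

def screen_resume(resume_text: str) -> bool:
    """Return True if the resume scores high enough to be seen by a human."""
    text = resume_text.lower()
    score = sum(weight for kw, weight in REQUIRED_KEYWORDS.items() if kw in text)
    return score >= CUTOFF

applications = [
    "Data engineer with Python, SQL and ETL pipeline experience.",
    "Marketing specialist with social media expertise.",
]
passed = [app for app in applications if screen_resume(app)]
print(f"{len(passed)} of {len(applications)} applications reach a recruiter")
```

A filter this crude also illustrates the bias problem discussed below: any applicant who describes the same experience in different words never reaches a human at all.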

For companies, this saves time and money in the process of hiring new employees.

However, this system has generated questions about the neutrality of the algorithms.

In an article in the Harvard Business Review, academics Gideon Mann and Cathy O’Neil argue that these programs are not free of human prejudice and bias, which means the artificial intelligence behind them may not be truly objective.

2. Do you want a loan? Your social media profile could get in the way

Historically, when someone applied for a loan from a financial institution, the answer was based on a direct analysis of their ability to pay: their debt-to-income ratio and their credit history.

[Image: a scene in a bank (Getty Images). Caption: This loan may have been approved by an algorithm.]

But this is no longer the case: the ability to repay a loan is now evaluated by algorithms that pull together data from many sources, ranging from purchasing patterns to internet searches and activity on social networks.
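As a rough illustration of how such a composite score might be assembled, here is a minimal sketch; the features and weights are invented for the example, since real lenders’ models are proprietary.

```python
# Rough sketch of a credit score that blends traditional and
# non-traditional signals. All features and weights are invented
# for illustration; real lenders' models are proprietary.

def credit_score(debt_to_income: float,
                 on_time_payment_rate: float,
                 night_shopping_rate: float,
                 risky_search_rate: float) -> float:
    """Return a score in [0, 1]; higher means more creditworthy."""
    score = (
        0.45 * (1.0 - debt_to_income)    # traditional: debt vs. income
        + 0.35 * on_time_payment_rate    # traditional: payment history
        - 0.10 * night_shopping_rate     # non-traditional: buying patterns
        - 0.10 * risky_search_rate       # non-traditional: search activity
    )
    return max(0.0, min(1.0, score))

print(round(credit_score(0.30, 0.95, 0.20, 0.05), 2))  # prints 0.62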

The problem is that this non-traditional method can gather information without the knowledge or consent of the would-be borrowers.

Here, too, there are questions about the transparency and impartiality of the process.

3. It can help you find love, but maybe not the love you were hoping for

It is no surprise that online dating apps use algorithms to match couples.

In fact, it is part of their pitch to attract customers, especially to premium or paid services.

[Image: searching for a partner online (Getty Images). Caption: Finding love online: if only it were that simple.]

However, how they do it is much less clear.

Especially after eHarmony, one of the most successful dating sites on the planet, revealed last year that it adjusted the profiles of some customers with the idea of making them more “likable” and attractive.

That means ignoring some user preferences, such as their stated likes and dislikes.

And that is quite annoying for anyone who took the time to answer the 400 questions required to create an eHarmony profile.

But even simpler options like Tinder, where there are fewer variables (location, age and sexual preference), are not as transparent, or as random, as they appear.

Anyone who uses the app is assigned a secret “desirability rating” (that is, a measure of how attractive other users find them), which the company calculates with the idea of “facilitating better matches”.

The company has kept this formula secret, but its executives have given some clues.

For example, the number of times other users swipe you right or left (which is how Tinder users indicate whether or not they like someone) plays a very important role.
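One widely reported reading of those clues is an Elo-style rating of the kind used to rank chess players, where each swipe shifts your score by an amount that depends on how highly rated the swiper is. The sketch below follows that reading; the K-factor and starting rating are assumptions, since the actual formula remains secret.

```python
# Minimal sketch of an Elo-style desirability rating, one widely
# reported reading of Tinder's hints. The K-factor and starting
# rating are invented; the real formula is secret.

K = 32           # hypothetical update step
START = 1400.0   # hypothetical rating for new users

def expected(rating_a: float, rating_b: float) -> float:
    """Probability that A 'wins' (gets swiped right) against B's rating."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def swipe(target: float, swiper: float, liked: bool) -> float:
    """Update the target's rating after one right (liked) or left swipe."""
    outcome = 1.0 if liked else 0.0
    return target + K * (outcome - expected(target, swiper))

rating = START
rating = swipe(rating, 1700.0, liked=True)   # a highly rated user likes you
rating = swipe(rating, 1200.0, liked=False)  # a lower-rated user passes
print(round(rating, 1))
```

Under this scheme, a right swipe from a highly rated user raises your score far more than one from a low-rated user, which matches the executives’ hint that who swipes you matters as much as how often.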

4. A program can determine whether you are an addict, or whether you can get health insurance

Abuse of opioids, whether sold legally or illegally, is the leading cause of accidental death in the United States, and health experts often speak of an “epidemic” of consumption.

[Image: opioids (Getty Images). Caption: More than 400,000 people died of overdose in the United States in the past year.]

To attack the problem, scientists and the authorities are joining forces to create and run data-driven projects.

Most recently, in the state of Tennessee, health insurance provider Blue Cross and technology firm Fuzzy Logix announced an algorithm that analyzes no fewer than 742 variables to assess the risk of abuse and identify possible addicts.
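Neither company has published the model, but risk scores of this kind are often built as weighted combinations of patient variables passed through a logistic function. The sketch below illustrates that general shape only; every variable name and weight in it is invented.

```python
import math

# Minimal sketch of a variables-to-risk-score model of the kind
# described above. Blue Cross and Fuzzy Logix have not published
# their model; every variable name and weight here is invented.

WEIGHTS = {
    "prescriptions_per_year": 0.08,
    "distinct_prescribers": 0.40,
    "early_refills": 0.65,
    "prior_overdose": 2.10,
}
BIAS = -4.0  # hypothetical intercept

def abuse_risk(patient: dict) -> float:
    """Return an estimated probability of opioid abuse in [0, 1]."""
    z = BIAS + sum(WEIGHTS[name] * patient.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link

patient = {"prescriptions_per_year": 12, "distinct_prescribers": 3,
           "early_refills": 2, "prior_overdose": 0}
print(f"estimated risk: {abuse_risk(patient):.0%}")
```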

But of course this raised ethical questions: the data analyzed include medical histories and even the home addresses of the people being assessed.

The argument in favor is that this kind of intervention can save lives and can even reduce abuse of the system.

Opioid addicts are 59% more likely to become high-cost users of medical services.

Advocates of using artificial intelligence and algorithms in the industry say they could help guide decisions and reduce the unnecessary waste generated by human error.

5. A computer can send you to prison

Judges in at least 10 US states are handing down sentences with the help of a tool called COMPAS.

[Image: the scales of justice (Getty Images). Caption: Some courts in the United States are already using algorithms to define their sentences.]

COMPAS is a risk assessment algorithm that predicts the likelihood that an individual will commit another crime.

One of the best-known cases involving the use of COMPAS occurred in 2013, when Eric Loomis was sentenced to seven years in prison for fleeing from the police and driving a vehicle without the owner’s consent.

In preparing the sentence, the local authorities presented an assessment, based on an interview and on information about his criminal history, and Loomis was rated at “high risk of committing new crimes”.

His lawyers challenged the sentence on several grounds, one of which was that COMPAS had been developed by a private company and that information on how the algorithm works had never been disclosed.

They also argued that Loomis’s rights had been violated because the risk assessment took information about gender and race into account.

In fact, a 2016 analysis of more than 10,000 defendants in the state of Florida, published by the investigative group ProPublica, showed that black defendants were often rated as highly likely to reoffend, while white defendants were considered less likely to commit new crimes.
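The heart of that analysis can be expressed in a few lines: compare, group by group, how often people who did not reoffend were nonetheless labeled high risk (the false positive rate). The records below are invented toy data; ProPublica has published its real dataset and methodology.

```python
# Sketch of the disparity check at the heart of the ProPublica analysis:
# the false positive rate (non-reoffenders labeled high risk) per group.
# The records below are invented toy data, not ProPublica's dataset.

records = [
    # (group, labeled_high_risk, actually_reoffended)
    ("black", True, False), ("black", True, True), ("black", False, False),
    ("white", False, False), ("white", True, True), ("white", False, False),
]

def false_positive_rate(group: str) -> float:
    """Share of the group's non-reoffenders who were labeled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(group, f"FPR = {false_positive_rate(group):.0%}")
```

A gap between the two groups’ false positive rates is exactly the kind of disparity ProPublica reported, even when the tool’s overall accuracy looks similar for both groups.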

About author

Rava Desk

Rava is an online news portal providing recent news, editorials, opinions and advice on day-to-day happenings in Pakistan.
