Robot Overlords or Algorithms? They’re More Alike than You Think

January 30, 2023 · 5 min

Most of us have heard of social media algorithms (aka the concerning way Facebook always seems to know what you’ve recently looked up) and have some vague idea of what they are. However, we tend to overlook the much broader ways that algorithms are infiltrating and changing our daily lives.

Where we live, what colleges we attend, and even who we form relationships with are all heavily influenced by modern-day algorithms. Algorithms have so much impact on how we perceive the world around us and how we direct our “free will” that it starts to feel more like science fiction than science fact.

What’s an “algorithm” and how does it work?

The term algorithm can hold a lot of mystery for those who aren’t super familiar with tech or advanced mathematics. However, an algorithm is really just a series of steps used to determine actions or solutions to problems.

Turning those steps into a formula and integrating it into code is what drives a lot (if not most) of our online activity.

Programmers create algorithms that are triggered by certain actions, stored data, or user queries. Each algorithm responds according to its intended purpose, shaping our online journey.
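
To make this concrete, here is a minimal, purely illustrative sketch in Python. The names and data are invented, not any real platform’s code; it just shows how a recommendation-style algorithm might use stored data about past clicks to decide what a user sees next:

```python
# Toy sketch of a recommendation-style algorithm. All names and data are hypothetical.
from collections import Counter

def rank_items(click_history, candidate_items):
    """Rank candidate items by how often their topics appear in past clicks."""
    topic_counts = Counter(click_history)  # stored data about the user
    scored = []
    for item, topics in candidate_items.items():
        score = sum(topic_counts[t] for t in topics)  # familiar topics score higher
        scored.append((score, item))
    # Highest-scoring items are shown first; low scorers quietly disappear.
    return [item for score, item in sorted(scored, reverse=True)]

# Example: a user who mostly clicks fitness content keeps seeing fitness content.
history = ["fitness", "fitness", "recipes", "fitness"]
items = {
    "new workout plan": ["fitness"],
    "local news story": ["news", "politics"],
    "protein recipes": ["recipes", "fitness"],
}
print(rank_items(history, items))
# ['protein recipes', 'new workout plan', 'local news story']
```

Nothing in that loop is sinister on its own, but run it millions of times a day and it quietly decides most of what ends up in front of you.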

The good news is that this can deliver us some of the most relevant information and customize our online experiences. The bad news is that this can limit our visibility, create a virtual echo chamber, and influence the larger way we navigate the world.

If we’re only shown specific choices, then it can be argued that we lose our wider ability to weigh options and act on our own behalf. There’s also an argument to be made that with algorithms controlling the flow of current events into our lives, we’re existing within a technological bias.

At this point, we need to ask ourselves: are we influencing the algorithms, or are they influencing us?

How much of our lives are actually run by algorithms?

Now that we have a basic understanding of algorithms, we need to take a look at just how deeply ingrained they are in our everyday lives.

It’s easy to demonstrate how they’re involved in our online shopping activities, ad placements, and interests. However, there are deeper algorithms at play that can impact our ability to live and thrive.

Ever tried to take out a mortgage, online loan, or even a private student loan? Every one of those applications is initially decided by how an algorithm processes your information. Anyone who doesn’t meet certain criteria is usually quickly turned down. And there’s often little recourse, because the human workers at these companies tend to defer to the algorithm for their answers.
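
As a purely hypothetical illustration (these fields and thresholds are invented, not any lender’s actual criteria), an automated pre-screen can be as blunt as a handful of hard-coded rules, with no one on the other end to hear context:

```python
# Hypothetical loan pre-screen. Fields and thresholds are invented for illustration;
# real lending models are proprietary and far more complex.

def pre_screen(applicant):
    """Return an approve/deny decision from a few rigid rules."""
    if applicant["credit_score"] < 640:
        return "DENIED: credit score below threshold"
    if applicant["debt_to_income"] > 0.43:
        return "DENIED: debt-to-income ratio too high"
    if applicant["years_employed"] < 2:
        return "DENIED: insufficient employment history"
    return "PASSED to underwriting"

# One field outside the box ends the application, regardless of circumstances.
print(pre_screen({"credit_score": 720, "debt_to_income": 0.45, "years_employed": 6}))
# DENIED: debt-to-income ratio too high
```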

Law enforcement, custody agencies, educational institutions, and even corporate HR departments sometimes rely on additional algorithms to assess risk and make major decisions. Police presence, the assessed likelihood that a person committed a crime, some custody battles, and even hiring decisions are all subject to algorithmic influence.

When it comes down to these types of life-altering events, “data driven” takes on a whole new meaning. Especially when we find out that the data used to make these decisions isn’t always accurate.

Algorithms are also deeply involved in government.

A 2020 study conducted by Stanford University found that more than half of government agencies rely on some form of artificial intelligence fueled by algorithms to automate decision-making processes. These uses range from risk assessment, such as predictive policing efforts, to a multitude of decisions that directly impact private citizens.

While this tech was adopted to cut down on government error, it has led to some serious consequences when the data it’s based on turns out to be inaccurate. Algorithms can inherit the biases of their creators and be trained on historical data that is heavily skewed by human error. The results include systemic racism, prejudice against people living in poverty, and inaccurate recidivism risk assessments.
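
A toy sketch (with entirely invented numbers) shows how skewed historical data can masquerade as prediction: if past policing concentrated arrests in one neighborhood, a naive model trained on those records will keep pointing back at that neighborhood, producing a feedback loop rather than an insight.

```python
# Toy sketch of bias inherited from skewed historical data. Numbers are invented.
historical_arrests = {"neighborhood_A": 90, "neighborhood_B": 10}

def predicted_risk(neighborhood):
    """Naive 'model': future risk is just each neighborhood's share of past arrests."""
    total = sum(historical_arrests.values())
    return historical_arrests[neighborhood] / total

# More patrols go where the score is high, producing more recorded arrests there,
# which pushes the score even higher on the next pass.
print(predicted_risk("neighborhood_A"))  # 0.9
print(predicted_risk("neighborhood_B"))  # 0.1
```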

Regardless of how much technology can learn, it’s still only as good as the people who create and operate it. That doesn’t mean we should abandon AI and algorithms as tools for making smarter, faster decisions. It simply means we need to lessen our reliance on them and employ other systems of checks and balances to ensure accuracy.

Doing this entails limiting the amount of data we’re sharing on the internet, reducing the use of tracking cookies, being aware of how our data is being collected and used, and taking steps to limit the collection of data on our personal search histories.

People are deferring to algorithms so often that it’s changing the way their brains work.

In 2020, several studies began to emerge exploring the impact Instagram’s “weight and beauty-centric” algorithms were having on adolescents, particularly teenage girls. Teenagers were being fed an ongoing stream of unattainable beauty standards that created harmful internal feedback loops.

According to the American Psychological Association, the brains of children and adolescents crave social rewards more than adults’ brains do, particularly rewards that trigger the release of dopamine and oxytocin (“happy” hormones). In response, the brain forms habits around these reward systems, which can lead to long-term chemical and psychological changes.

In addition to the social rewards aspects of being online, we also see an increasing dependence on technology to replace our internal knowledge base. Digital dependence can have an impact on memory, our capacity for learning, and our ability to access new information without outsourcing the activity to technology.

Manfred Spitzer, a German neuroscientist, coined the term “digital dementia” to describe the potential impact that relying on technology could have on the human brain. While the topic has proven controversial, many experts agree that there is some truth to the theory. When we consider just how much of this information is routed through algorithms, we realize that much of our external perception is being engineered.

The goal is to learn how to live with and use technology in ways that enhance the human experience instead of replacing or controlling it. This can only be achieved through unbiased access to information and more control over how our data is used to structure our digital ecosystem. You can learn more about how to control the flow of data you're contributing online here.

No, we haven’t engineered our own robot overlords, at least not yet. But knowledge gives us the power to approach the world in a way fueled by our own internal resources rather than those provided strictly by machines.
