Our Blind Trust in Technology
For those of you who haven’t heard at this point, Mark Zuckerberg has big plans for Facebook in 2018 (hashtag goals). The plan? To fix it.
CBS This Morning recently covered a story on Zuckerberg and his goal to fix all the complications that Facebook has created. Throughout 2017, Facebook received a lot of backlash, not only for wasting people's most valuable resource (time), but more importantly for playing a role in the results of the presidential election, the spreading of "fake news," and the flood of hateful comments throughout the website.
Many believe that Facebook is growing too powerful. A platform with the power to influence political elections, amplify hateful comments, and spread propagandistic material could be seen as alarming. Facebook has revolutionized the way we receive information, and because many of the posts your Facebook friends make are subjective, separating truth from fiction takes on a unique twist.
We have a tendency to trust technology without much hesitation. We are all about technological advances because they push our world forward, save us time (supposedly), and make us more efficient. We can look up information in an instant and share it on social media platforms with our Facebook friends or Instagram followers who may also be interested in the content.
I'm currently reading a book by Nicholas Carr titled The Glass Cage: How Computers Are Changing Us. In it, Carr talks about how our growing reliance on computers is making us disengaged, discontented, and much lazier than we used to be. We are beginning to lose essential cognitive abilities that develop through the very skills computers are now replacing. Despite all the great things technology has to offer, we are sacrificing a great deal without even realizing it.
This is part of the reason Facebook has caused such a stir recently. Our blind trust in what we read online influences our decision-making and how we act toward one another. Plenty of articles out there claim to be true but are far from it. Yet post one on social media with a compelling, captivating message of your own, and you may have several of your friends eating it up.
Facebook is also the land of bias. Because of who we are friends with, many of us see posts and articles that only align with our own political, religious, or other beliefs. Since anyone can post anything, Facebook becomes an echo chamber where we read only what we want to read and believe only what we want to believe. Sure, your racist uncle may post something awful, or your politically active friend may post way too much about a cause you don't agree with, but overall what we find are posts that align with our beliefs.
This is a difficult problem because we believe what we want to believe. We often refuse to hear another viewpoint, even when the facts support it. Facebook, in many ways, brings people together, but it also divides people and can generate hatred where none is necessary.
Here's hoping Mr. Zuckerberg is able to fix some of these problems this year, but it's going to be immensely difficult. Sharing ideas, beliefs, and important articles is wired into the framework of Facebook. It's a free-speech platform for the online world. But how does one separate articles that are false and filled with propaganda from legitimate news sources? How does one regulate fake Facebook accounts created for unjust purposes? How can we allow peaceful events to be organized through Facebook while shutting down events that fuel hatred and discrimination? These are big questions and huge problems to tackle. The biggest question, however, may be whether Facebook has grown to the point where it is beyond even Mark Zuckerberg's control.
What do you guys think? Do you think there is a way to fix Facebook’s issues? How so? Comment below!