
1

Staying Safe Behind The Screen:


Social Media Management in
Terms of Safety
Emily Cotrufello, Mahi Mallina, Gianna Martinelli, Casimir Morgan, Sara Plant, Gavin
Robinson, Luke Taylor-Storm, Rosi Tsarnakova

Cover image: (Kaur and Mook)



Table of Contents

Introduction/Issue Overview
  Desirable Outcome
  Conditions Currently
  Obstacles
  Stakeholders
  Framing Questions

Approaches
  Approach 1: The United States Government regulates social media use for all citizens.
  Approach 2: Individuals/Families regulate their own social media use.
  Approach 3: Social Media Companies regulate their own platforms.

Conclusion
  Summary

Works Cited

Deliberation Questions

Introduction/Issue Overview
How should we approach social media regulation with regard to safety,
specifically considering personal safety, mental health, and data security?
Social media is supposed to connect us, bring us together, and expose us to new things.
However, recent findings demonstrate a correlation between social media use and declining
mental health, and a growing number of studies suggest social media can be addictive, so it
is clear that some action needs to be taken.

But who should take this action, what should it entail, and what consequences could come from
this new control?

The problem we are addressing is who should be in charge of monitoring and
regulating social media usage and security, and how. Some argue that the
government, specifically in the United States, should provide rules and regulations as to how,
when, and where social media platforms such as TikTok are used, to ensure the safety and
security of the nation and its citizens. Others consider that individuals and family units should
be allowed to self-regulate their own usage and safety when it comes to social media, supporting
their personal freedoms. Yet another approach is for the companies themselves, including TikTok
and Meta, to ensure that their platforms are safe for all, protecting users' data as well as their
mental health with certain regulations or limitations.

Desirable Outcome
The ultimate desirable outcome is for all to be able to engage in individual freedoms,
such as the usage of social media, while also having a guarantee that they and their
private information will remain protected. Social media is a form of self-expression for
countless individuals. It is also a way to market oneself, presenting portfolios and connecting
with others in the field. Perhaps most importantly, it is a way to stay in touch with old friends
and to make new ones from around the world.

At the same time, social media can be dangerous due to the addictive nature of the platforms
(Miller). Studies connect increased screen time with feelings of depression (Mir et al.), and with
no limits on the amount of time spent on these platforms, there can be serious mental health
impacts. Higher screen time is associated with moderate to severe depression (Madhav et al.).
Adolescents are especially affected, particularly through “exposure to idealized images” that can
lower self-esteem and contribute to depression (Boers et al.).

Figure 1: Relationship between social media use and depression (Therrien and Wakefield).

In a perfect world, screen time could be managed so that using social media poses no detriment
to one’s mental health.

Furthermore, social media should be safe and accessible for all. Hate speech runs rampant
across many social media platforms, slipping through the cracks of different applications’
security measures (United Nations). This can lead to real-world violence against marginalized
groups, spurring physical attacks or even murders (Laub). Social media should be a
welcoming and inclusive space, rather than one of division and hate.

Conditions Currently
The United States Department of Health and Human Services (HHS) has a
variety of policies regarding social media to guide usage and ensure accessibility (Digital
Communications Division). Online content, whether on a website or a social media application,
must be accessible to people with disabilities. HHS also suggests that comments be
moderated to prevent discriminatory, explicit, or unlawful content from being shared – unlike
the accessibility requirement, however, this is not a law but an encouraged practice.

The Federal Communications Commission, an independent agency of the United States
federal government, “regulates interstate and international communications by radio, television,
wire, satellite, and cable in all 50 states, the District of Columbia and U.S. territories” – social
media and the internet are notably absent (The Federal Communications Commission).

Social media safety is often deliberated in the court system, both in local districts and the
Supreme Court.

Most frequently, litigation tied to social media centers on the Digital Millennium Copyright
Act, the Communications Decency Act, and defamation and privacy (Winston and Strawn LLP).

➔ The Digital Millennium Copyright Act (1998) (U.S. Copyright Office)
◆ Established protections for providers of online services when users infringe copyright
● Copyright owners can notify providers of infringement so the material is taken down
◆ Encouraged copyright owners to provide wider digital access to their works
● Legal protections against hacking and bypassing encryption
◆ Made false copyright management information unlawful
➔ Communications Decency Act (1996) (Congress.gov a)
◆ Originally aimed to regulate pornographic material on the internet
◆ Evolved to regulate indecency when available to children, and obscenity online
◆ Rules that operators of internet services are not publishers
● Not liable for the third parties that use their services
➔ Defamation
◆ “A statement that injures a third party’s reputation…defamation includes both
libel (written statements) and slander (spoken statements)” (Wex Definitions
Team a).
◆ Legally difficult, as the standard is vague
● Opinion versus fact: only legally arguable if a “fact” is being stated
● Conflicts with the First Amendment freedoms of speech and press
● Only partially equipped to handle hate speech
➔ Privacy
◆ The Fourteenth Amendment (Congress.gov b) – among other amendments –
guarantees an individual right to privacy (Wex Definitions Team b)
● The right to make one's own choices and act without government interference
◆ As a result, social media has remained largely unregulated by the government
(Earnest)

Obstacles
The United States was founded on the principles of life, liberty, and the pursuit of
happiness, and those values have extended over time into the twenty-seven Constitutional
Amendments we have at present (Congress of the United States of America). The First
Amendment, which states that “Congress shall make no law…abridging the freedom of speech”
(United States Courts), sets a strong presumption against government intervention in social
media regulation.

Below, Table 1 considers the current conditions listed above and the obstacles regulation of
social media faces.

Table 1: Conditions and obstacles relating to social media regulation

Current condition: HHS policies guide usage and make accessibility mandatory for social media;
online content MUST be accessible to people with disabilities.
Obstacle: Comment moderation to prevent hate speech and explicit or illegal content is only
encouraged; it is not a law.

Current condition: The Federal Communications Commission regulates radio, television, wire,
satellite, and cable across the states, the District of Columbia, and U.S. territories.
Obstacle: Social media and the internet are absent from the FCC’s regulation.

Current condition: Defamation, which includes libel and slander, is prosecutable when a
statement injures a third party’s reputation.
Obstacle: Defamation law is vague, so it is difficult to prosecute. It is only legally arguable
if a “fact” is being stated – opinions are protected by the First Amendment freedoms of speech
and press. Handling hate speech is relatively new and not yet refined in the courts.

Current condition: Privacy is guaranteed by the Fourteenth Amendment and earlier amendments,
allowing individuals to make their own choices and act without government interference.
Obstacle: Social media has remained largely unregulated by the government because of this
guarantee.

The First Amendment is incredibly important when it comes to social media safety, restriction,
and regulation.

A recent example, and court precedent, is Elonis v. United States (2015), in which the Supreme
Court ruled that Elonis had been “improperly convicted of transmitting threats through postings
on Facebook” (Vile). The First Amendment does not protect “true” threats – statements a
reasonable person would interpret as a genuine threat of real harm to a real person.

Because Elonis was posting rap lyrics, the Supreme Court reversed the decisions of the lower
courts, overturning the conviction. This case has set a precedent for how hate speech and
online threats are treated by the legal system.

Stakeholders
Though we are specifically focusing on how the United States is impacted, this is a global
issue with far-reaching implications. There are three main stakeholders:
the social media companies, social media users, and the government, which has the
ability to regulate social media. Of course, all citizens are concerned, so parents of children
who use social media are stakeholders even if they do not use social media themselves.

Not only is everyone with any form of social media impacted, but those without social media
still face the ramifications of social media usage and its importance in communities,
nations, and the international community. A large share of the stake in this issue is held by
individuals, and especially by children and younger generations, for whom social media can
shape the future (in positive and negative ways). Those in the audience, all college students,
are certainly impacted by social media, even in the rare case that they do not use it.

Framing Questions:
1. Who should control social media usage?
2. What measures should be taken to prevent hate speech, if any?
3. What should be done to protect users’ mental health?

Approaches

Approach 1: The United States Government regulates social media use for all citizens.
The negative impacts of excessive social media use, especially on American youth and young
adults, are apparent. These impacts encompass an individual's privacy, their mental
health, and media addiction, which affects many teens to varying degrees. Under this
approach, the United States government would recognize the detrimental effects of social media
on the American public and make strides to limit the public's use of it. While this may seem to
encroach on traditional American freedoms, it may also be necessary, as social media plays an
integral role in society and there is research supporting its harmful effects.

Problem
As far back as 2016, 90% of all college students globally reported using some form of social
media (Braghieri et al. 1). Some of the harmful effects of excessive media use include a decreased
attention span, specifically in an educational environment, a lack of security with regard to
personal information, and increased rates of mental health issues in youth.

In practice, certain states have already implemented or proposed bills to regulate social media
in order to foster a safer environment for children under the age of 18; for instance, Utah has
proposed age verification, and the federal Kids Online Safety Act has been introduced. However,
there is still more to be done, especially since there are existing threats to national
security on social media platforms. These are the reasons it may be effective for
the government to regulate media usage on a national scale.

The first reason government regulation may be important concerns the violation of an
individual's privacy if social media were to remain unregulated in the United States. Whenever
an individual first downloads a social media app, they must agree to the organization's
terms and requirements, or they cannot use it. Many of us brush over these conditions in our
excitement to use the new social media platform that has been all the rage lately.

However, according to LegalZoom, a business startup informational website, accepting a
terms-of-service contract "requires 'consideration,' in that both sides need to give something up
of real value." Because social media is free, the payment a user offers is usually
allowing the platform to access their site usage with the goal of improving the
application's analytics (Peterson). Many companies either sell or share the data they receive
about individual users (such as one's name, email address, and the type of content they like)
with third-party marketers, with the hope that other organizations can benefit from that same
user (Knowledge at Wharton Staff). Furthermore, Sebastian Angel, a professor of computer and
information science at the University of Pennsylvania, says "there's no real way to opt out" of
this privacy-breaching problem (Knowledge at Wharton Staff). So, might government regulation be
necessary to prevent data collection that encroaches on an individual's privacy?

Social media can also have a negative impact on the mental health of the average American teen.
The American Economic Review states that in 2021, 4.3 billion people worldwide
had social media accounts – over half of the world's population (Braghieri et al.
3660). Along with the rise of social media since the early 2000s, the number of adolescents and
college students who struggle with mental health issues such as depression, anxiety, and suicidal
tendencies has also increased (Braghieri et al. 3661). While there is no direct data linking social
media usage to mental health issues, many scientists theorize that the rise of social media
platforms has played a major role in this phenomenon (Braghieri et al. 3661). Additionally, there
is a clear link between social media usage and body image issues, as young people
constantly see photoshopped images of men and women who fit the unrealistic beauty standard
in the United States (Fleps). These mental health issues and safety concerns may support the
approach of regulating social media usage in the United States.

Furthermore, have you checked your screen time lately? Many teens avoid checking because the
amount of time they spend on their phones and on social media is alarming. Psychologists
recently estimated that 5-10% of Americans qualify as addicted to social media today
(AddictionCenter). And while many people are not severely addicted, they exhibit some addiction
traits that can prevent them from being productive in other areas of their lives. For example,
excessive screen time can prevent teens from getting enough sleep, keep them from focusing in
their classes, and even distract them from conversations with friends. Social media addiction
can even be compared to drug addiction: when a drug addict uses, their dopamine levels rise.
AddictionCenter says, "This is observable in social media usage; when an individual gets a
notification, such as a like or mention, the brain receives a rush of dopamine and sends it along
reward pathways, causing the individual to feel pleasure" (AddictionCenter). The extent to which
Americans rely on social media for happiness is alarming, so there may be weight in an approach
that restricts media usage.

Figure 2: A pie chart depicting how difficult polled adolescents believe it would be to give up social
media (Vogels et al.).

However, social media doesn’t just affect individuals. Defending the nation from domestic and
international attacks has been one of the US’s top priorities for centuries, forming the basis
for federal entities such as the CIA, FBI, NSA, and Homeland Security. As such, there are
already many safeguards against perceived risks, even when they conflict with certain rights
citizens hold. For instance, the Patriot Act, enacted after the September 11th attacks, gives
law enforcement expanded surveillance abilities, such as tapping phones, in the fight against
terrorism. The act has had widespread impact in mitigating terrorist threats: there have been
more than 200 terrorism convictions under the act and the “disruption of over 150
terrorist cells” (Robinson 4). Social media has revolutionized the way we share information;
however, amid the cat videos and prom pictures, international terrorist groups such as ISIS
use social media to recruit potential members and communicate about attacks. In fact, there
has been a rise in extremist content, with almost 90% of extremists radicalized
with the help of Twitter, YouTube and Meta in 2016 (Jensen et al.). This content usually
portrays the groups as “cool saviors” and glamorizes fighting alongside them, influencing
impressionable young minds (Awan 138). Not only is this harmful to the individual’s psyche, but
also to the nation as a whole.

Figure 3: A figure displaying the increase in social media use by extremists in the United States (Jensen et al.).

Current/Possible Solutions
The idea of the government regulating social media is not new; in recent years, there have been
countless attempts at passing legislation, with both successes and failures. The public has also
shown rising support, with nearly 50% of U.S. adults supporting regulation and 61%
believing social media companies do not do enough when left alone (Teale).

Figure 4: Poll regarding support for government regulation (Teale).

What could we do to make social media more nurturing and safe for people of all ages? We want
to look at government regulation with three specific focuses: limiting screen time, enforcing
stricter age restrictions, and monitoring for content deemed to threaten national security.

Screen Time
Since the large majority of children spend excessive amounts of time on social media, which can
be detrimental to their mental and physical health, we propose a law that would set
national limits on how long children under the age of 18 can spend on devices. One
idea that would form the basis of our proposal is Taiwan’s Child and Youth Welfare and
Protection Act, passed in 2015 to promote the welfare of Taiwanese youth. Essentially, the
legislation treats hours spent on social media and online games as being as damaging as drinking,
smoking, and doing drugs (Locker). Parents who allow their children to use devices for an
unhealthy amount of time, to the point that they become “mentally or physically” ill, could face
a fine of up to 1,600 USD from the Taiwanese government (Ministry of Health and Welfare).
However, the law is vague: it does not define what an “unhealthy amount” means, or how the
government plans to enforce it nationally when it is up to parents to monitor their children.

Although it may be difficult to enforce these rules at home, it may be beneficial to create
government regulations for federally funded institutions like public schools. In 2021, Minnesota
passed legislation restricting children’s access to “individual-use screens” in publicly funded
preschool/kindergarten programs unless required by a teacher’s lesson plan or a pre-existing
504 plan (Department of Education). Olivia Christensen, an education specialist, believes that
this law is “proactive” and a “preventative measure” to mitigate addiction and allow students to
gain better cognitive and interpersonal skills without a screen between them (Davis). Some
schools, such as the Rochester Public Schools, have already discontinued 1-to-1 iPads for
students in order to fund more practical, experience-based education (Davis).

Age Restrictions
Another crucial component of regulating social media would be tightening the rules on who can
access certain content or create accounts. With age restrictions, the government could help
protect the nation’s vulnerable from growing threats of crimes against children and foster a
healthier environment.

Utah’s State Senate bill 152 could be the start of a nationwide effort to verify that
children are truly the age they claim to be. Introduced by Senator Mike McKell, the bill
would require social media companies to obtain a parent’s consent before children under the age
of 18 can open an account, with harsh fines for noncompliance (McKell). If passed, there could
also be more restrictions on the accounts, such as controlling who minors can direct message and
what ads they can see, and giving parents/guardians access to the minors’ activity on the
accounts (Cabrera).

There are already similar ID verification systems in place, with “millions of people using it for
dating, gambling and pharmaceuticals,” according to McKell (Cabrera). This would mean low
upfront costs for both the companies and the government when implementing such software in
social media. Emily Daly, the legislative director of Utah Parents United, believes that instead
of federally issued identification such as birth certificates, Social Security numbers, and
passports, the bill could also establish state identification numbers; with this approach, we
could limit the threat of third-party data breaches stealing crucial information and thereby
protect children’s identities (Cabrera).
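The parental-consent age gate described above can be sketched in code. This is a minimal, hypothetical illustration, not the text of any bill: the names (`SignupRequest`, `can_open_account`), the stricter-defaults dictionary, and the idea that verified age arrives from some external ID service are all assumptions made for the example.

```python
from dataclasses import dataclass

# Threshold from the Utah proposal described above:
# users under 18 need parental consent.
ADULT_AGE = 18

@dataclass
class SignupRequest:
    """A hypothetical signup request after third-party ID verification."""
    verified_age: int           # age confirmed by an ID-verification service
    has_parental_consent: bool  # consent recorded from a parent or guardian

def can_open_account(req: SignupRequest) -> bool:
    """Adults may sign up freely; minors need recorded parental consent."""
    return req.verified_age >= ADULT_AGE or req.has_parental_consent

def default_settings(req: SignupRequest) -> dict:
    """Minors' accounts would start with stricter privacy defaults,
    in the spirit of the restrictions discussed above."""
    minor = req.verified_age < ADULT_AGE
    return {
        "private_account": minor,
        "direct_messages": "contacts_only" if minor else "everyone",
        "personalized_ads": not minor,
    }
```

The key design point the sketch makes concrete is that age verification and consent are separate checks: a platform could verify age yet still refuse the account until consent is recorded, and the verified age also drives which default settings apply.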

Another viable solution that has already made its way to Congress is the Kids Online Safety
Act, introduced by Senators Richard Blumenthal and Marsha Blackburn. Similar to
Minnesota's legislation, the bill would hold social media companies accountable for exposing
children to disturbing explicit content (violence, self-harm, pornography, etc.). The companies
would be required to conduct annual internal audits to ensure that they are taking strides to
reduce harm and create a safe environment for children (Blumenthal). However, the act goes even
further, mandating that the data behind social media algorithms be examined with regard to
safety and children’s well-being (Walters 8).

Content Monitoring
Despite arguments that such acts breach civil liberties, the US government has started
discussions about whether it would be beneficial to create regulations around censorship and
government monitoring of social media in order to protect national safety. As of right now,
there are no such regulations; the government relies on social media surveillance instead. It
can flag content that is “problematic” and request that the company take the post or account
down, but it does not have ultimate power (Brown and Peters 527). However, with more and more
threats, lawmakers and politicians argue for increasing censorship of “hostile” content and
limiting the “dissemination of information at the root” (Fariss and Lo 671).

Similarly, there have been rising concerns over adversaries spying on the US through social media
data. Recently, TikTok has been at the center of controversy. Since it is owned by the Chinese
company ByteDance, U.S. user data is vulnerable: companies based in China are required
by law to hand over user information if the government requests it (Bhuyian). Experts believe
this creates the threat of the collected information being used for intelligence and military
purposes (De Le Santos and Klug 227). The United States is not the only country worried; India
banned the app in 2020 in the interest of the country's "sovereignty and integrity" during a
border dispute between the two countries (Mishra et al. 815). Nationally, this fear has already
led to the ban of the short-form video app on federally owned devices, complete bans of all
Chinese-owned apps in some states, and the introduction of legislation banning TikTok for
consumers entirely (Morrison).

Implementation of Regulations
There are many key regulations that could make social media a safer place for citizens and the
nation as a whole. Nonetheless, this would not be quick or easy; legislation could take
anywhere from days to years to pass, depending on Congress’ wishes, if it passes at all. Even
once regulations are passed, it will take tremendous amounts of money to implement them in
society.

However, there have also been discussions about whether such laws would actually make an
impact. Most of the proposed solutions are based on monitoring and controlling the actions we
take on social media (limiting the time we spend on it and the range of information we can
access or publish). This comes with its own costs, especially the potential violation of
citizens' rights. Many regulations aimed at restricting content have been blocked
by the Supreme Court, citing the First Amendment (Brown and Peters 532). Other
arguments include whether the government should have such power over its citizens, the
extent to which it should be involved in private lives, and whether content censorship could be
biased toward a certain political or ideological outlook.

Conclusion
While social media use is increasingly common in the modern age, there are clearly certain
drawbacks to American citizens using it excessively. The major problems which may indicate a
need for governmental regulation of social media include breaches of privacy which can occur
with social media use, negative impacts on an individual's mental health due to excessive time
on social media, and the risk of media addiction or traits of social media addiction. Certain ways
in which government regulation may be applied to this situation include regulating the time
spent on social media, creating stricter age verification processes for underage children, and
increasing censorship of problematic content that threatens national security. Excessive social
media use is a problem that is incredibly difficult to escape in the modern world. It is
extremely uncommon for someone to have no social media whatsoever, and the world is
becoming increasingly online. Therefore, it may be necessary for governments to regulate
social media, as it is nearly impossible to escape at this point in time.

Table 2: Possible actions for approach 1

Action: Regulate the amount of time spent on social media with laws that target screen time,
through parental intervention and in public schools across the country.
Benefits: Increased focus on face-to-face interpersonal, communication, and cognitive skills;
practical, experience-based curricula in schools; decreased rates of media addiction; better
physical and mental health.
Drawbacks: Hard to enforce in households, where it is the parents’ duty and the government has
less control; vague standard for what “excessive” usage means.

Action: Create stricter standards for age verification, with ID and parental consent required
before opening an account; accounts of those under 18 would have more privacy settings and
restrictions in place.
Benefits: Protects young users from explicit or age-inappropriate content and interactions
through enhanced mandated settings; low upfront cost to implement, as age-verification
technology already exists.
Drawbacks: Depending on the extent of restrictions, minors lose the chance to build critical
social media skills such as risk awareness and socialization; parents and guardians could gain
too much control over minors’ lives; potentially harmful for those in unstable conditions
(abuse, LGBTQ+ youth, etc.), who would have limited access to resources and could not reach out
for help without adults knowing; risk of personal data leaks if state- or federally issued IDs
are used in the verification process.

Action: Increase censorship of problematic content that can threaten national security
(terrorism, hate speech, etc.), and ban apps based in countries that pose a threat to the
country’s national security.
Benefits: Prevents the brainwashing of impressionable young minds with extremist content;
prevents the spread of false information; reduces the threat of imminent physical and data
attacks and safeguards the security of the US.
Drawbacks: Infringes on First Amendment rights; censorship could be politically or ideologically
charged; the government intervenes too much in the daily lives of citizens.

Approach 2: Individuals/Families regulate their own social media use.

Problem
Whether it’s to catch up with friends, see what is going on in the world, or mindlessly
scroll, everyone uses social media for various purposes. As a matter of fact, in 2021, global
social media usage averaged around 147 minutes per day, up from a 145-minute average the year
before (Dixon). However, a global problem persists: there is no regulation of social media.
The question of who should regulate social media usage, in all its aspects, therefore remains
open.

Figure 5: Global average time (minutes) spent on social media (Dixon).

Data Security and Privacy

When it comes to data security and media usage, the current conditions are shocking. A 2017
survey conducted by Deloitte found that 91% of individuals agree to terms of service for an
app or software without “reading the fine print”; among 18-34-year-olds, the figure rises to
97% (Guynn). The consequences are significant, because in disregarding these lengthy terms,
apps, especially social media platforms, have free rein to use, share, and store personal and
shared information. If someone uses a platform to share only general content, excluding personal
information such as interests or location, these effects are less detrimental. However, the more
comfortable an individual becomes with a particular platform, the more personal their shared
content becomes, posing a real danger to safety.

Content
Another aspect of regulation concerns the content of social media. Currently, Section
230 of the Communications Decency Act of 1996 enables social platforms to remove users or
content that is graphic, constitutes hate speech, or references child exploitation (Vanegas).
However, there is no delineated standard for what makes a post “bad”, nor any mandated
method of content moderation. This enables users to share and curate content that may be
inappropriate for some viewers, especially younger ones. While preventative measures like
reporting features allow users to flag a post as inappropriate or uninteresting, the extent
to which these reports are acted upon varies (O’Dell). Additionally, the interconnectedness
of social media applications and sites enables such content to be shared across various
mediums, reaching even more viewers.
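The user-report mechanism mentioned above can be sketched as a simple threshold rule: reports accumulate per post, and a post is queued for human review once enough arrive. This is a hypothetical illustration of the general pattern; the class name, the threshold value, and the queue structure are all assumptions, not how any real platform implements moderation.

```python
from collections import Counter

REVIEW_THRESHOLD = 3  # hypothetical report count that triggers human review

class ReportQueue:
    """Hypothetical sketch of a user-report pipeline: posts accumulate
    reports, and once a post crosses the threshold it is queued for
    human review."""

    def __init__(self, threshold: int = REVIEW_THRESHOLD):
        self.threshold = threshold
        self.reports = Counter()   # post_id -> number of reports so far
        self.review_queue = []     # post_ids awaiting human review

    def report(self, post_id: str) -> None:
        """Record one user report; enqueue the post the first time it
        reaches the threshold (later reports do not re-enqueue it)."""
        self.reports[post_id] += 1
        if self.reports[post_id] == self.threshold:
            self.review_queue.append(post_id)
```

Even in this toy form, the sketch shows why "the extent to which these reports are acted upon varies": the outcome depends entirely on where the platform sets the threshold and on what the human reviewers decide, neither of which is standardized across platforms.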

Implementation
This approach relies on individual control (rather than national or third-party control) and
centers the values of individuality, self-respect, and freedom. Since social media is used by
individuals of all ages, the approach can be divided between two main groups: children (those
under 18 years of age) and adults (those 18 and older).

With regard to children, either self-control or parental regulation could be used. Where
parents are involved, this would mean restrictions on certain platforms, time spent online,
and content or information posted (if applicable). When it comes to younger children
self-regulating their media usage, self-discipline comes into play once again. However,
developing a self-regimen requires supplementation from external sources, such as informative
websites or role-model figures. Once a “good practice” model is created for children to follow,
they can ultimately regulate their screen time and social media usage as a whole.

Advantages
Approaching self-control in adults follows a similar outline to that in children. Generally, adults
are more accustomed to modern preventative measures, like screen time limitations (where the
device locks once the limit is reached), and better understand what information is appropriate to
share in terms of data security. Self-education and more resources on the negative consequences
of excessive social media usage can further strengthen self-discipline (Udorie). As previously
stated, this approach supports the freedom of the individual and the right to make one's own
choices regarding physical and mental well-being. This freedom exists in almost every other
area of health, including choices surrounding exercise, addiction, eating habits, and seeking
mental health assistance. Ultimately, it is the responsibility of individuals and families to
follow the same habits with screen usage and use self-regulation.

If governments or the businesses themselves were to monitor media usage, people may
feel as though a right had been taken away from them. Not everyone needs the same amount of
monitoring or screen time limitation. The individual-monitoring approach allows

each person to customize a goal that is best for them and must hold themselves accountable for
this ideal.

Because individuals and families set their own goals, this approach can establish a grounded
practice of self-control. Although this has proven rather difficult for the newest generations, we
already have some tools to help, such as parental controls and screen time limits on individual
apps. It is also observed that parents' use of screens has a direct effect on the screen usage of
their children. In an article on the role of parents in their young children’s screen time, it was
found that “parent screen time is the strongest predictor of child screen time” (Lauricella).

Figure 7: Screen time for children over different devices based on the perceived parent screen time for A)
low, B) medium, and C) high actual screen time (Lauricella).

This approach may be effective by targeting the issue from the most direct source: the family.
Part of this view could focus on parents setting a positive example for their children by
responsibly using technology and limiting time spent on media. With this being said, children
raised in families who are aware of the negative outcomes of media overuse would be better
equipped to combat this issue and set a strong example for themselves and their future children.
This approach may also be more beneficial to those with above-average self-control and the
ability to self-regulate.

Drawbacks
Although the ability to choose one's own route for screen usage could be considered a great
benefit of this approach, there are also certain drawbacks. As witnessed in today's society,
self-control is a great challenge when it comes to limiting screen usage and the negative effects
that come with it. An article by the Mayo Clinic notes that overusing social media can lead to a
lack of sleep, less exercise, damage to social life, and countless other side effects (Mayo Clinic
Staff). The algorithms and tactics used by media companies only worsen the problem, as they
curate media perfectly tailored to the user without end in sight. Regardless of how strong-willed
the user is, media has been found to be highly addictive and its pull is difficult to resist. There
is also the possibility of a lack of education on the topic, as not everyone is aware of the
negative consequences that come with excessive screen time. Schools and other programs
directed at youths aren't required to educate about this issue and, in a way, encourage the
addiction, as most schools have incorporated more technology over the years. Many have
provided laptops and iPads to their classes, giving students even more access to media and
building a technological necessity into their everyday routines.

Even those who are educated on screen use issues may not consider this issue worth resolving.
They may accept the issue for what it is and lack the motivation to make a change. This
approach may therefore prove challenging for those who are uneducated on the issue or who
have low self-control.

Table 3: Possible actions for approach 2

Action Taken: Parents or families regulate young children’s screen time and data privacy
● Benefits: Relies on the role-model behavior of parents; content limitation based on parental discretion
● Drawbacks: Parents may not set a good example for their children; uninvolved parents may not be concerned with the content their child views

Action Taken: Children regulate their own screen time and data security
● Benefits: Builds a sense of independence and establishes freedom; allows children to learn from mistakes through trial and error
● Drawbacks: Children may struggle with self-control; may not be educated on the negative side effects of excessive screen use

Action Taken: Adults regulate their own screen time and data security
● Benefits: Greater understanding and implementation of preventative measures (i.e., screen time locks); increased education on media usage safety
● Drawbacks: May still struggle with self-control and proper limitation; may disregard or not prioritize information on the issue

Approach 3: Social Media Companies regulate their own platforms.

Problem
When discussing the issues regarding social media regulation and safety, could the
responsibility lie with the companies themselves? If every individual company had control over
guidelines, screen time, age limits, and privacy settings, it could potentially improve the safety
of its user base. This “small government” approach has real potential to increase the safety
and privacy of users by removing the role of the government in social media. If the
government were to stay as involved as it is, user data would not reliably be kept private.
Right now, there is no real monetary incentive for companies to remove and delete harmful
content (Cusumano, Yoffie, and Gawer).

Figure 8: Entertainment Media Landscape (Amor).



To counteract this, policies could be put in place requiring social media companies to strictly
abide by their individual terms of service. For example, such regulations would make Facebook
liable for its recent failures to enforce its own terms of service. This solution would still include
a small role for the government, which would have to enforce these guidelines and properly
punish companies that do not abide by their own policies. History suggests why this could be an
effective method. Movies, video games, and television shows all have content that needs
regulation. In the 1950s and 1960s, there was a real threat of government regulation of the
appropriateness of the content of this type of media. To keep the government out, these
entertainment industries created self-regulated rating systems (Cusumano, Gawer, and Yoffie).
By doing so, they were able to head off government regulation before it even became a problem.

Figure 9: ESRB Rating System (Hall).

Solutions
A big factor that would positively contribute to the implementation of self regulation among
social media companies is the removal/revision of Section 230. Section 230 is a law passed
in 1996 that states, "No provider or user of an interactive computer service shall be treated as
the publisher or speaker of any information provided by another information content
provider"(Electronic Frontier Foundation). Section 230 protects big companies from
being liable for any hate speech or harmful content posted on their site. While this
makes sense to have put in place, companies should be responsible for identifying any money
gained from advertisements and engagement tied to any viral posts and have that money
removed from their profits as fines. A problem with Section 230 is the “good samaritan”
exception. Platforms can remove content that they see as harmful as long as it is done in good
faith, allowing for a lot of bias when companies are deciding what is harmful or appropriate.
More intense self-regulation might lead to better trust among a user base.

This is a potential fix, and some companies such as Facebook have attempted to implement
policies that further demonstrate their efforts to remove harmful content. A notable recent
policy is the fact-checking system that flags certain posts as unreliable or untrue (Meta
Business Help Center). This is a big step in the right direction because it shows how
companies can once again create policies that deter government regulation before it is
even needed.

When thinking about combating the negative effects of social media, it helps to go back to the
source of the issue. Having individual companies self-regulate their content, restrictions, age
limits, and data privacy could help curb excessive use of media platforms and reduce time spent
endlessly scrolling. However, this approach has potential pitfalls, as social media companies
may have different values than individuals or the government.

This approach would call for companies to add features such as mandatory screen-time limits,
unskippable break reminders, revised data security policies, simpler terms of service, and
AI-powered content regulation. Each of these features has been successfully implemented in
other fields, so transitioning them to this medium should be fairly straightforward. Individual
companies are best prepared to evaluate and implement each solution for their particular user
base, allowing for a quick and efficient transition to new policies.

Control of regulation by individual corporations may appeal to those who value small or limited
government and reduced government involvement in the economy. Company-based regulation
may also appeal to those who value privacy, as government involvement in social media
platforms would require an analysis of user data to impose proper restrictions.

Benefits
Implementing this approach would come with benefits and drawbacks for several parties. To
start, companies would retain control of their own platforms, benefiting those who place
value on a small/limited government. Implementation of this approach would be
smooth, efficient, and straightforward, as individual companies are best prepared to
examine and revise their own policies, compared to lawmakers who may be unfamiliar with the
specifics of the platforms (Adler; Sung).

Figure 10: Digital Trust vs. Company Growth Rate (McKinsey).



Users would directly benefit from this strategy by feeling better about the platforms they trust
with their data and by being reassured that companies have their interests in mind, rather
than solely making decisions for monetary gain. Gaining the trust of a user base is also
beneficial for companies, as leaders in digital trust are far more likely to see annual revenue
increases (McKinsey). Over time, these increases add up, establishing a larger user base
and greatly increased profits. Additionally, companies would not have to worry as much about
the potential hassle that government involvement and regulations would bring to their
platforms, including lawsuits and fines, if they take preemptive action to reassure government
officials (Barnes).

Figure 11: Twitter Quarterly Earnings (Statista)

Drawbacks
This approach also has drawbacks for the same parties that would potentially benefit. While
companies may gain popular support resulting in long-term earnings, their short-term profits
might suffer if their decisions do not align with the wishes of their funders. Unpopular
decisions may cause advertisers to cut spending, as seen in the recent advertiser exodus from
Twitter (Duffy). Advertisers spend billions on social media advertising each year, and it makes
up a third of all digital advertising (Hootsuite). If rates of return plummet, they may shift their
focus elsewhere (Twitter Marketing). As a result, social media companies might refuse to
increase their regulatory efforts for fear of losing revenue unless supported and pressured by
the public. Creators and influencers who make a living on social media platforms may also be
negatively affected by this approach. Imposing platform-wide rules to lower the amount of time
that users spend on an app means less time for users to interact with content and
advertisements, resulting in lower earnings for creators.

This approach may also have a negative effect on some users. In order for companies to
regulate content, they need to make decisions about what they deem acceptable on their
platforms. These decisions might go against the views of some users, resulting in a more
negative experience on the platform. Inconsistencies among platforms may also make it hard
for users to understand what is and is not okay to post (e.g., former President Trump was
banned on Twitter but not on Truth Social). Companies would also have to decide what counts
as misinformation. Again, there may be inconsistencies across platforms, resulting in confusing
and conflicting information.

However, there are ways for social media platforms to combat these drawbacks. For example,
companies could form a Self-Regulatory Organization amongst themselves to reassure
advertisers and provide standards and enforcement across multiple platforms (Investopedia).
While this has the potential to allow for a more standardized industry, it is possible that
companies would choose not to join this organization which could create its own set of issues.

Table 4: Possible actions for approach 3

Action Taken: Self-Regulatory Organization
● Benefits: Provides consistent standards and enforcement across platforms
● Drawbacks: May result in further conflict and issues if companies choose not to join

Action Taken: Section 230 Revision
● Benefits: Could hold companies responsible for revenue based off of harmful content; can still protect platforms and user bases if revised properly; greater accountability for issues caused by social media (exploitation, false info, etc.); further incentivizes regulation
● Drawbacks: Revision could allow companies to remove content based on bias; less protection for companies; more lawsuits; reduced protection for smaller platforms with little knowledge/protection; could undermine the idea of free speech

Action Taken: Individual Regulation
● Benefits: Better for business; stricter enforcement of terms and conditions; companies could tailor their regulations to the purpose of their platform; more privacy
● Drawbacks: Regulations would become competitive; inconsistency; profits are priority

Conclusion
Summary

While social media is supposed to bring us together, it also does much more, and not all of it is
positive. Many studies suggest social media can become addictive, and it has been linked with
mental health decline in many users (McLean Hospital). While social media companies have
been the subject of many news headlines and a few congressional hearings, the downsides of
social media focused on in this deliberation have received far less attention. To combat these
potential consequences of social media use, we highlight three potential approaches.

Approach 1 focuses on government regulation of social media. The government has mostly
kept its hands out of social media for many reasons, but the restrictions and protections it could
legislate would have huge impacts on social media users’ mental health. However, this
legislation may encroach on personal freedoms and could prove difficult to enforce equitably.
Approach 2 instead values personal freedoms, allowing individuals or families to regulate their
own social media use. It emphasizes looking after your own needs, moderating your own social
media content (not paying attention to potentially harmful posts), and looking out for your own
safety. The potential drawback is that individuals may not be able to regulate their own social
media use, especially given social media’s addictive nature. Approach 3 contends that
companies should regulate themselves and their own platforms. Because these companies
design the platforms, they should be best able to address the negative effects of social media.
The largest downside of this approach is that there is no guarantee social media companies will
act with users’ best interests in mind; they could potentially put profit above user health and
safety.

Each approach has a different set of values, but all have their benefits and drawbacks. While all
work toward neutralizing social media’s negative effects, deciding on the best approach comes
down to analyzing not only the benefits and drawbacks but also the values associated with
each.
Table 5: Approach 1 - Government regulation

Benefits:
● Ensures action will be taken on social media’s negative effects
● Limits hate speech and “fake news”
● Can restrict and regulate content or platforms that are potential national security threats

Drawbacks:
● Limits the amount of ads and content users see, resulting in lower earnings for social media companies, advertisers, and content creators
● Potential for data leaks if there is heavy regulation
● Difficult to enforce; actions can be unfairly or unjustly enforced

Values:
● Safety and Security

Table 6: Approach 2 - Individual or family self-regulation

Benefits:
● Allows for more freedom
● Allows parents to set examples
● Restrictions can be individualized based on needs

Drawbacks:
● Poor practices of self-restraint
● Negative side effects of excessive media use
● Not everyone has positive parental role models to set examples of appropriate screen use
● Education on the effects of excessive screen usage is not provided to everyone

Values:
● Freedom
● Self-Control
● Individuality
● Independence

Table 7: Approach 3 - Company regulation

Benefits:
● Reduce the time that users spend on social media and promote a healthy lifestyle
● Cultivate a safer platform that users can trust
● Creates an efficient system maximizing revenue for all invested parties

Drawbacks:
● Motives likely driven by profit
● Puts the power to determine what is acceptable in the hands of corporations
● What companies filter may be what some consumers might want to see

Values:
● Small/Limited Government
● Privacy
● Trust

Works Cited

Adler, Jim. “Why Data Privacy Self-Regulation is Better than Involuntary Options.” TrustArc, 27

July 2012, https://trustarc.com/blog/2012/07/27/why-data-privacy-self-regulation/.

Accessed 27 February 2023.

Awan, Imran. “Cyber-Extremism: Isis and the Power of Social Media.” Springer Link, 15 March

2017,

https://link.springer.com/article/10.1007/s12115-017-0114-0?data2=ardwn001#citeas.

Accessed 25 February 2023.

Barnes, Robert, et al. “Highlights from Supreme Court arguments in Google case that could

change internet.” The Washington Post, 21 February 2023,

https://www.washingtonpost.com/technology/2023/02/21/gonzalez-v-google-section-2

30-supreme-court/. Accessed 27 February 2023.

Beveridge, Claire. “56 Important Social Media Advertising Stats for 2022.” Hootsuite Blog, 24

February 2022, https://blog.hootsuite.com/social-media-advertising-stats/. Accessed 27

February 2023.

Bhuiyan, Johana. “TikTok has become a global giant. The US is threatening to rein it in.” The

Guardian, 31 October 2022,

https://www.theguardian.com/technology/2022/oct/30/tiktok-regulation-data-privacy-

china. Accessed 27 February 2023.

Boehm, Jim, et al. “Digital trust: Why it matters for businesses | McKinsey.” McKinsey &

Company, 12 September 2022,

https://www.mckinsey.com/capabilities/quantumblack/our-insights/why-digital-trust-t

ruly-matters. Accessed 27 February 2023.



Boers, Elroy, et al. “Association of Screen Time and Depression in Adolescence.” NCBI, 2019,

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6632122/. Accessed 26 February

2023.

Braghieri, Luca, et al. “Social Media and Mental Health.” American Economic Association,

https://www.aeaweb.org/articles?id=10.1257/aer.20211218. Accessed 27 February 2023.

Brown, Nina I., and Jonathan Peters. “Say This, Not That: Government Regulation and Control

of Social Media.” Law Journal Library, 2018,

https://heinonline.org/HOL/Page?collection=journals&handle=hein.journals/syrlr68&i

d=563&men_tab=srchresults. Accessed 27 February 2023.

Cabrera, Alixel. “'Our kids cannot handle this': Does social media need government regulation?”

The Salt Lake Tribune, 1 February 2023,

https://www.sltrib.com/news/politics/2023/02/01/can-cracking-down-social-media/.

Accessed 27 February 2023.

Congress.gov a. “Fourteenth Amendment | Browse | Constitution Annotated | Congress.gov |

Library of Congress.” Constitution Annotated,

https://constitution.congress.gov/browse/amendment-14/. Accessed 26 February 2023.

Congress.gov b. “S.314 - 104th Congress (1995-1996): Communications Decency Act of 1995.”

Congress.gov, https://www.congress.gov/bill/104th-congress/senate-bill/314. Accessed

26 February 2023.

Congress.gov c. “Text - S.3663 - 117th Congress (2021-2022): Kids Online Safety Act.”

Congress.gov, https://www.congress.gov/bill/117th-congress/senate-bill/3663/text.

Accessed 27 February 2023.

Congress of the United States of America. “Amendments to the U.S. Constitution.” National

Archives Foundation,

https://www.archivesfoundation.org/amendments-u-s-constitution/. Accessed 26

February 2023.

Cusumano, Michael. “Social Media Companies Should Self-Regulate. Now.” Harvard Business

Review, 15 January 2021,

https://hbr.org/2021/01/social-media-companies-should-self-regulate-now. Accessed

27 February 2023.

Cusumano, Michael A., et al. “Pushing Social Media Platforms to Self-Regulate.” The Regulatory

Review, 3 January 2022,

https://www.theregreview.org/2022/01/03/cusumano-yoffie-gawer-pushing-social-me

dia-self-regulate/. Accessed 27 February 2023.

Davis, Mackenzie. “New Minnesota screen time law limits use in pre-K, kindergarten

classrooms.” KAAL, 20 July 2022,

https://www.kaaltv.com/news/local-news/new-minnesota-screen-time-law-limits-use-i

n-pre-k-kindergarten-classrooms/. Accessed 27 February 2023.

De Los Santos, Maya, and Daniel Klug. “The TikTok Tradeoff: Compelling Algorithmic Content

at the Expense of Personal Privacy.” ACM Digital Library, December 2021,

https://dl.acm.org/doi/abs/10.1145/3490632.3497864. Accessed 27 February 2023.

Digital Communications Division. “Social Media Policies.” HHS.gov, 26 March 2019,

https://www.hhs.gov/web/social-media/policies/index.html#:~:text=Use%20of%20soc

ial%20media%20technologies,govern%20information%20and%20information%20tech

nology. Accessed 26 February 2023.

Dixon, S. “Global daily social media usage 2022.” Statista, 22 August 2022,

http://www.statista.com/statistics/433871/daily-social-media-usage-worldwide/.

Accessed 27 February 2023.

Duffy, Clare, and Christopher Hickey. “More than half of Twitter's top 1,000 advertisers stopped

spending on platform, data show.” CNN, 10 February 2023,

https://www.cnn.com/2023/02/10/tech/twitter-top-advertiser-decline/index.html.

Accessed 27 February 2023.



Earnest, Stephen. “Constitutional Avenues for Challenging Social Media Monitoring by Law

Enforcement.” Minnesota Journal of Law & Inequality, University of Minnesota Law

School, 25 May 2021,

https://lawandinequality.org/2021/05/25/constitutional-avenues-for-challenging-social

-media-monitoring-by-law-enforcement/. Accessed 26 February 2023.

“Fact-Checking Policies on Facebook | Meta Business Help Center.” Facebook,

https://www.facebook.com/business/help/315131736305613?id=673052479947730.

Accessed 27 February 2023.

The Federal Communications Commission. “What We Do.” Federal Communications

Commission, https://www.fcc.gov/about-fcc/what-we-do. Accessed 26 February 2023.

Fleps, Bella. “Social media effects on body image and eating disorders.” Illinois State University

News, 21 April 2021,

http://news.illinoisstate.edu/2021/04/social-media-effects-on-body-image-and-eating-

disorders. Accessed 27 February 2023.

“47 USC 230: Protection for private blocking and screening of offensive material.” U.S. Code,

https://uscode.house.gov/view.xhtml?req=(title:47%20section:230%20edition:prelim).

Accessed 27 February 2023.

Guynn, Jessica. “Data Privacy Day: Not reading the small print can hurt you.” USA Today, 28

January 2020,

https://www.usatoday.com/story/tech/2020/01/28/not-reading-the-small-print-is-priv

acy-policy-fail/4565274002/. Accessed 27 February 2023.

Hall, Charlie. “A brief history of the ESRB rating system.” Polygon, 3 March 2018,

https://www.polygon.com/2018/3/3/17068788/esrb-ratings-changes-history-loot-boxe

s. Accessed 27 February 2023.



Hilliard, Jena. “Social Media Addiction.” Addiction Center, 15 July 2019,

https://www.addictioncenter.com/drugs/social-media-addiction/. Accessed 27 February

2023.

“How should we regulate the flow of online content?” The World Economic Forum, 21 February

2018,

https://www.weforum.org/agenda/2018/02/how-should-we-regulate-the-flow-of-onlin

e-content-here-are-three-crucial-facts-in-the-debate-over-digital-media/. Accessed 27

February 2023.

Jiang, Jingjing. “How Teens and Parents Navigate Screen Time and Device Distractions.” Pew

Research Center, 22 August 2018,

https://www.pewresearch.org/internet/2018/08/22/how-teens-and-parents-navigate-s

creen-time-and-device-distractions/. Accessed 27 February 2023.

Kaur, Shruti, and Janice Mook. “Face off: Should social media platforms be regulated?” South

China Morning Post, 28 May 2022,

https://www.scmp.com/yp/discover/your-voice/article/3179329/face-should-social-me

dia-platforms-be-regulated. Accessed 27 February 2023.

Laub, Zachary. “Hate Speech on Social Media: Global Comparisons.” Council on Foreign

Relations, 7 June 2019,

https://www.cfr.org/backgrounder/hate-speech-social-media-global-comparisons.

Accessed 26 February 2023.

Lauricella, Alexis R., et al. “Young Children’s Screen Time: The Complex Role of Parent and

Child Factors.” Journal of Applied Developmental Psychology, vol. 36, no. 36, Jan. 2015,

pp. 11–17, https://doi.org/10.1016/j.appdev.2014.12.001

Locker, Melissa. “This Place Just Made it Illegal to Give Kids Too Much Screen Time.” TIME, 26

January 2015,

https://time.com/3682621/this-country-just-made-it-illegal-to-give-kids-too-much-scre

en-time/. Accessed 27 February 2023.

Madhav, K. C., et al. “Association between screen time and depression among US adults.” NCBI,

16 August 2017, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5574844/. Accessed

26 February 2023.

Meserve, Stephen A., and Daniel Pemstein. “Terrorism and internet censorship.” SAGE

Journals, November 2020,

https://journals.sagepub.com/doi/epub/10.1177/0022343320959369. Accessed 27

February 2023.

Miller, Sarah. “The Addictiveness of Social Media: How Teens Get Hooked.” Jefferson Health, 2

June 2022,

https://www.jeffersonhealth.org/your-health/living-well/the-addictiveness-of-social-me

dia-how-teens-get-hooked. Accessed 26 February 2023.

Mir, Elina, et al. “Social Media and Adolescents' and Young Adults' Mental Health.” National

Center for Health Research, 2021,

https://www.center4research.org/social-media-affects-mental-health/. Accessed 26

February 2023.

Morrison, Sara. “The US government’s TikTok bans, explained.” Vox, 2 February 2023,

https://www.vox.com/recode/2023/1/17/23552716/tiktok-ban-cfius-bytedance.

Accessed 27 February 2023.

O'Dell, J. “Here's what really happens when you report a rude or spammy Facebook post.”

VentureBeat, 19 June 2012, http://venturebeat.com/social/facebook-reporting/.

Accessed 27 February 2023.

Perzyk, Tim. “Study shows average Twitter Ads ROI is 40% higher than other channels.”

Marketing | Twitter,

https://marketing.twitter.com/en_gb/insights/study-offers-new-marketing-mix-modeli

ng-insights-and-guidance. Accessed 27 February 2023.

Peterson, Tim. “Know your rights when social media companies change their terms of service.”

LegalZoom,

http://www.legalzoom.com/articles/know-your-rights-when-social-media-companies-c

hange-their-terms-of-service. Accessed 27 February 2023.

Prekindergarten and Kindergarten Screen Time Legislation,

https://education.mn.gov/mdeprod/idcplg?IdcService=GET_FILE&dDocName=PROD

046804&RevisionSelectionMethod=latestReleased&Rendition=primary. Accessed 27

February 2023.

The Protection of Children and Youths Welfare and Rights Act - Article Content - Laws &

Regulations Database of The Republic of China (Taiwan),

https://law.moj.gov.tw/ENG/LawClass/LawAll.aspx?pcode=D0050001. Accessed 27

February 2023.

Robinson, Matthew, and Steven Brill. “Freedom in an Era of Terror: A Critical Analysis of the

USA Patriot Act*.” Center on Juvenile & Criminal Justice, 11 September 2001,

http://www.cjcj.org/uploads/cjcj/documents/freedom_in.pdf. Accessed 27 February

2023.

“SB0152.” Utah Legislature, https://le.utah.gov/~2023/bills/static/SB0152.html. Accessed 27

February 2023.

Scott, Gordon. “Self-Regulatory Organization (SRO): Definition and Examples.” Investopedia,

https://www.investopedia.com/terms/s/sro.asp. Accessed 27 February 2023.

Smith, Michael D., and Marshall Van Alstyne. “It’s Time to Update Section 230.” Harvard

Business Review, 12 August 2021,

https://hbr.org/2021/08/its-time-to-update-section-230?ab=at_art_art_1x4_s02.

Accessed 27 February 2023.



“The Social Dilemma: Social Media and Your Mental Health.” McLean Hospital, 18 January

2023,

https://www.mcleanhospital.org/essential/it-or-not-social-medias-affecting-your-menta

l-health. Accessed 27 February 2023.

“Social Media and Adolescents' and Young Adults' Mental Health.” National Center for Health

Research, https://www.center4research.org/social-media-affects-mental-health/.

Accessed 27 February 2023.

“Social Media Policies.” HHS.gov, https://www.hhs.gov/web/social-media/policies/index.html.

Accessed 27 February 2023.

Sung, Morgan. “'Mr. Zuckerberg' explains the internet to elderly senators.” Mashable, 10 April

2018, https://mashable.com/article/mr-zuckerberg-meme-senate-hearing-facebook.

Accessed 27 February 2023.

Teale, Chris. “Lawmakers See 2022 as the Year to Rein in Social Media. Others Worry Politics

Will Get in the Way.” Morning Consult, 15 December 2021,

https://morningconsult.com/2021/12/15/social-media-regulation-poll-2022/. Accessed

27 February 2023.

“Teens and social media use: What's the impact?” Mayo Clinic,

https://www.mayoclinic.org/healthy-lifestyle/tween-and-teen-health/in-depth/teens-an

d-social-media-use/art-20474437. Accessed 27 February 2023.

Therrien, Alex, and Jane Wakefield. “Worry less about children's screen use, parents told.” BBC,

4 January 2019, https://www.bbc.com/news/health-46749232. Accessed 26 February

2023.

Udorie, June Eric. “Social media is harming the mental health of teenagers. The state has to act |

June Eric Udorie.” The Guardian, 16 September 2015,

http://www.theguardian.com/commentisfree/2015/sep/16/social-media-mental-health-

teenagers-government-pshe-lessons. Accessed 27 February 2023.



United Nations. “'Urgent need' for more accountability from social media giants to curb hate

speech: UN experts.” UN News, 6 January 2023,

https://news.un.org/en/story/2023/01/1132232. Accessed 26 February 2023.

United States Courts. “Facts and Case Summary - Elonis v. U.S.” U.S. Courts, https://www.uscourts.gov/educational-resources/educational-activities/facts-and-case-summary-elonis-v-us. Accessed 26 February 2023.

U.S. Copyright Office. “The Digital Millennium Copyright Act.” Copyright.gov, https://www.copyright.gov/dmca/. Accessed 26 February 2023.

“Use of Social Media By US Extremists.” START.umd.edu, https://www.start.umd.edu/pubs/START_PIRUS_UseOfSocialMediaByUSExtremists_ResearchBrief_July2018.pdf. Accessed 27 February 2023.

Vanegas, Marta R. “Regulating Social Media Content – A Primer.” Contra Costa County Bar

Association, 1 March 2022,

https://www.cccba.org/article/regulating-social-media-content-a-primer/. Accessed 27

February 2023.

Vile, John R. “Elonis v. United States.” The First Amendment Encyclopedia, Middle Tennessee State University, https://www.mtsu.edu/first-amendment/article/1455/elonis-v-united-states. Accessed 26 February 2023.

Vogels, Emily A., et al. “Teens, Social Media and Technology 2022.” Pew Research Center, 10 August 2022, https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/. Accessed 27 February 2023.



Walters, Anne. “Kids Online Safety Act: A means of monitoring children's use of social media

apps.” Wiley Online Library, 4 April 2022,

https://onlinelibrary.wiley.com/doi/full/10.1002/cbl.30628. Accessed 27 February

2023.

Wex Definitions Team. “defamation.” Legal Information Institute, Cornell Law School, March 2022, https://www.law.cornell.edu/wex/defamation. Accessed 26 February 2023.

Wex Definitions Team. “privacy.” Legal Information Institute, Cornell Law School, June 2022, https://www.law.cornell.edu/wex/privacy. Accessed 26 February 2023.

Winston and Strawn LLP. “What is Social Media Law?” Winston & Strawn Law Glossary, https://www.winston.com/en/legal-glossary/social-media-law.html. Accessed 26 February 2023.

“Your Data Is Shared and Sold... What's Being Done About It?” Knowledge at Wharton, 28

October 2019,

http://knowledge.wharton.upenn.edu/article/data-shared-sold-whats-done/. Accessed

27 February 2023.

Deliberation Questions
I. Intro
A. General Questions:
1. How do you think social media has affected your mental health?
2. What would be lost by banning or severely limiting social media usage?
B. General Approach Questions:
1. Do you feel comfortable with someone else regulating your social media
usage?
2. Do you think you have enough willpower to “self-regulate” yourself on
social media? Do you think others can?
3. Would you ever trust large corporations to act with the consumer’s best
interests at heart?
II. Approach 1:
A. Can heavy regulation on social media be constitutional?
B. Should the government decide what belongs on social media and how we use it?
1. Is the safety gained from government regulation worth the loss in personal freedom?
2. Do you worry about a government administration potentially weaponizing
this legislation should it be passed?
C. How much can heavy regulation of social media limit its negative effects on
users?
D. Who should enforce this regulation? Parents/guardians? Government officials?
Someone else?
III. Approach 2:
A. If you were (or are) still living at home, how would you feel about your parents
regulating your screen time? What about other parents regulating their children's?
B. How effective could self-control measures be? Could it work for you?
C. Would individuals be able to successfully moderate their own content, e.g.,
distinguishing factual posts from fake ones?
D. How important are your individual freedoms? Would your mental health be
worth giving some of them up?
IV. Approach 3:
A. Do economics and profit hold the best solution to excessive social media use?
B. Do you trust companies to protect your privacy and rights online?
C. Should social media companies be able to moderate the content on their
platforms?
D. Do you think companies would actually do anything to protect their users, or
would their actions only focus on benefiting the company?
E. Should (or can) social media platforms be politicized, like Truth Social, or should
they all be non-partisan, whether announced or unannounced?
F. Do you think companies would find loopholes or ways around their own actions,
like “unskippable” break reminders or revised data security policies?
