Facebook is hiring a director of human rights policy to work on “conflict prevention” and “peace-building”
Facebook is
advertising for a human rights policy director, to be based either at its
Menlo Park HQ or in Washington, DC — with “conflict prevention”
and “peace-building” among the listed responsibilities.
In the job
ad, Facebook writes that as the reach and impact of its various products
continues to grow “so does the responsibility we have to respect the individual
and human rights of the members of our diverse global community”, saying it’s:
… looking for a Director of Human Rights
Policy to coordinate our company-wide effort to address human rights abuses,
including by both state and non-state actors. This role will be responsible
for:

- Working with product teams to ensure that Facebook is a positive force for human rights and apply the lessons we learn from our investigations,
- representing Facebook with key stakeholders in civil society, government, international institutions, and industry,
- driving our investigations into and disruptions of human rights abusers on our platforms, and
- crafting policies to counteract bad actors and help us ensure that we continue to operate our platforms consistent with human rights principles.
Among the
minimum requirements for the role, Facebook lists experience “working in
developing nations and with governments and civil society organizations around
the world”.
It adds that
“global travel to support our international teams is expected”.
The company
has faced fierce criticism in recent years over its failure to take greater
responsibility for the spread of disinformation and hate speech on its
platform — especially in international markets it has targeted for business
growth via its Internet.org initiative, which seeks to get more people
‘connected’ to the Internet (and thus to Facebook).
More
connections means more users for Facebook’s business and growth for its
shareholders. But the costs of that growth have been cast into sharp relief over
the past several years as the human impact of handing millions of people
lacking in digital literacy some very powerful social sharing tools — without a
commensurately large investment in local education programs (or even in
moderating and policing Facebook’s own platform) — has become all too clear.
In Myanmar
Facebook’s tools have been used to spread hate and accelerate ethnic cleansing
and/or the targeting of political critics of authoritarian governments —
earning the company widespread condemnation, including a rebuke from the UN
earlier this year which blamed the platform for accelerating ethnic violence
against Myanmar’s Muslim minority.
In the
Philippines, Facebook also played a pivotal role in the election of President
Rodrigo Duterte — who now stands accused of plunging the country into its worst
human rights crisis since the dictatorship of Ferdinand Marcos in the 1970s and
’80s.
While in
India the popularity of the Facebook-owned WhatsApp messaging platform has been
blamed for accelerating the spread of misinformation — leading to mob violence
and the deaths of several people.
Facebook
famously failed even to spot mass manipulation campaigns going on in its own
backyard — when in 2016 Kremlin-backed disinformation agents injected masses of
anti-Clinton, pro-Trump propaganda into its platform and garnered hundreds of
millions of American voters’ eyeballs at a bargain basement price.
So it’s
hardly surprising the company has been equally naive in markets it understands
far less. Though also hardly excusable — given all the signals it has access
to.
In Myanmar,
for example, local organizations that are sensitive to the cultural context
repeatedly complained to Facebook that it lacked Burmese-speaking staff —
complaints that apparently fell on deaf ears for the longest time.
The cost to
American society of social media-enabled political manipulation and increased
social division is certainly very high. The costs of the weaponization of
digital information in markets such as Myanmar look incalculable.
In the
Philippines Facebook also indirectly has blood on its hands — having provided
services to the Duterte government to help it make more effective use of its
tools. This same government is now waging a bloody ‘war on drugs’ that Human
Rights Watch says has claimed the lives of around 12,000 people, including
children.
Facebook’s
job ad for a human rights policy director includes the pledge that “we’re just
getting started” — referring to its stated mission of helping people “build stronger communities”.
But when you
consider the impact its business decisions have already had in certain corners
of the world it’s hard not to read that line with a shudder.
Citing the
UN Guiding Principles on Business and Human Rights (and “our commitments as a
member of the Global Network Initiative”), Facebook writes that its product
policy team is dedicated to “understanding the human rights impacts of our
platform and to crafting policies that allow us both to act against those who
would use Facebook to enable harm, stifle expression, and undermine human
rights, and to support those who seek to advance rights, promote peace, and
build strong communities”.
Clearly it
has an awful lot of “understanding” to do on this front. And hopefully it will
now move fast to understand the impact of its own platform, circa fifteen years
into its great ‘society reshaping experiment’, and prevent Facebook from being
repeatedly used to trash human rights.
As well as
representing the company in meetings with politicians, policymakers, NGOs and
civil society groups, Facebook says the new human rights director will work on
formulating internal policies governing user, advertiser, and developer
behavior on Facebook. “This includes policies to encourage responsible online
activity as well as policies that deter or mitigate the risk of human rights
violations or the escalation of targeted violence,” it notes.
The director
will also work with internal public policy, community ops and security teams to
try to spot and disrupt “actors that seek to misuse our platforms and target
our users” — while also working to support “those using our platforms to foster
peace-building and enable transitional justice”.
So you have
to wonder how, for example, Holocaust denial continuing to be protected
speech on Facebook will square with that stated mission for the human rights
policy director.
At the same
time, Facebook is currently hiring for a public policy manager in Francophone
Africa — who it writes can “combine a passion for technology’s potential to
create opportunity and to make Africa more open and connected, with deep
knowledge of the political and regulatory dynamics across key Francophone
countries in Africa”.
That job ad
does not explicitly reference human rights — talking only about “interesting
public policy challenges… including privacy, safety and security, freedom of
expression, Internet shutdowns, the impact of the Internet on economic growth,
and new opportunities for democratic engagement”.
As well as
“new opportunities for democratic engagement”, among the role’s other listed
responsibilities is working with Facebook’s Politics & Government team to
“promote the use of Facebook as a platform for citizen and voter engagement to
policymakers and NGOs and other political influencers”.
So here, in
a second policy job, Facebook looks to be continuing its ‘business as usual’
strategy of pushing for more political activity to take place on Facebook.
And if
Facebook wants an accelerated understanding of human rights issues around the
world, it might be better advised to take a more joined-up approach to human
rights across its own policy staff, and at least include it among the
listed responsibilities of all the policy shapers it’s looking to hire.