
Tech Pushes the Captive Audience Meeting Inside Our Minds

A puppet topped with a putty-colored papier mâché head of a man in glasses and a suit jacket and red tie towers over a semi-circle of people holding signs reading “Comcast pay your fair share” and “Philly vs Comcast.” A globe wrapped in red tape floats over the people, who stand in a plaza with shiny office buildings in the background.

Tech tools amplify existing biases and raise new obstacles and questions for organizers

In “captive audience” meetings held in windowless rooms at the back of the nursing home, the charter school, or the nonprofit – or on the factory floor – bosses gather workers together and make their case: A union would get in the way of the family feel we have here. You’d be paying dues with no guarantee of better wages or working conditions. And, as Amazon made clear in its successful pitch to the Bessemer warehouse laborers: we already pay well and provide better benefits than the other low-wage jobs around here.

When workers are forming their union, they need to flex their societally suppressed ability to unite with their coworkers, to fight entrenched alienation, and to build a shared sense of identity and a concrete vision of what improvements they can win if they join forces. In the Bessemer campaign, Amazon centered captive audience meetings, as most anti-union campaigns do – but the company’s near-constant surveillance of its workers also killed the ordinary, everyday opportunities workers have to connect with each other.

The speed and pressure of Amazon’s work demands are one part of this: Amazon’s workplace is laced with data-driven surveillance tools that track how fast and how well you perform and how much time you spend “off task.” If you have to walk a football field’s length and back to the breakroom on your lunch, you have very little time to organize with your coworkers. The company also surveils how workers talk with each other online as they try to connect off the jobsite – paying particular attention to rumblings about organizing – including trolling for and then actively monitoring private Facebook groups of its Flex drivers for talk of protests or strikes.

Combine that surveillance and isolation with the constant barrage of anti-union messaging from Amazon – from text messages to bathroom stall flyers to billboards to social media, including targeted ads aimed at workers as they played video games in their scant free time – and the captive audience meeting has moved, rent-free, from the breakroom into workers’ minds.


It’s a brutal thing to consider – but something we have to study and understand, because its implications reach considerably beyond workplace justice. With deep alienation a hallmark of work in this stage of capitalism, we face a question we’ve never had to answer in quite this way: how do we unite and build power when the ruling bloc has enhanced its hegemonic dominance through the very technologies we rely on to communicate?

More than twenty years ago, when the internet was beginning to scale from military and academic uses to consumers and communities, movements around the world were experimenting with forms of independent media that amplified resistance, while also building leadership amongst the creators.  We felt extraordinary hope about what the internet could bring for movements for liberation.

But today, rather than being a site of democracy and liberation, the internet is dominated by the same consolidation of corporate ownership and power that has destroyed radio, television, cable, and local journalism over the past thirty years. It creates profit as its major product, using our activities online as the means of production.

While we might be able to track open rates on emails, test engagement with Instagram posts, and use hashtags to fundraise and mobilize at scale, the boss has the upper hand by far. The tools are built for him, not for us. Organizers hoping to build the massive numbers needed to take on racial monopoly capitalism in the United States must understand how those currently in power use technology to alienate us from each other, intensify the impact of racism, and extract our value.

Courts, for example, use risk assessment algorithms as part of their pretrial systems – deciding who gets released, who gets a (likely unaffordable) money bail, and who stays locked up pretrial. For decades, cities and states struggling with crowded jails have tried to figure out how to make their systems smaller without increasing “risk” to the community. Even though the vast majority of people held in jail pretrial would safely return for their court dates with nothing but a reminder, hundreds of counties use risk assessments to decide who even gets a chance at release, branding people as risky because of factors deeply correlated with poverty and race.

Risk assessments amplify bias

Risk assessments have been sold by major groups like Arnold Ventures as the key to ending money bail and pretrial incarceration. But over the last ten years, the opposite has proven true. Many jurisdictions have used risk assessments as part of their pretrial decision making, without reducing their jail populations or the disproportionate numbers of people of color among them. Recent studies demonstrate that risk assessment, rather than informing fearful judges’ guesses at who is risky and who is not, amplifies existing fears and buries a core truth – that almost no one is “dangerous” enough to justify jail.
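To make the mechanism concrete, here is a deliberately simplified, hypothetical sketch of how a point-based pretrial risk score can work. The feature names, weights, and cutoffs below are illustrative assumptions, not any real instrument – but they show how inputs deeply correlated with poverty and policing patterns mechanically push scores up, so that two people with identical conduct but different resources get different recommendations.

```python
# Hypothetical, simplified point-based pretrial risk score.
# Feature names and weights are illustrative only -- not any real instrument.

def risk_score(person: dict) -> int:
    """Sum points for each 'risk factor'; a higher score means 'riskier'."""
    score = 0
    score += 2 * person.get("prior_failures_to_appear", 0)  # often driven by lack of transit or childcare
    score += 3 if person.get("unemployed") else 0            # proxy for poverty
    score += 2 if person.get("unstable_housing") else 0      # proxy for poverty
    score += 1 * person.get("prior_arrests", 0)              # reflects policing patterns, not conduct
    return score

def recommendation(score: int) -> str:
    # The cutoffs are also policy choices, hidden inside the tool.
    if score <= 2:
        return "release"
    elif score <= 5:
        return "money bail"
    return "detain"

# Same conduct, different resources, different outcomes.
poor = {"prior_failures_to_appear": 2, "unemployed": True, "unstable_housing": True}
wealthy = {"prior_failures_to_appear": 0, "unemployed": False, "unstable_housing": False}

print(recommendation(risk_score(poor)))     # score 9 -> "detain"
print(recommendation(risk_score(wealthy)))  # score 0 -> "release"
```

Nothing in a formula like this measures “dangerousness”; it measures proximity to poverty and to past police contact, then launders that measurement as science.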

We see this contradiction in other aspects of our society as well. Technology promises fairness and evidence-based accuracy – but far too often increases the power of the oppressive system and buries that oppression under a veneer of “science.” Schools use surveillance and algorithms to try to predict the next mass shooting or other violence; then they get predictable false positives on Black, brown, immigrant, and low-income students, and use these tools to target, surveil, push out and sort kids of color.  It isn’t that the algorithms or surveillance introduce anything new – but they absolutely aggravate the impacts of the oppressive system on our communities – while providing us with potential opportunities to throw the oppression of the system into relief.

Child welfare agencies use algorithms as a part of surveilling and separating Black and brown families, while claiming they support fairness and reduce bias. Hiring has moved remote in the pandemic, meaning interviews are laced with algorithms and enhanced bias. The right used vulnerabilities in how social media is regulated to target Black and Latino male voters with misinformation in the 2020 election, contributing to a marked drop in their support for the Democratic ticket.

Questions on tech and organizing

Communities nationwide are fighting for basic material needs – housing, school, healthcare, racial justice. But we need to pay closer attention to two things: how companies and the state use technology to aggravate the impacts of the systems that deprive our communities of our human rights – while also being a tricky factor in how we organize to win what we need.  At this moment in history, here are some of the questions we need to answer as we think about how technology in the hands of the ruling bloc changes the terrain for organizing:

How does technology shape what people think, believe, and share in the heart of organizing campaigns, whether they are in a neighborhood, a worksite, within a community, or across lines of division? How does the fact that we’ve surrendered to the ugliest of surveillance – signing away our rights to privacy and to ownership of our own data in exchange for basic access to the work, banking, communication, and entertainment that come through these technologies – impact our ability to envision a brighter world? How does the fact that we consume most of our news through curated paths online affect what we learn, who we trust, and what we hope for? How are the boss, the right, and those who profit from our people’s alienation finding and using those paths? Understanding where our members, leaders, and opponents get their information, and which paths they trust most, will help us map our terrain and anticipate twists and turns before they come.

How do we get better at understanding how, or whether, a technology is hurting our communities? When we do the deepest kinds of listening about the material conditions our neighborhoods are facing, occasionally a tech issue rises to the top (see: internet access during the pandemic). But more often, the technology is enhancing, or aggravating, more basic challenges around work, housing, healthcare, education, or safety. Listening deeply to communities about the struggles they face, and using popular education methods to learn about and discuss how technologies like predictive policing algorithms, facial recognition, and school-based algorithms intersect with their daily lives, can surface new opportunities to both build the power of our groups and develop strategies to defeat our targets.

How do we understand the technology companies themselves? These firms, especially Big Tech, or FAANG, shape the structure of our economy with their heft and gravity as much as – or even more than – their consumer products and tools shape people and work. In Silicon Valley and New York, big tech is as dominant as big banks or real estate, warping the housing and stock markets. This gives us a chance to understand the role of big technology companies in our cities and towns in ways that go beyond the products they make – something we explored at Movement Alliance Project (formerly Media Mobilizing Project) when we took on Comcast in 2015. We targeted the company both for the poor quality, limited distribution, and high price of its internet service, and for the outsized role it played in Philadelphia’s politics and economic inequality. We have to contend with how tech companies are changing the makeup of the neoliberal bloc and its impacts, and try to anticipate how changes in that bloc, and in the ways it relates to the base of society, will impact conditions for organizing and building left power.

How do we relate a critical assessment of technology to the largest material harms our communities face? We have a long history of communications and technology organizing in this country, but we need to update our thinking. Groups came together in 2019 to form the Athena Coalition, in which dozens of organizations take on Amazon and the ways big tech and corporate power are warping society and the economy. The coalition is paying sharp attention to this, with leadership from partners MediaJustice, Fight for the Future, Institute for Local Self-Reliance, United for Respect, Mijente, and others. While our communities are fighting for core material needs like housing, food, safety, jobs, racial justice, and schools, the policies that govern technology intersect deeply with these issues.

The good news is that when communities grapple with technology’s role in enhancing the impacts of racial monopoly capitalism, they can win.  Communities have defeated racist algorithms in sentencing and pretrial systems in ways that lessened the power of the state to incarcerate people. They’ve kept Amazon from building new headquarters.  But algorithms and incarceration continue to proliferate nationwide, and Amazon’s power continues to grow.  To contend with the role technology is playing in our communities and in this conjuncture, we need more leadership from vibrant membership and base-building groups in the movement for tech justice, both to restrain big tech’s power, and to broaden the scope and scale of what we can win.