Trust in a Monetized Internet: How Do Two People Build Bonds When the Pipes Are Watching?
Inspired by Rabbi Simon Jacobson's "Trust Issues: What's Blocking Your Bonds?"
November 12, 2025
Imagine your digital life as two physical pipes: one bringing information into your home, one carrying your activity out. News, videos, messages, searches, smart-home pings — in, out, in, out. Now imagine that, at every bend in those pipes, someone is measuring the flow. That measurement is money. It funds "free" services, powers advertising, and fuels the predictive engines that try to guess your next click, purchase, or thought.
Trust is the foundation — but our pipes monetize attention
Rabbi Simon Jacobson describes trust as the bedrock of human connection: without it, love has no stable place to land. That framing raises a hard question for our era: How do two people build a genuinely trusting relationship when the digital pipes around them are optimized to watch, profile, and monetize them? If an economy profits from making us predictable, it also profits when we behave according to past patterns — even when we are trying to change.
From public research to private platforms: how the pipes were built
The internet grew from publicly funded research. In the 1960s, ARPA (now DARPA) seeded the ARPANET to connect research nodes; the first packet traveled between UCLA and SRI on October 29, 1969. Decades later, a handful of large platforms sit where those open networks meet people's daily lives, turning data exhaust into revenue and behavioral predictions.
Predictability as product
Modern AI systems, including large language models, are prediction machines. The commercial web runs on a similar logic: observe enough past behavior and you can estimate the next click. In marketing, this means segmenting and scoring individuals, nudging them toward profitable actions, and continuously A/B testing the world into a more predictable place. Over time, these nudges can make personal transformation harder: when you try to stop drinking, you notice more alcohol ads; when you seek stillness, the feed accelerates.
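To see why repetition is profitable, consider a deliberately tiny sketch in Python. It is not any platform's actual model; the categories and click history are invented for illustration. The only input is past behavior, and the more that behavior repeats, the sharper the model's guess about the next click becomes.

```python
from collections import Counter

def click_probabilities(history, smoothing=1.0):
    """Estimate P(next click = category) from a user's past clicks.

    A toy frequency model with Laplace smoothing: the more a user
    repeats one category, the more confident the prediction becomes.
    """
    counts = Counter(history)
    total = len(history) + smoothing * len(counts)
    return {c: (n + smoothing) / total for c, n in counts.items()}

# A user who mostly repeats one topic is easy to predict...
habitual = ["fitness", "fitness", "fitness", "fitness", "news"]
print(click_probabilities(habitual))  # {'fitness': ~0.71, 'news': ~0.29}

# ...while varied behavior keeps every prediction uncertain.
varied = ["fitness", "news", "music", "travel", "recipes"]
print(click_probabilities(varied))    # each category ~0.20
```

Even this toy version makes the incentive visible: the habitual user is worth more to an advertiser than the varied one, because confidence sells.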
When sensors see through walls
The "pipes" aren't only browsers and phones. Wireless signals themselves can be repurposed to infer presence and motion in physical spaces. Academic teams have shown that Wi-Fi-band systems can identify people behind walls and estimate body pose through occlusions. These are research systems, not consumer gadgets, but they show how far inference can go, and why metadata (not just messages) matters for privacy.
Data brokers and the quiet economy of profiles
Behind the scenes, data brokers aggregate location, app, and ad-tech data into dossiers about individuals and households. Regulators have begun to push back: in 2024, the U.S. Federal Trade Commission reported "vast surveillance" by major online services and brought actions against brokers selling sensitive location data. This enforcement matters not just for policy, but for trust between people: when unseen parties can buy inferences about your habits, it changes what intimacy and consent mean.
Law can help — but it isn't enough by itself
Legal frameworks try to rebalance power. The EU's GDPR establishes rights to access, correct, and delete personal data, and to object to certain processing. California's CCPA gives residents rights to know, delete, and opt out of the sale or sharing of their data. These laws set guardrails, but they don't resolve the social challenge: building relationships when third parties are incentivized to study the space between us.
Public sentiment: people feel watched and out of control
Surveys consistently show rising concern and confusion. In 2023, the Pew Research Center reported that 71% of U.S. adults were concerned about government use of their data, and many felt they had little control over how companies collected and used it. Whether or not you live in the U.S., these numbers reflect a broader mood: people sense that the pipes are measuring them, and they aren't sure how to opt out.
The human problem: trust under the gravity of prediction
Trust requires room for change. Yet predictive systems monetize continuity: the more we behave like our past selves, the higher the model confidence and the greater the ad revenue. This creates a subtle form of social gravity. We can resist it, but it takes intentional design in our relationships and tools.
Practical steps for two people trying to build trust
- Make your own norms explicit. Agree on shared rules for phones at the table, private spaces, and when to be offline together.
- Reduce third-party observers. Use privacy-respecting browsers, disable ad-ID tracking on phones, and prefer end-to-end encrypted chat for sensitive topics.
- Practice "consent for inference." Don't share each other's stories, locations, or photos without explicit permission.
- Create unpredictability on purpose. Vary routines in small, healthy ways (routes, times, apps). Break the feedback loops that keep serving the same triggers.
- Audit the pipes together. Once a month, review app permissions, ad preferences, and data-broker opt-outs. Treat it like changing the batteries in a smoke alarm; a small checklist sketch follows this list.
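To make the monthly ritual concrete, here is a minimal sketch of a shared checklist two people might print and walk through together. The items and the print_audit helper are illustrative assumptions, not an authoritative privacy procedure; adapt them to your own devices and services.

```python
from datetime import date

# Hypothetical monthly "pipe audit" items; examples only, not an
# exhaustive or authoritative privacy procedure.
AUDIT_ITEMS = [
    "Review app permissions (location, microphone, contacts)",
    "Reset or disable the advertising ID on each phone",
    "Check ad-preference dashboards for stale interest profiles",
    "Re-send data-broker opt-out requests that may have lapsed",
    "Confirm sensitive chats are still end-to-end encrypted",
]

def print_audit(items=AUDIT_ITEMS):
    """Print a dated checklist for two people to review together."""
    print(f"Pipe audit for {date.today():%B %Y}")
    for i, item in enumerate(items, 1):
        print(f"  [ ] {i}. {item}")

if __name__ == "__main__":
    print_audit()
```

The point isn't the script; it's the shared habit. A checklist that both people see turns invisible surveillance into a visible, discussable routine.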
A simple ethic for a complicated system
The question for an era of monetized pipes is this: How do two people create a trusting foundation for love when the flows of data around them are incentivized to profile, predict, and manipulate? The answer isn't perfect privacy; it's shared practices, honest disclosure, and tools that reduce unnecessary surveillance.
We can't rip the pipes from the walls. But we can decide how we use them, what leaks we tolerate, and which spaces we keep for human unpredictability. That choice is the start of trust.