Bite the Feed That Feeds You
Part three of the Poets on Social Media series
Over ten years ago, the tech bros of Target were so effective at processing customer data that there was a creepy incident involving a teenage girl. She found out she was pregnant, but she wasn’t ready to tell her parents, so she tried to keep it a secret. Then her parents suddenly started receiving Target ads in the mail for prenatal products and things that new babies need. The teenager had been buying vitamins, unscented lotions, and other nonbaby products that Target’s data analysis correctly recognized as behavior associated with pregnancy, so the automated system sent targeted ads to her home. That’s how her parents found out.
That was with the computing power of 14 years ago. If the system knew the teenager’s secret from the calcium and magnesium vitamins she was buying, how much more can the AI technology of today know about us? The systems that run social media know what we want, sometimes better than we do.
Intelligent people don't want to believe this, because it is staggering to think that we could be figured out so easily. We are complex and divergent thinkers!
But here’s the kicker: The algorithms don’t just run parallel to the story you tell yourself about who you are, offering stimuli on the feed that match your public identity. They also correlate data related to your body, that is, your unconscious actions as you interact with the platform. What does your body do while on Instagram?
Earlier we defined “algorithm” as a list of choices that a mind comes up with when faced with stimuli. The basic human algorithm for a threat is a duality: fight or flight. When we see something threatening, those two options appear in our algorithmic minds.
Fight or flight.
And even though we live more complex lives than the first Homo sapiens, and even though there are many more options on our menu of choices, the organic algorithms are rooted in the mind and its body. They are rooted in homeostasis. They are, in other words, our basic code as organisms.
We cannot not be human.
Intelligent people may deny they can be figured out by Meta’s algorithms, and that’s why they’re so easy to manipulate!
The more you deny how vulnerable you are, the more vulnerable you are.
Advertisers have known for decades: People who think they’re not influenced by commercials are the target of commercials.
That Ginsberg quote in part one of this series could be speaking of today’s poets and scholars who rely on social media to interact: “I saw the best minds of my generation destroyed by madness, starving hysterical naked…”
It makes you crazy, makes you dependent, makes you an addict. Obviously, some people use social media very well. Some writers are naturals at interacting with the interface and creating positive information flows. I’m on Facebook, where I have many friends I’ve known in real life for over a decade, and some of them use social media very well. I know a brilliant Texas poet who has built an amazing career in part by using Facebook as her platform. Whether one is conscious of it or not, social media platforms can be fundamental to author branding and to creating a communal narrative.
So, what is to be done?
We need to acknowledge that even “the best” minds are vulnerable. AI systems are not called neural networks because it’s clever nomenclature. From large language models to Meta analytics, they are named neural networks because they are created in our image. They are created in the image of the human brain. That’s why they’re so effective. The best minds of our generation may not be vulnerable to cultural pressures, but they are vulnerable to their bodies. Our bodies want to endure and flourish. That is the basic binary code, the zeros and ones of human evolution, and that code can be manipulated. We need to acknowledge that we are human.
Some Christians will tell you that the devil can fool you more effectively if you don't believe he exists. And there is a law of the universe rooted in that idea. We need to acknowledge that we can be led astray by synthetic algorithms, that there is a tremendous amount of data about us relating to our likes and dislikes, fears, angers, a range of emotions that cause us to react, sometimes outside of the story we tell ourselves about who we are.
Here’s a story I believe to be true about me:
I’m not a violent person. I don't like to fight, I don't like to see fights. I remember as a teenager going to the San Jose flea market on a Sunday, the biggest flea market in the Bay Area at the time. I must've been 13 years old, and I saw two men fighting, older men in their 30s or 40s. They were having a fist fight, and when it was broken up, one of the men was bleeding and his face looked so defeated, and I cried. I was a teenager, but it hurt me so much I cried. I had to leave the crowd so no one would see me.
I've never punched anybody, ever, nor have I kicked anybody. So the story I tell myself is that I'm not a violent person.
However, my organism tells a different story.
When I see a fight, I stop and look, like most of us do, at least for a while, before I remember how much I hate violence.
When I'm scrolling through my social media feed and there's a quick video of somebody getting mugged on the street, caught on camera, I pause. I move on, but I pause. I feel intense emotion, and those pauses are now part of the data associated with me. The algorithms will use my pauses to select content for me, and I will see more violent imagery as I scroll. Some people will click on violent videos and watch them, and they will receive a lot of violent content. I have an uncle who watches street fight videos on TikTok. I can’t imagine what his feed is like, how much more extreme the images need to be to keep him on the platform.
We can take some control of the algorithms. I want the data on me to reflect my story, the one I tell myself about who I am. I am not a violent person. My basic coding will have me pause on a fight, but I know that and can direct my body to react differently.
Now, when I am presented with violent videos or reels on my feed, I scroll past them quickly. I do not dwell, not even for a moment, because I’m training the algorithms to provide me with what I want, not what they want, which is to keep me addicted by manipulating my human code.
Should You Leave Meta?
Today there is a great diaspora of Facebook poets, journalists, scholars, professors, and students moving away and searching for another social media home, some of them going to Bluesky, some of them not sure where to go.
But many of them stay on Meta platforms.
Including me. I’m on Instagram (#theaudiowriter) and Facebook, which I use as tools, as best I can. I use them to stay in touch with people I care about, like Lee Herrick, to mention just one. I care about what my friends are up to.
But what we need is AI literacy.
As you walk the streets of that megacity, with its flashing lights and its opportunities to stop at whatever appeals to you, images that fill you with disgust, that anger you, or that make you feel good about yourself, all those intense emotional experiences waiting for you to stop, to delay, to dwell a little bit so they can understand you even more, you need to ignore them. Ignore anything on your feed that isn’t from real friends. REAL friends, not corporate accounts that say “Follow.”
Go to your friends.
Scroll through the feed and don't stop at anything but what your friends are sharing, and learn to discern when they are sharing things from bots and data eaters.
Get your news from outside the platform! No matter what your political beliefs, you will find fake news on your Meta feed. If you see a news item that you cannot let go of, research the veracity of the story.
By mindfully walking through the virtual city, you can shape the landscape.
You can train the algorithms, instead of letting them train you.
It requires cognitive effort, and you probably won't want to stick around in Facebook city all day, or keep going back over and over as often as you used to when it was a plaza filled with friends. But when you do go back, you'll know what you're looking for, and you'll find it.
If Simulation Theory is true and reality is a holographic world in some cosmic computer (and today many people spend a good percentage of their lives literally in the virtual world), let's be player characters.
Let's not be the NPCs, the non-player characters who populate the landscape for the benefit of the designers and the real players. Let's take up our digital guns (or pens) and fight for what we believe in.