Bonus Programs, Algorithms, and You

[This post was substantially rewritten on July 27, 2024]

A few folks have commented and messaged with questions about how FB’s engagement bonus program works. Here’s what little is known to me. None of this is any kind of special inside information, just observations that I’ve been able to at least somewhat validate.

Comment Quality

Comments with fewer than six words or so seem to be of less value.

Copy-paste and template comments are a complete waste of time, and certain phrases have become obviously abused in an attempt by sleazy shortcutters and game-riggers to screw the system. We’ve all seen those posts full of “COUNT ME IN” and “LET’S GO” comments. Not only a huge waste of time, but a huge message to FB that you’re gonna try to cheat the bonus program.

Advertisements

There are easy moderation tools page owners can use to block these messages from ever showing up in the first place. Use them.

I’ve seen pages that aren’t working to prevent/remove/discourage that kind of behavior get dropped from the program. Pages and accounts leaving those comments tend to get algo-suppressed and eventually banned as well, but the people doing it are a) stupid and b) have nine thousand other sockpuppets and bots running anyway so they don’t care.

Content

Quality, original content is the order of the day.

Regular repost content like the Wednesday “post your gig” autopost I’ve had running the last few weeks doesn’t get bonuses (but that’s fine, it’s a useful thing to do for working creative performers and builds the community) and tends to get suppressed by the algo as clickbait.

Not sure how other repost content like the articles I have rotating in the autoposter on a cycle anywhere from around 30 to 90 days is handled. As far as I can tell, substantive content that is reposted on a longer cycle, say 45 days or more, probably does better than stuff that’s reposted weekly or biweekly.

Clickbaiting tactics in general aren’t going to get far. “❤ for a, 🙂 for b” posts, for example, are seen as low-quality content fishing for traffic – I don’t recall the details of the source anymore, but that specific example appears in FB documentation as something that will be penalized by the system. It’s cheap, dumb traffic that’s mostly bots anyway and FB doesn’t want that so they don’t pay for it. (Note: they do want it just badly enough to allow a lot of the ‘bot and other fake activity to happen. It inflates their numbers and allows them to charge more for advertising.)

Their favorite content is about what you’d expect – original memes and status messages short enough to use a graphic background. They choke the hell out of anything that links offsite, but they do pay on it if it’s otherwise quality content e.g. a link to my latest Medium or JHUS article.

[Image: the FB react bar with the "Like" circled, the "love," "care," "laugh," "wow," and "sad" reacts grouped in a green box, and the angry react in a red box. The text reads: "Remember! [blue text] This gets boosted. [green text] This gets boosted a lot. [red text] This doesn't get boosted at all. [white text] Smart Engagement is Effective Engagement! Effective Engagement Helps Creators!" There is a johnhenry.us logo at lower right.]
Click the image to get this meme in shareable size!

Reels and stories aren’t part of the current bonus program for me, although there are other ways to get paid for them that you may have access to. That’s not to say don’t bother using those tools. It’s still extremely beneficial to propagation and audience growth. It just doesn’t pay through the bonus program.

The very best thing you can do is share with your own additional comment. Sharing content without any commentary, or with a brief and non-descriptive comment like “me too,” is less effective than sharing with a substantial comment. A substantial comment is over six words and adds some kind of meaning or substance. Simply sharing a post without a comment may only generate a minimal amount of interest. Adding a simple emoji or a word like “TRUTH!” doesn’t significantly increase its impact. However, sharing the post with a meaningful comment like “this is really good information, you should check it out,” or “this resonates with me and is worth reading,” or even a detailed personal experience that relates to the content can greatly enhance its value and the algorithm’s response.
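The rough rule of thumb above (more than six words, not a copy-paste template phrase) can be sketched in code. To be clear, this is not Facebook's actual algorithm, just an illustration of the observed heuristics; the template-phrase list is my own example:

```python
# Hypothetical sketch of the comment-quality heuristics described above.
# NOT Facebook's actual algorithm -- just an illustration of the observed
# rules: more than six words, and not a known copy-paste template phrase.

TEMPLATE_PHRASES = {"count me in", "let's go", "me too", "truth!"}  # example list

def looks_substantial(comment: str) -> bool:
    """Return True if a comment passes the rough quality bar described."""
    text = comment.strip().lower()
    if text in TEMPLATE_PHRASES:
        return False  # copy-paste / template comments are a waste of time
    # Short comments appear to count for less; ~6 words is the observed cutoff.
    return len(text.split()) > 6

assert not looks_substantial("COUNT ME IN")
assert not looks_substantial("me too")
assert looks_substantial("this is really good information, you should check it out")
```

The real system presumably weighs far more signals than word count, but the point stands: a comment that adds meaning beats a one-liner every time.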

“Follow for follow” and “like for like” are bad and you shouldn’t. As convoluted as it may sound, FB really does believe it’s pushing for legitimate, “organic” engagement. L4L/F4F falls under what FB refers to as “coordinated inauthentic behavior,” and enough of it won’t just get you demonetized, it’ll get you deleted eventually. There’s nothing wrong with creating communities of creators, but just liking every rando who likes you is probably going to do you more harm than good in the end. Algorithm-driven social media sites like Facebook interpret that behavior as genuine interest and associate you with those interests and behaviors, and inevitably, if you just like and follow everyone who likes and follows you, you’re going to be associated with clickbaiters, spammers, and worse in the algorithm.

Fundraising and mutual aid posts appear to be heavily penalized by the algorithm. It’s getting worse as more and more desperate people are forced to try to survive on crowdfunding, in the exact same way local police will start cracking down on vagrancy-related crimes as you start seeing more people on street corners with their hands out.

The only way to overcome that is to share it like you just discovered a new book of the Bible hand-lettered and signed by Jesus.

It deserves to be said out loud that this is not entirely FB’s fault; over the last couple of years every talentless half-wit with no marketable skills and their mom has decided they’re an “influencer” or “activist” and goes out trolling for cash (sidebar: this has substantially reduced income for those of us who are out here doing meaningful and substantive work and trying to survive, and driven some good people out of the space entirely), and FB actively works to identify and avoid rewarding/encouraging those bad actors. Note that I don’t claim they do this well, but it’s what they’re trying to do.

Obviously, stuff that would be problematic anyway like hate speech, disinformation, etc. is not rewarded by the bonus program and if you do much of it they’ll demonetize you and delete your accounts.

The Bigger Picture

Over the years a trillion get-rich-quick schemers and grifters have turned social media into a largely automated and mostly useless pile of garbage begging for cheap, easy attention. These are the sops you see slapping their t-shirt “designs” (a pithy cheap appeal to ego with variable fonts) on a picture of Keanu Reeves or Morgan Freeman or some other super-popular celebrity with a high level of public trust.

The schemers and get-rich-quick types have built up this industrial strength imitation of human engagement, and FB can’t sell advertising based on the number of ‘bots and sockpuppets that will see it.

They want “real human beings acting like real human beings.” Unfortunately, this puts us real human beings in a position of basically being forced to become unpaid (or paid, even, given this bonus program) Facebook employees whose job it is to keep the platform supplied with a steady stream of quality original content followed by a good solid engagement cycle of real human beings earnestly recommending, reacting to, and sharing that content.

They know they can’t sell ads to ‘bots and AI, so they’re stuck in this weird space: On one hand, they want those numbers. On the other hand, the more garbage traffic there is the fewer human users will engage with the platform at all. (The conversation is much deeper and runs far outside the scope of this article, but this is a core part of it.)

It’s also well worth keeping in mind that these same dynamics apply to other contexts as well. There’s not a special interest or identity group or hobby or celebrity or political label or opinion on a popular topic that doesn’t have seven million auto-pilot Facebook pages devoted to them, every one of them kicking out just the most vapid and ridiculous crap imaginable as they chase that easy money.

I caught one just now while taking a quick break and scrolling through FB on my phone: some person with the world’s most obnoxious British accent pretending to ask, in all seriousness, whether it’s true that all Americans have a personal shopping assistant who will bring your bags to your car. Of course it’s not true, and they don’t believe it; they probably made it up, or it’s some family in-joke about a mistaken conclusion they drew as a little kid. But that’s not the point. The point is that it’s garbage content intended to generate contentious, rude, ego-driven traffic that creates long arguments in the comment section, each one of which adds to the apparent popularity of the page.

It also exacerbates and amplifies our worst selves. This is where things get serious, because the exact same tendencies these types of pages play to and exploit are leveraged to spread sometimes catastrophically destructive disinformation on a broad scale. We saw this during Covid; we see it every day in politics, and much of mainstream television news now plays to those same tendencies in carefully calculated ways. So while the efforts of social media companies to control content quality are very much rooted in profit-seeking and capitalism, those efforts are also important to stemming the tide of disinformation and misinformation. This is why, if you share a lot of “fake news,” eventually it’s going to cost you.
