Meta's Reels algorithm has quietly introduced a new ranking signal that penalizes videos with burned-in, hardcoded captions, the kind creators embed directly in the video file rather than adding through Instagram's native caption tool. Internal A/B testing data, obtained by SocialSeconds from a source within Meta's creator partnerships team, shows that Reels with hardcoded captions receive on average 35% less distribution than equivalent videos using Instagram's auto-caption feature.

According to our source, the change comes down to two factors: first, Meta's algorithm now reads caption text as an engagement signal and prefers its own caption layer, which it can analyze and target more precisely; second, hardcoded captions have historically been associated with content repurposed from other platforms, a behavior the algorithm has been actively trained to suppress.

The Accessibility Concern

The change has drawn immediate criticism from disability advocates. Hardcoded captions are the gold standard for deaf and hard-of-hearing viewers because they're embedded in the video itself and always visible, regardless of platform settings. Instagram's native auto-captions, by contrast, require users to enable them manually, and they have a documented error rate of 12-18% for non-native English speakers.