Discussion about this post

Cognitive Drift

This really clarifies how AI optimizes the compression pathway a system uses to decide what matters. When the wrong variables are compressed early, downstream "fair" or "efficient" fixes can only amplify the error. Systems thinking is what keeps compression aligned with reality rather than with dashboards.

Daria Cupareanu

Reading this reminded me of this YouTube channel, which focuses a lot on unintended consequences and second-order effects: https://www.youtube.com/watch?v=ImJSMqgyvCY&list=PLBuns9Evn1w9XhnH7vVh_7C65wJbaBECK

It seems very similar to what happens with AI and how it's implemented in real systems.

Maybe it’s worth adding one more question when integrating AI: what could be the possible unintended consequences?

Great article, Marcela!

