What if the state punished you not for what you did, but for what an algorithm predicts you will do? 💡 We analyze the world of predictive policing and the role of companies like Palantir. From the scraping of health and social media data to the "risk scores" that determine our lives, we explore how Europe, despite laws like the AI Act, is building the infrastructure for social scoring under a different name.
🔍 What you will see in the analysis:
System Fuel: How the state, data brokers, and social media feed the algorithm with our personal data.
Social Scoring: Why it doesn't need to be called by that name to function as a penalty without a trial.
The EU Contradiction: How GDPR and the AI Act coexist with massive databases that enable mass surveillance.
Case Studies: The case of the NHS in the UK and the use of such systems in the war in Ukraine as a "testing ground."
The Habit of Evaluation: Why the greatest danger to democracy is not violence, but the acceptance of silent digital classification.
💡 The Conclusion: Digital totalitarianism does not need an ideology; it only needs metrics. When freedom is transformed into a "conditional system," the citizen ceases to be a bearer of rights and becomes a variable in software they cannot control.
Like and Subscribe to @Geopolitics-news-gr for analyses that deconstruct the new architecture of global power. WEB Link: Geopolitics-News.gr