Release the Data

Mind Control for the Masses

INTRODUCTION

To control a population, you first need to know how it thinks, both as individuals and as a collective, before manufacturing a crisis to control its behaviour.

Over the decades, social experiments have been carried out to gauge how most people respond to certain situations. From these, Governments and those who control them can create manipulation tactics and execute psychological operations that allow their ideal result to play out.

The average person cannot fathom the idea that their own Government would have systems and structures in place to control their behaviour, and ironically, that is why they work so well. The more that populations trust them, the more they can get away with.

The KNOWLEDGE BASE below outlines these experiments, cognitive biases, manipulation tactics, and psychological operations so that you can recognise them and engineer your own plan of attack against them.

KNOWLEDGE BASE

Social Experiments

Social experiments are structured studies conducted to observe how people behave in specific situations—often revealing insights about conformity, obedience, prejudice, cooperation, empathy, or morality. This kind of information about people is terribly important for Governments and those who control them – as it can provide the key to controlling a population.

Below are key examples of influential social experiments, organized by the type of human behavior they explored:


1. Obedience to Authority

Milgram Experiment (1961)

  • Purpose: To test how far individuals would go in obeying an authority figure, even if it meant harming another person.

  • Setup: Participants were told they were assisting in a memory test. They were instructed to administer increasingly severe electric shocks to a “learner” (actually an actor) when incorrect answers were given.

  • Finding: A majority of participants continued administering shocks, even up to dangerous levels, simply because an authority figure told them to.

2. Conformity and Group Pressure

Asch Conformity Experiment (1951)

  • Purpose: To observe whether individuals would conform to a group’s incorrect consensus.

  • Setup: Participants were placed in a group with actors who gave obviously incorrect answers to simple visual questions. The real subject answered last.

  • Finding: Many participants conformed with the group, even when the group was clearly wrong, highlighting the power of peer pressure.


3. Role Adoption and Power Dynamics

Stanford Prison Experiment (1971)

  • Purpose: To examine how individuals adopt roles of power (guards) or submission (prisoners) in a simulated prison environment.

  • Setup: College students were randomly assigned roles as guards or prisoners in a mock prison. The experiment was supposed to last two weeks.

  • Finding: The guards became abusive and the prisoners distressed. The experiment was terminated after just six days due to ethical concerns.

4. Bystander Effect

Latané and Darley Experiments (1968)

  • Purpose: To investigate why people often fail to help others in emergency situations when others are present.

  • Setup: Participants overheard someone (an actor) having a seizure. The number of other people the participant believed were present was varied.

  • Finding: The more people who were believed to be present, the less likely any one person was to help—illustrating diffusion of responsibility.


5. In-group vs. Out-group Behavior

Robbers Cave Experiment (1954)

  • Purpose: To explore how intergroup conflict can arise and be resolved.

  • Setup: Two groups of 11-year-old boys at a summer camp were separated and competed in games. Conflict escalated, but cooperation was restored when they had to work together on common goals.

  • Finding: Group conflict arose quickly but could be mitigated through superordinate goals requiring cooperation.

6. Learned Helplessness

Seligman’s Dog Experiment (1967)

  • Purpose: To study how animals (and by extension humans) develop helplessness after repeated failure or suffering.

  • Setup: Dogs were subjected to unavoidable shocks. Later, even when escape was possible, some dogs didn’t try.

  • Finding: Repeated exposure to uncontrollable stress led to passivity, later applied to understanding depression in humans.


7. Social Identity and Prejudice

Jane Elliott’s “Blue Eyes/Brown Eyes” Exercise (1968)

  • Purpose: To teach schoolchildren about racism and prejudice.

  • Setup: Elliott divided her class by eye color and gave one group privileges, then reversed the roles.

  • Finding: Children quickly adopted discriminatory attitudes, showing how easily social divisions can lead to bias and mistreatment.

8. Altruism and Moral Decision-Making

Good Samaritan Experiment (1973)

  • Purpose: To test whether situational factors influence helping behavior.

  • Setup: Seminary students on their way to give a talk (some on the topic of the Good Samaritan) encountered someone in distress. Some were told they were late, others had time.

  • Finding: Those in a hurry were far less likely to stop and help, regardless of moral training.


These experiments have shaped our understanding of human psychology and social behavior—but many are now cited in discussions about research ethics, as they often involved deception and emotional distress.

Cognitive biases and Manipulation tactics


Cognitive Dissonance, Confirmation Bias, Gaslighting, and the Sunk Cost Fallacy are all examples of cognitive distortions, psychological biases, or manipulation tactics USED BY GOVERNMENTS AND MEDIA that influence human thinking and behavior. These concepts belong to different but overlapping categories in psychology, such as cognitive biases, logical fallacies, defense mechanisms, and emotional manipulation tactics. Below is a list of many commonly recognized ones, each with a brief explanation:


Cognitive Biases

Systematic patterns of deviation from norm or rationality in judgment.

  • Cognitive Dissonance
    When a person holds two conflicting beliefs or behaviors, it causes psychological discomfort, leading them to adjust their thoughts or rationalize their actions to reduce the dissonance.

  • Confirmation Bias
    The tendency to search for, interpret, and remember information in a way that confirms one’s preexisting beliefs, while ignoring contradictory evidence.

  • Anchoring Bias
    Relying too heavily on the first piece of information encountered (the “anchor”) when making decisions, even if it’s arbitrary or irrelevant.

  • Availability Heuristic
    Overestimating the likelihood of events based on their availability in memory—often because they are recent, vivid, or emotionally charged.

  • Bandwagon Effect
    Adopting beliefs or behaviors because many others do, often without critical evaluation.

  • Negativity Bias
    The tendency to give more psychological weight to negative experiences or information than to positive ones of equal intensity.

  • Hindsight Bias
    The “I knew it all along” effect—believing, after an event has occurred, that the outcome was predictable.

  • Fundamental Attribution Error
    Overemphasizing personal characteristics and ignoring situational factors when judging others’ behavior.

  • Self-Serving Bias
    Attributing one’s successes to internal factors (like skill) and failures to external factors (like bad luck).

  • Dunning-Kruger Effect
    People with low ability or knowledge often overestimate their competence, while highly skilled individuals may underestimate theirs.


Logical Fallacies

Errors in reasoning that undermine the logic of an argument.

  • Sunk Cost Fallacy
    The tendency to continue investing in something because of what has already been invested (time, money, effort), even when it’s no longer beneficial.

  • Straw Man Fallacy
    Misrepresenting someone’s argument to make it easier to attack or refute.

  • Ad Hominem
    Attacking the person making an argument rather than the argument itself.

  • Slippery Slope
    Assuming that a relatively small first step will inevitably lead to a chain of related (and usually negative) events.

  • Appeal to Authority
    Arguing something is true because an authority or expert says it is, without critically evaluating the argument itself.

  • False Dilemma (Black-or-White Thinking)
    Presenting only two options when in fact more exist.

  • Circular Reasoning
    When the conclusion of an argument is assumed in the premise (e.g., “I’m trustworthy because I say I am”).

Manipulation & Gaslighting Techniques

Tactics used to confuse, control, or undermine others.

  • Gaslighting
    A form of psychological manipulation in which a person causes another to doubt their memories, perceptions, or sanity—often to gain power or control.

  • Love Bombing
    Overwhelming someone with attention and affection to gain control or make them emotionally dependent.

  • Triangulation
    Bringing in a third party to create rivalry or insecurity in a relationship, often used by narcissists to maintain power.

  • Blame-Shifting
    Deflecting responsibility by blaming others for one’s own actions or failures.

  • Stonewalling
    Refusing to engage or communicate, used to punish or control a conversation or relationship.

Defense Mechanisms (from Freudian psychoanalytic theory)

Unconscious strategies the mind uses to protect itself from anxiety.

  • Denial
    Refusing to accept reality or facts, acting as if a painful event, thought, or feeling does not exist.

  • Projection
    Attributing one’s own unacceptable thoughts or feelings to someone else (e.g., accusing someone of being angry when you are the one feeling anger).

  • Displacement
    Redirecting emotions from a threatening target to a safer one (e.g., yelling at a pet instead of your boss).

  • Rationalization
    Creating logical-sounding excuses or explanations to justify irrational or harmful behavior.

  • Repression
    Unconsciously blocking painful thoughts or memories from awareness.

  • Sublimation
    Channeling unacceptable impulses into socially acceptable or productive activities (e.g., using aggression in sports).

Social & Moral Psychological Effects

  • Bystander Effect
    The more people who are present during an emergency, the less likely any one of them is to help.

  • Moral Licensing
    After doing something moral or good, people may feel “licensed” to do something immoral or self-indulgent.

  • Pluralistic Ignorance
    When people privately reject a norm but wrongly assume most others accept it, leading to inaction or conformity.

  • Groupthink
    When a group prioritizes consensus over critical thinking, often leading to poor decisions.

  • Illusory Superiority
    The belief that one is better than average at various tasks or in personal qualities, which is statistically impossible for most.

  • Just-World Hypothesis
    The belief that people get what they deserve, leading to victim-blaming and rationalization of injustice.

Psychological Operation (Psyop) tactics and identification

This is a summary of a presentation by a best-selling author on behavioural psychology with 20 years in the military as a behavioural expert.

Original presentation:

https://www.youtube.com/watch?v=b3AN2wY4qAM

A psyop is designed to make the masses trust and believe the perpetrators.

Most people have no idea how easily our behaviour and even identity can be taken over.

The psyop FATE model

Focus, Authority, Tribe and Emotion. This is what triggers the mammalian part of our brain that makes all of our real decisions.

Focus

Hijacks your attention through repetition, shocking visuals, and fear-inducing scenarios.

Examples: 24/7 media coverage of a crisis, constant repetition of terms like “unprecedented”.

Antidote: ask “why is this message being pushed so aggressively?” Compare how much airtime different stories are getting. Is there a disproportionate focus?

Authority

Trusted figures may shift their positions unnaturally to support the narrative.

Antidote: watch for “expert panelists” or endorsements from authority figures outside of their expertise.

Tribe

Messages that polarise opinions or create stark “in” groups vs “out” groups, e.g. labelling groups as “scientific” vs “deniers”.

Emotion

Emotional responses suppress critical thinking.

E.g. messages that promote fear, hope, or outrage without any clear, verifiable evidence, or repeated media pictures of empty shop shelves and panic buying dominating every broadcast during a food shortage.

Antidote: look for messages that appeal to emotion without providing facts.

Identifying a psyop

  1. Question novelty. Our brains are wired to focus on sudden or unusual changes, so news of an unexpected crisis triggers this. Antidote: ask “why is this new?”

  2. Look for timing. Is it coming out at the same time as damaging revelations about a Government official, for example?

  3. Look for multiple sources of the same message. Centralised sources create an echo chamber, silencing dissent and narrowing perspective. If multiple media outlets present identical talking points or messages, it is a HUGE red flag. Seek independent commentators.

  4. Identify cognitive dissonance weapons. (Cognitive dissonance is when new information clashes with our identity or beliefs.) Psyops exploit this by creating “micro-agreements” – small, seemingly harmless concessions that shape your thinking. Over time, this leads you to unconsciously modify your behaviour to reduce that internal conflict. Example: “Only good citizens do x, y, z.” If you disagree with that, you risk an identity conflict. Ask “Am I being nudged to identify with another group?”

  5. Look for messages that trigger emotion, e.g. fear – of loss (scarcity), danger, or social (tribal) rejection. Break the script and focus on the facts. Ask “what is the likelihood of this scenario?” Compare the claims with other sources.

  6. Follow the money. Look for sponsorships or political connections tied to the narrative. There is always a trail of beneficiaries.

  7. Analyse the context boundary. Manipulative people shift context to normalise extreme behaviour. Do the responses seem disproportionate compared with other scenarios? Weaponising cognitive dissonance and shifting the context can manipulate people into doing almost anything.

  8. Spot the use of archetypes (e.g. heroes, villains, saviours) and simplistic stories (like good vs evil). Deconstruct the story. Ask who the characters are and what roles they are playing.

  9. Evaluate the frame.
     a) Expectation – what are you expected to do or believe in response?
     b) Belief – what assumptions are being made about you – your values or fears?
     c) Perception – how is reality being shaped, e.g. by using selective facts?
     d) Definition – what “truth” is being asserted as completely unchallengeable?
     e) Information suppression – what topics are being avoided; which critics are being silenced?

  10. Be alert to rapid compliance shifts. Psyops try to get people to change behaviour quickly, before critical thinking kicks in.

  11. Study the timing of the event. Why is this happening now? What other unrelated events or scandals might this be distracting everyone from? Coincidence? Probably not.

  12. Teach your brain to recognise logical fallacies:
      • Use of emotion – fear and panic to bypass logic.
      • Strawman argument – misrepresenting someone’s argument so it can be attacked more easily.
      • Bandwagon fallacy – this must be true because “everyone” supports it.
      • False dilemma – presenting only two extreme options and ignoring alternatives, leading to “you are with us or against us”.
      • Ad hominem attack – attacking the person instead of the argument: “You can’t listen to x because s/he isn’t a scientist.”
      • Appeal to authority – claiming something is true because an “authority figure” says so, even though they may have no relevant expertise or may have a major conflict of interest.
      • Slippery slope – claiming one action will inevitably lead to some extreme negative outcome.
      • Hasty generalisation – broad claims made with very little evidence.
      • Red herring – distracting you with irrelevant information.
      • False equivalence – comparing two things as equal when they are not.

NCI Engineered Reality Scoring System <PDF LINK BELOW>

NCI Psyop Scoring System

 


WHERE DOES THIS TOPIC FIT INSIDE THE BIG PICTURE?

Manipulation tactics are devised at the top but deployed by Governments, Media and Institutions.

This is arguably a topic for the Government or Media section; however, its roots lie in the architects of psychology and manipulation who work beside the major influencers of Governments, Media and institutions.

While in some cases a politician or a journalist may genuinely believe they are working as objective participants, the combination of their own years-long indoctrination and the selection criteria of their industries produces exactly the type of ‘journalist’ or ‘politician’ the controlling class wants to fulfil its agenda.

Former KGB operative Yuri Bezmenov explains how a nation is taken over from within.