INTRODUCTION
To control a population, you first need to know how it thinks, both as individuals and as a collective, before manufacturing a crisis to steer its behaviour.
Over the decades, social experiments have been carried out to measure how most people respond to particular situations. From these results, Governments and those who control them can devise manipulation tactics and execute psychological operations that steer events toward their preferred outcome.
The average person cannot fathom that their own Government would have systems and structures in place to control their behaviour, and ironically, that is exactly why those systems work so well. The more a population trusts them, the more they can get away with.
The KNOWLEDGE BASE below outlines these experiments, cognitive biases, manipulation tactics, and psychological operations so that you can recognise them and engineer your own plan of attack against them.
Social experiments are structured studies conducted to observe how people behave in specific situations—often revealing insights about conformity, obedience, prejudice, cooperation, empathy, or morality. This kind of information about people is terribly important for Governments and those who control them – as it can provide the key to controlling a population.
Below are key examples of influential social experiments, organized by the type of human behavior they explored:
The Milgram Obedience Experiment (Stanley Milgram, 1961)
Purpose: To test how far individuals would go in obeying an authority figure, even if it meant harming another person.
Setup: Participants were told they were assisting in a memory test. They were instructed to administer increasingly severe electric shocks to a “learner” (actually an actor) when incorrect answers were given.
Finding: A majority of participants continued administering shocks, even up to dangerous levels, simply because an authority figure told them to.
The Asch Conformity Experiments (Solomon Asch, 1951)
Purpose: To observe whether individuals would conform to a group’s incorrect consensus.
Setup: Participants were placed in a group with actors who gave obviously incorrect answers to simple visual questions. The real subject answered last.
Finding: Many participants conformed with the group, even when the group was clearly wrong, highlighting the power of peer pressure.
The Stanford Prison Experiment (Philip Zimbardo, 1971)
Purpose: To examine how individuals adopt roles of power (guards) or submission (prisoners) in a simulated prison environment.
Setup: College students were randomly assigned roles as guards or prisoners in a mock prison. The experiment was supposed to last two weeks.
Finding: The guards became abusive and the prisoners distressed. The experiment was terminated after just six days due to ethical concerns.
The Bystander Effect Experiment (John Darley & Bibb Latané, 1968)
Purpose: To investigate why people often fail to help others in emergency situations when others are present.
Setup: Participants overheard someone (an actor) having a seizure. The number of other people the participant believed were present was varied.
Finding: The more people who were believed to be present, the less likely any one person was to help—illustrating diffusion of responsibility.
The Robbers Cave Experiment (Muzafer Sherif, 1954)
Purpose: To explore how intergroup conflict can arise and be resolved.
Setup: Two groups of 11-year-old boys at a summer camp were separated and competed in games. Conflict escalated, but cooperation was restored when they had to work together on common goals.
Finding: Group conflict arose quickly but could be mitigated through superordinate goals requiring cooperation.
The Learned Helplessness Experiments (Martin Seligman, 1967)
Purpose: To study how animals (and by extension humans) develop helplessness after repeated failure or suffering.
Setup: Dogs were subjected to unavoidable shocks. Later, even when escape was possible, some dogs didn’t try.
Finding: Repeated exposure to uncontrollable stress led to passivity, later applied to understanding depression in humans.
The Blue Eyes/Brown Eyes Exercise (Jane Elliott, 1968)
Purpose: To teach schoolchildren about racism and prejudice.
Setup: Elliott divided her class by eye color and gave one group privileges, then reversed the roles.
Finding: Children quickly adopted discriminatory attitudes, showing how easily social divisions can lead to bias and mistreatment.
The Good Samaritan Experiment (Darley & Batson, 1973)
Purpose: To test whether situational factors influence helping behavior.
Setup: Seminary students on their way to give a talk (some on the topic of the Good Samaritan) encountered someone in distress. Some were told they were late, others had time.
Finding: Those in a hurry were far less likely to stop and help, regardless of moral training.
These experiments have shaped our understanding of human psychology and social behavior—but many are now cited in discussions about research ethics, as they often involved deception and emotional distress.
Cognitive Dissonance, Confirmation Bias, Gaslighting, and the Sunk Cost Fallacy are all examples of cognitive distortions, psychological biases, or manipulation tactics USED BY GOVERNMENTS AND MEDIA to influence human thinking and behavior. These concepts fall into different but overlapping categories in psychology: cognitive biases, logical fallacies, defense mechanisms, and emotional manipulation tactics. Below is a list of commonly recognized examples, each with a brief explanation:
Cognitive Biases
Systematic patterns of deviation from a norm or from rationality in judgment.
Cognitive Dissonance
When a person holds two conflicting beliefs or behaviors, it causes psychological discomfort, leading them to adjust their thoughts or rationalize their actions to reduce the dissonance.
Confirmation Bias
The tendency to search for, interpret, and remember information in a way that confirms one’s preexisting beliefs, while ignoring contradictory evidence.
Anchoring Bias
Relying too heavily on the first piece of information encountered (the “anchor”) when making decisions, even if it’s arbitrary or irrelevant.
Availability Heuristic
Overestimating the likelihood of events based on their availability in memory—often because they are recent, vivid, or emotionally charged.
Bandwagon Effect
Adopting beliefs or behaviors because many others do, often without critical evaluation.
Negativity Bias
The tendency to give more psychological weight to negative experiences or information than to positive ones of equal intensity.
Hindsight Bias
The “I knew it all along” effect—believing, after an event has occurred, that the outcome was predictable.
Fundamental Attribution Error
Overemphasizing personal characteristics and ignoring situational factors when judging others’ behavior.
Self-Serving Bias
Attributing one’s successes to internal factors (like skill) and failures to external factors (like bad luck).
Dunning-Kruger Effect
People with low ability or knowledge often overestimate their competence, while highly skilled individuals may underestimate theirs.
Logical Fallacies
Errors in reasoning that undermine the logic of an argument.
Sunk Cost Fallacy
The tendency to continue investing in something because of what has already been invested (time, money, effort), even when it’s no longer beneficial.
Straw Man Fallacy
Misrepresenting someone’s argument to make it easier to attack or refute.
Ad Hominem
Attacking the person making an argument rather than the argument itself.
Slippery Slope
Assuming that a relatively small first step will inevitably lead to a chain of related (and usually negative) events.
Appeal to Authority
Arguing something is true because an authority or expert says it is, without critically evaluating the argument itself.
False Dilemma (Black-or-White Thinking)
Presenting only two options when in fact more exist.
Circular Reasoning
When the conclusion of an argument is assumed in the premise (e.g., “I’m trustworthy because I say I am”).
Manipulation Tactics
Tactics used to confuse, control, or undermine others.
Gaslighting
A form of psychological manipulation in which a person causes another to doubt their memories, perceptions, or sanity—often to gain power or control.
Love Bombing
Overwhelming someone with attention and affection to gain control or make them emotionally dependent.
Triangulation
Bringing in a third party to create rivalry or insecurity in a relationship, often used by narcissists to maintain power.
Blame-Shifting
Deflecting responsibility by blaming others for one’s own actions or failures.
Stonewalling
Refusing to engage or communicate, used to punish or control a conversation or relationship.
Defense Mechanisms
Unconscious strategies the mind uses to protect itself from anxiety.
Denial
Refusing to accept reality or facts, acting as if a painful event, thought, or feeling does not exist.
Projection
Attributing one’s own unacceptable thoughts or feelings to someone else (e.g., accusing someone of being angry when you are the one feeling anger).
Displacement
Redirecting emotions from a threatening target to a safer one (e.g., yelling at a pet instead of your boss).
Rationalization
Creating logical-sounding excuses or explanations to justify irrational or harmful behavior.
Repression
Unconsciously blocking painful thoughts or memories from awareness.
Sublimation
Channeling unacceptable impulses into socially acceptable or productive activities (e.g., using aggression in sports).
Social Psychology Phenomena
Bystander Effect
The more people who are present during an emergency, the less likely any one of them is to help.
Moral Licensing
After doing something moral or good, people may feel “licensed” to do something immoral or self-indulgent.
Pluralistic Ignorance
When people privately reject a norm or belief but wrongly assume most others accept it, leading to inaction or conformity.
Groupthink
When a group prioritizes consensus over critical thinking, often leading to poor decisions.
Illusory Superiority
The belief that one is better than average at various tasks or in personal qualities; by definition, at most half of any group can actually be above the median, so the majority cannot all be right.
Just-World Hypothesis
The belief that people get what they deserve, leading to victim-blaming and rationalization of injustice.
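Whether "most people are better than average" is actually contradictory, as claimed under Illusory Superiority above, depends on which average is meant: in a skewed distribution a majority can sit above the mean, but at most half can exceed the median. A quick check with made-up scores (purely illustrative numbers):

```python
# Illustration with invented numbers: in a skewed distribution most
# values can exceed the mean, but at most half can exceed the median.
from statistics import mean, median

scores = [10, 90, 90, 90, 90]  # one low outlier drags the mean down

above_mean = sum(s > mean(scores) for s in scores)
above_median = sum(s > median(scores) for s in scores)

print(f"mean={mean(scores)}, median={median(scores)}")
print(f"{above_mean} of {len(scores)} above the mean")      # 4 of 5
print(f"{above_median} of {len(scores)} above the median")  # 0 of 5
```

So the illusory part of "illusory superiority" is strictly about the median: a claim that most people beat the midpoint of the group cannot hold, whatever the distribution looks like.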
Psychological operation (psyop) tactics and identification
This is a summary of a presentation by a best-selling author on behavioural psychology with 20 years of experience as a military behavioural expert.
Original presentation:
https://www.youtube.com/watch?v=b3AN2wY4qAM
A psyop is designed to make the masses trust and believe the perpetrators.
Most people have no idea how easily our behaviour and even identity can be taken over.
The psyop FATE model
Focus, Authority, Tribe and Emotion. This is what triggers the mammalian part of our brain that makes all of our real decisions.
Focus
Your attention is hijacked with repetition, shocking visuals, and fear-inducing scenarios.
Examples: 24/7 media coverage of a crisis, constant repetition of terms like “unprecedented”.
Antidote: ask “why is this message being pushed so aggressively?” Compare how much airtime different stories are getting. Is there a disproportionate focus?
Authority
Trusted figures might shift their positions unnaturally to support the narrative.
Antidote: watch for “expert panelists” or endorsements from authority figures outside of their expertise.
Tribe
Messages that polarise opinions or create stark “in” groups vs “out” groups, e.g. labelling groups as “scientific” vs “deniers”.
Emotion
Emotional responses suppress critical thinking.
E.g. messages that promote fear, hope, or outrage without any clear, verifiable evidence, or repeated media pictures of empty shop shelves and panic buying dominating every broadcast during a food shortage.
Antidote: look for messages that appeal to emotion without providing facts.
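The four FATE checks above amount to a simple checklist. As a toy sketch (my own structure, not anything presented in the original talk), a reader’s observations could be tallied like this:

```python
# Hypothetical sketch only: the source describes the FATE model
# (Focus, Authority, Tribe, Emotion) as four warning signs, not as
# an algorithm. This simply tallies which signs a reader noticed.

FATE_QUESTIONS = {
    "focus":     "Is one story getting disproportionate, repetitive coverage?",
    "authority": "Are experts endorsing claims outside their expertise?",
    "tribe":     "Is the message splitting people into in-groups vs out-groups?",
    "emotion":   "Does it push fear, hope, or outrage without verifiable facts?",
}

def fate_score(observations):
    """Count how many FATE warning signs were observed.

    `observations` maps each FATE key to True/False as judged by the
    reader; the score is just the number of True flags (0 to 4)."""
    return sum(bool(observations.get(key)) for key in FATE_QUESTIONS)

# Example: a broadcast with saturation coverage and heavy emotional
# framing, but no obvious authority or tribal angle.
flags = {"focus": True, "authority": False, "tribe": False, "emotion": True}
print(f"FATE warning signs: {fate_score(flags)} of {len(FATE_QUESTIONS)}")
```

The point of scoring rather than pattern-matching a single sign is the same as in the text: any one trigger can appear innocently, but several at once is what the model treats as a red flag.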
Identifying a psyop
Antidote: ask “why is this new?”
Psyops exploit cognitive dissonance by creating “micro-agreements”: small, seemingly harmless concessions that shape your thinking. Over time this leads you to unconsciously modify your behaviour to reduce that internal conflict. Example: “Only good citizens do x, y, z.” If you disagree with that, you risk some identity conflict. Ask “am I being nudged to identify with another group?”
Psyops also lean on emotional triggers, e.g. fear of loss (scarcity), of danger, or of social (tribal) rejection. Break the script by focusing on the facts. Ask “what is the likelihood of this scenario?” and compare the claims with other sources.
Weaponising cognitive dissonance and shifting the context can manipulate people to do almost anything.
NCI Engineered Reality Scoring System <PDF LINK BELOW>
WHERE DOES THIS TOPIC FIT INSIDE THE BIG PICTURE?

This is arguably a topic for the Government or Media section; however, its roots lie with the architects of psychology and manipulation who work beside the major influencers of Governments, Media and institutions.
While in some cases a politician or journalist may genuinely believe they are an objective participant, the combination of their own years-long indoctrination and the selection criteria of their industries produces exactly the type of ‘journalist’ or ‘politician’ the controlling class wants to fulfil its agenda.
Former KGB operative Yuri Bezmenov explains how a nation is taken over from within.