On my recent trip to Cyprus I uploaded a few pictures of what the guide introduced as the Ghost Town on the border between Turkish and Greek Cyprus. The two parts of the island have long been in conflict, and in 1974 Turkey invaded and took nearly half of Cyprus. In the aftermath of the war, the inhabitants of the occupied side were moved out of their homes, and one border town within Nicosia, comprising nearly 4,000 buildings, was left in a high-tension zone. With armed clashes breaking out between the two sides, the whole town remained uninhabitable. It was on 12th July 2014 that the journalist Nick Enoch first used the phrase "Ghost Town" for this area, and the name became popular. The town has nothing to do with ghosts or the supernatural.

Nevertheless, once I posted the pictures on my Facebook page, some of my friends took significant interest and asked me a number of questions, including: "Did you notice any suspicious or paranormal activity there?" This led me to explore the psychology of fear, and I found a very good account of how such fears develop in our minds in Dan Gardner's book titled "Risk". Below are the excerpts on Dan's book taken from the Blinkist app.

Our brains are ancient, so when it comes to risk perception, it’s like using really outdated hardware to run the latest and most sophisticated software.

Human brains underwent a big change in the Stone Age. About 500,000 years ago, they grew from 650 cubic centimeters to 1,200 cubic centimeters. That's only slightly smaller than our current brain size of 1,400 cubic centimeters.

The final jump to 1,400 cubic centimeters occurred 200,000 years ago when the Homo sapiens species was born. DNA analysis has proved that every person alive today can be traced back to a single Homo sapiens ancestor from only 100,000 years ago.

But our brains haven’t changed a lot since then. Agriculture was developed 12,000 years ago and the first cities were built 4,600 years ago. Brain development didn’t progress nearly as fast as our environments did. The world has changed dramatically, but our brains have basically stayed the same.

Take the way we perceive snakes. Everyone is born with a fear of snakes. It’s hardwired into our brains because it helped our ancestors survive and pass on their genes. Even people from places with no snakes, such as the Arctic, have an innate fear of them. Car accidents are a much bigger threat to our safety, but we haven’t evolved to fear cars.

The “Law of Similarity” is another holdover from the past. By the Law of Similarity, humans believe that things are similar when they look similar.

Our brains play some funny tricks on us. Daniel Kahneman won a Nobel Prize for illuminating one of them: we have two distinct brain systems that help us reason, and they produce different results!

The first is called System 1, or the "gut." It runs quickly and without your being conscious of it. You're using System 1 when you intuitively feel something is right or wrong without knowing why.

System 1 operates based on a few simple rules. The Law of Similarity is part of System 1: if something looks like a lion, it’s probably a lion, and you should get away from it.

The problem is that System 1 is often inaccurate and doesn’t adapt well to new situations. Remember the snakes? System 1 tells you to jump in fear even if you see them in a movie and know they can’t cause you any harm.

The strength of System 1 can be illustrated by a simple math problem: let’s say a bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How much does the ball cost?

Most people intuitively answer ten cents because it feels right, even though that’s not the right answer. It’s a simple math problem, but it trips people up because it goes against our gut.
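The arithmetic behind the correct answer is easy to check once you slow down and write it out. Here is a quick sketch in Python (the variable names are mine, just for illustration):

```python
# Bat-and-ball problem:
#   bat + ball = 1.10
#   bat = ball + 1.00
# Substituting the second equation into the first:
#   (ball + 1.00) + ball = 1.10  ->  2 * ball = 0.10  ->  ball = 0.05

total = 1.10       # combined price in dollars
difference = 1.00  # how much more the bat costs

ball = (total - difference) / 2
bat = ball + difference

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # prints: ball = $0.05, bat = $1.05
```

The gut answer of ten cents fails the check: a ten-cent ball would make the bat $1.10, and the total $1.20.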

System 2, or the "head," runs on conscious thought. System 2 is at work when you carefully think through a problem or situation. It reminds us to calm down when we feel frightened of terrorist attacks because it knows they're unlikely to affect us.

But System 2 also has its flaws. First off, it’s slow. Second, it has to be fed by education. We need to utilize our math skills to determine that the ball actually costs five cents.

Our System 1 gut reactions often lead us to make rash decisions because they rely on heuristics. Heuristics are basically cognitive shortcuts that tell us what to do.

The Rule of Typical Things is an example of this. It states that when a question contains information we find typical, our intuition takes over when we formulate an answer.

The Rule of Typical Things can be illustrated by Kahneman’s famous Linda problem. Linda is bright and outspoken. She majored in philosophy, participates in anti-nuclear protests and is passionate about social justice.

Which is more likely: a) that Linda is a bank teller or b) that Linda is a bank teller who is active in the feminist movement?

Eighty-five percent of Kahneman's students answered "b," even though it's clearly wrong. It's more likely that she's just a bank teller than a bank teller and a feminist: option "b" requires everything in option "a" to be true plus an additional condition, so it can never be more probable than "a" alone.
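The conjunction rule can be made concrete with a few lines of arithmetic. This is a minimal sketch with made-up probabilities (the specific numbers are my assumption, not Kahneman's): even if we think it is nearly certain that a bank teller like Linda would be a feminist, the conjunction still cannot beat the single condition.

```python
# Conjunction rule: P(A and B) <= P(A), no matter how plausible B feels.
# The probabilities below are invented purely for illustration.
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.95  # P(feminist | bank teller), deliberately high

# Probability of the conjunction (option "b"):
p_teller_and_feminist = p_teller * p_feminist_given_teller

assert p_teller_and_feminist <= p_teller  # "b" can never be more likely than "a"
print(f"P(a) = {p_teller}, P(b) = {p_teller_and_feminist:.4f}")
```

Multiplying any probability by a factor of at most 1 can only shrink it, which is why adding the "feminist" condition makes option "b" less likely, not more.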

The logic here is simple, but your gut takes over because of the Rule of Typical Things. She studied philosophy, she’s an activist and she cares about social justice, so we assume she’s a feminist.

The Example Rule, also known as the availability heuristic, guides your gut as well. It states that your gut is heavily influenced by how easily an example comes to mind.

“Breast implants cause cancer – I saw it on TV!” Statements or anecdotes like this are more powerful than you might think. Anecdotes are little stories about other people that we use to affirm certain beliefs or feelings.

Powerful anecdotes spread in 1994 when the American media ran several stories about women who had apparently developed connective-tissue diseases because of their silicone breast implants.

The media was flooded with stories of “toxic breasts” and “ticking time bombs,” but there was no real scientific evidence that the implants caused the diseases.

Despite this, the implant manufacturer Dow Corning faced a class action lawsuit that same year. In the end, it shelled out $4.25 billion to women with implants. Over half of the women who had Dow Corning implants registered for the settlement, and the company went bankrupt.

However, a 1994 study found there was no link between implants and connective-tissue disease. Panic had spread, and the company was destroyed, for no logical reason.

Part of the reason we take anecdotal evidence so seriously is that it’s difficult for us to understand numbers and probability. In fact, our innate mathematical skills aren’t much better than a rat’s or a dolphin’s.

Think again before you let a piece of information turn into a fear for you.