Rituals make Christmas merrier

When I was an ankle-biter in a different land and a different time, an old uncle told me to place my shoes in the chimney on Christmas Eve if I wanted a gift from Santa Claus. I’m still angry with the white-bearded, big-bellied man for not bringing me a gift. That was my first encounter with Christmas in a non-Christian family. You may not be a Christian, but you may find it hard to get away from Christmas rituals at this time of the year.

Christmas rituals, religious or otherwise, can be exciting for children, but they also give them a sense of security, stability and affection. They improve the psychological health and wellbeing of adults, too. Christmas cards, carols and crackers might seem like merely lively traditions; however, a review of 50 years of research on rituals suggests that they involve symbolic communication and convey “this is what we are” as a group.

The review by Barbara H. Fiese and her colleagues at Syracuse University suggests that family rituals provide continuity in meaning across generations. “Also, there is often an emotional imprint where once the act is completed, the individual may replay it in memory to recapture some of the positive experience,” she says.

Knocking on wood might not ward off evil spirits, but many everyday rituals are surprisingly effective. The reason rituals deliver real benefits can be traced to our evolutionary history. Before the development of languages, humans related to each other via “symbols”. US anthropologist Terrence W. Deacon argues in his book, The Symbolic Species, that these symbols would have been made up of extended, effortful and complex sequences of behaviours performed in groups – in other words, rituals. Rituals, he says, connected human groups and enabled them to ensure that they had a shared understanding of how the group worked.

Numerous studies show that family rituals – of any shape or form – really help people to get closer to one another. Kathleen Vohs, a psychologist at the University of Minnesota, who is involved in a large-scale online study of rituals, says that rituals help people to share an experience without feeling awkward or forced. What better shared experience could there be than a meal with others? Well, we don’t need psychologists to tell us that the family that eats dinner together stays together.

On the warm feelings of giving and receiving Christmas gifts, UK philosopher Robert Rowland Smith weighs in: “To be rewarded with a gift is to be subtly told that someone loves you, and we love nothing better than to be loved.”

For your wellbeing, unpack your gifts and say thank you to the givers, clink your glasses and share plum pudding (which might not contain any plums) with your family and friends. Merry Christmas!

© Surendra Verma 2018

Trying to remember your name

When older people begin to mutter whatshisname or whatshername, we all assume that’s just what happens in old age. Onomastic aphasia, the medical name for the whatshisname condition (the name is on the tip of your tongue but you can’t recall it), conjures up painful images of dementia: memories of the past slowly slipping away in old age.

As a 75-year-old I’m sure I’m not alone in this: the name will not come to me, however hard I try to recall it. But minutes or hours later, when I’m not consciously thinking about it, the name suddenly appears.

Contrary to popular belief, we do not spend most of our time engaged in goal-directed thought, with only occasional blips of irrelevant thoughts popping up on the radar. The truth is that most of the time we are engaged in less directed, unintended thought, and that state is routinely interrupted by periods of goal-directed thinking. The human brain prefers its default network, a set of regions that remains active when the brain is supposedly doing nothing. But the brain immediately springs into action when some task is required.

When the brain is in the so-called resting state, it is doing a tremendous amount. During its ‘off’ times the brain may not be involved in specific tasks, but it is still busy working out responses to internal thoughts or anticipating what needs to be done in the future. This gives our brain an amazing capacity to multitask. Without multitasking, we’d be pretty constrained creatures, indeed.

The names I try to remember pop up when my brain is in its default mode. How could we tap into the default network of our brains?

To shift instantly into the brain’s default mode, or free-form attention as opposed to on-task focus, all an individual has to do is goof off, advises Lea Waters, founding director of the Centre for Positive Psychology at the University of Melbourne.

In her recent book, The Strength Switch: How the New Science of Strength-Based Parenting Can Help Your Child and Teen to Flourish, she advises parents and teachers to move away from the idea that the more specific and directed the learning, the better. The brain’s default mode has many benefits, including benefits for children’s learning and development.

‘Good goofing off is not texting or talking on the phone, which pulls the child into the external world,’ she advises. ‘It’s about giving a child’s brain the chance to reboot and come back sharper and more attentive when the time arrives.’

Don’t worry if you can’t recall the name of the writer of this article, even after goofing off. Fading brainpower is not an inevitable part of growing older. It’s a myth.

Michael Ramscar, a linguistics researcher at Universität Tübingen in Germany, ascribes the popular belief, in part, to Greek mythology. Eos, the goddess of dawn, begged Zeus to grant immortality to Tithonus, a mortal whom she had married. Zeus agreed to the request, but she forgot also to ask for perpetual youth, dooming Tithonus to an eternity of physical and mental decay. Ramscar remarks that the Tithonus story echoes loudly in the brain-science literature, which portrays old age as a protracted episode of mental decline, in which memories dim, thoughts slow down and problem-solving abilities diminish.

He suggests that many of the assumptions scientists currently make about ‘cognitive decline’ are seriously flawed and, for the most part, formally invalid. He agrees that our brains work more slowly in old age, but only because we have stored more information over time. ‘The brains of older people do not get weak,’ he says. ‘On the contrary, they simply know more.’ Older brains are so jam-packed with knowledge that they simply take longer to retrieve the correct bits of information. This brimming store of knowledge helps older brains to compensate for any loss related to ageing.

Ramscar also has an explanation for the whatshisname condition. There is a greater variety of given or first names now than there was two generations ago, which means the number of different names we learn over our lifetimes has increased dramatically. Locating a name in memory, therefore, is far harder than it used to be. It’s true even for computers; that’s why we need supercomputers.

© Surendra Verma 2017

Fighting the battle of the bulge

Different types of food produce different amounts of energy.

Our awareness of the link between body weight and food intake began in 1896, when Wilbur Atwater (1844-1907), an American agricultural chemist, showed that different types of food produce different amounts of energy and that the energy value of a diet could be measured in food calories (or kilojoules, if you prefer the metric unit).

In Atwater’s time, knowledge of nutrients and their functions was very limited: it was known that carbohydrates and fats provide energy and that proteins build and repair tissues, but vitamins were unknown, and only a few minerals, such as calcium and phosphorus, were recognised as essential, their roles unclear. With his colleague E. B. Rosa, a physicist, Atwater developed a calorimeter to measure the calorific value of different foods. Atwater’s measurements mark the beginning of the quest for a scientific understanding of nutrition.

In 1919 American scientists J. Arthur Harris and Francis Benedict devised an equation for calculating how many calories we need to consume each day. The Harris-Benedict equation estimates a person’s basal metabolic rate, or BMR – the energy the body burns at rest – by taking into account age, gender, height and weight (Google ‘Harris-Benedict equation’ to find online calculators that will work out your BMR for you).
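
For anyone curious about what those online calculators do under the hood, here is a minimal sketch in Python using the original 1919 Harris-Benedict coefficients (a later revision uses slightly different numbers); the 70 kg, 170 cm, 75-year-old man in the example is purely hypothetical.

  # Basal metabolic rate (kcal per day) from the original 1919 Harris-Benedict equations.
  def harris_benedict_bmr(weight_kg, height_cm, age_years, sex):
      if sex == "male":
          return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_years
      if sex == "female":
          return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_years
      raise ValueError("sex must be 'male' or 'female'")

  # A hypothetical 75-year-old man, 70 kg and 170 cm tall:
  print(round(harris_benedict_bmr(70, 170, 75, "male")))  # about 1373 kcal per day

The resting figure is only the starting point; the calculators typically multiply it by an activity factor to arrive at a recommended daily intake.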

Once you know your BMR, fighting the battle of the bulge is a simple task: all you have to do is strike a balance between two variables, diet and exercise. In her book The Calculus Diaries, science writer Jennifer Ouellette turns the simple arithmetic of diet and exercise into calculus by introducing another variable: ‘tastiness’. To Ouellette, ‘tastiness’ is ‘the pleasure we derive from our food intake, given a fixed number of calories we can consume per day and a fixed amount of money we can spend on groceries.’

‘So if we know what we’re eating each day now, what small change can we make in our diet to optimize how much we enjoy mealtimes?’ she asks. Recording the small, incremental change she recommends requires a graph pad, a pencil, a good knowledge of calculus and singing ‘You take a function of diet and you call it yummy’ to the right tune. Wouldn’t you rather just eat that chocolate doughnut now?

© Surendra Verma 2019

Pour nothing but water down the sink

Mrs Beeton’s kitchen rules

The eldest of 21 (yes, 21) siblings, Isabella Mary Mayson was born in London in 1836. She married Samuel Beeton, an ambitious young publisher, when she was barely 20 years old. Soon after marriage she started writing articles and books on cookery and household management. She died young, at the age of 28, in 1865. She is remembered now for classics such as Mrs Beeton’s Book of Household Management (published in 1861, it is one of the great publishing successes of all time; nearly 2 million copies had been sold by 1868) and Mrs Beeton’s Every Day Cookery and Household Book (1862).

Mrs Beeton believed that if novices committed her culinary maxims to memory, they would have before them the fundamental truths of the art of cookery. A selection:

  • A good cook looks ahead … there is no work like early work.
  • Muddle makes more muddle.
  • Dirty saucepans filled with hot water begin to clean themselves.
  • Wash well a saucepan, but clean a frying-pan with a piece of bread.
  • Thrust an oniony knife into the earth to take away the smell.
  • Pour nothing but water down the sink.
  • Green vegetables should be boiled fast with the lid off.
  • Fish boiled should be done slowly, with a little vinegar.
  • Water boils when it gallops, oil when it is still.
  • A stew boiled is a stew spoiled.
  • One egg, beaten well, is worth two not beaten.
  • Draw fresh water for the kettle to boil for tea, cocoa, or coffee.
  • Make the tea directly the water boils.

The first ‘domestic goddess’ (a moniker made famous by Nigella Lawson) was also the first to support worry-free, free-range chickens when she warned: ‘Never eat a depressed chicken.’

© Surendra Verma 2019

Make children intelligent science consumers

Children may turn into copy-and-paste (from online to mind) information junkies if we don’t teach them to think scientifically.

Fake news works because our minds are lazy; they subconsciously rely on shortcuts to make quick decisions, accept too much at face value, and assume that whatever is familiar is also safe.

Our minds are not blank slates; they are eager to assimilate new information into their world view, a projection of the self. What we call the self is simply a story, a story that we continuously write and rewrite in our minds.

How would a secondary school student (let’s call her Jane, since “familiar is safe”) assimilate new information if she came across anti-vaccination Facebook pages carefully curated by the parents of one of her friends, who are anti-vaccine campaigners? The pages present a well-argued claim linking childhood immunisation to autism. The claim quotes a research paper published by a British doctor in 1998 in The Lancet, a prestigious medical journal, which suggested that the measles, mumps and rubella (MMR) vaccine was increasing autism in British children.

Jane is fluent in social media, but that doesn’t mean she is also an expert in judging the credibility of online information. She cannot dismiss this false claim as hogwash because she doesn’t know that the Lancet article was an elaborate lie and was retracted by the journal in 2010. A slew of studies now shows that vaccines are safe, effective and save lives. Besides, vaccination is not a personal choice; it’s a social responsibility.

The way Jane’s impressionable mind filters and shapes the anti-vaccination information will be influenced by her immediate belief that the “facts” come from a familiar source, a friend’s parents, and by her accepting that “authoritative source” at face value. Behavioural scientist Daniel Kahneman would describe Jane’s experience as the phenomenon of “what you see is all there is”, the cognitive laziness of assuming that the facts to hand are all the information you need.

In his best-selling book Thinking, Fast and Slow, Kahneman (who won a Nobel Prize in economics, despite never having attended an economics class at school or college) talks about the “halo effect”: the way first impressions can overwhelm subsequent information. The anti-vaxxers have gained one more recruit for their pseudoscientific cause, as the myth is likely to persist in Jane’s mind even when she reads about the newer research.

Science education is not merely a matter of teaching new ideas; more important is teaching how those ideas come to be accepted by scientists. A sensational news item about a new research finding that students may read in the popular media doesn’t mean the result has been automatically stamped “proven by science”. Science advances unpredictably, not linearly in a series of eureka moments; one scientific study often disputes another, sometimes followed by a third that contradicts both. An idea is labelled truly “scientific” only when it has earned a consensus among the majority of scientists in that particular field. Even then it can be challenged by other scientists. Theories of science are continually being added to and updated.

Like most of her friends, when our hypothetical Jane gets out of bed, she reaches for her smartphone and checks her social media feeds. There is no time to process and synthesise new information; it’s all copy (from online) and paste (on mind).

Students all run the risk of turning into copy-and-paste information junkies if they don’t learn to think scientifically, that is, to think critically. If, in their science classes, students discuss topics with each other and get frequent, targeted feedback, they tend to do better. Parents and teachers should create this kind of learning environment even before kids learn to tie their shoelaces. It is easy, as young children are naturally curious and frequently ask how and why questions. These children will grow up free from naïve intuitions and false beliefs.

In an article, “Online and Scared”, in The New York Times, Thomas L. Friedman, a renowned commentator who has won three Pulitzer Prizes, writes that the mass of our interactions has “moved to a realm where we’re all connected but no one’s in charge”. He suggests teaching children that “the internet is an open sewer of untreated, unfiltered information, where they need to bring skepticism and critical thinking to everything they read and basic civic decency to everything they write”.

Interestingly, a recent research study, “Analytical Thinking Promotes Religious Disbelief”, published in the respected journal Science, shows that, regardless of their religious background, subjects who were encouraged to adopt an analytical stance in solving problems reported significantly reduced religious convictions compared with people who didn’t receive the same cues.

Don’t let your lazy mind fall under a “halo effect” and assume that the results of one scientific experiment will make you lose your religious faith. Scepticism and critical thinking will let you examine your intuitions and beliefs in different ways.

© Surendra Verma 2019

Turn up the wattage of your smile

Smiling may not make cartoons funnier, as a recent research study claims, but smiling is definitely good for your wellbeing.

Try this little trick: look at a cartoon in this newspaper. Did you find it amusing? Now grab a pencil, pen or chopstick, hold one end between your teeth so that it doesn’t touch your lips, and look at the cartoon again. Science suggests that you will now find it more amusing.

Holding a pen between the teeth induces a smile, while holding it between the lips induces a frown. The muscles that make you smile work very hard, giving your brain a rush of endorphins, the hormones responsible for a general sense of wellbeing. Smiling is like laughing, singing, listening to good music, meditating or even eating chocolate, all of which help release feel-good endorphins in the brain.

The decades-old idea that facial expressions can influence a person’s emotional state – that people find cartoons funnier if they are surreptitiously induced to smile – has now been challenged by researchers at the University of Amsterdam. Their experiments on 1,900 participants found no difference in the way people with pen-induced smiles or frowns rated the cartoons.

Fritz Strack, the German psychologist who in 1988 first experimented on people looking at cartoons from Gary Larson’s classic series The Far Side, disagrees with the new findings. As psychologists debate the issue, there are still reasons to turn up the wattage of your smile, with or without the help of a pen.

A standard smile uses only the muscles surrounding the mouth, while a genuine smile produces a spontaneous expression of positive emotion by engaging the muscles surrounding both the mouth and the eyes.

Using the classic pen-in-mouth trick, researchers at the University of Kansas asked participants to create standard or genuine smiles and then work on multitasking activities that, unknown to them, were designed to be stressful. Participants with genuine smiles had lower heart rates while recovering from the stressful activities, and they reported a smaller decrease in positive affect than other participants.

“The next time you are stuck in traffic or are experiencing some other type of stress, you might try to hold your face in a smile for a moment,” advises Dr Sarah Pressman, the lead researcher. “Not only will it help you ‘grin and bear it’ psychologically, but it might actually help your health as well.”

Other studies have shown other engaging effects of a genuine smile. In a 30-year study, researchers matched college yearbook photos of women who displayed a genuine smile with personality data collected when those women were at ages 27, 43 and 52. They found that these women had higher levels of wellbeing and marital satisfaction three decades later, when they were in their early 50s.

It’s fascinating to know that a photograph captures not only the passing emotions of the moment but also a glimpse of the future. Smile naturally and brightly when a camera lens is staring at you.

© Surendra Verma 2017

The dilemma of turning 75

I turned 75 recently, and I don’t feel 75. Though there are some irritating senior moments of the whatshername type, the old mind still bubbles with ideas as it did 61 years ago, when I published my first science piece in a national newspaper.

Should I thank medical science or the genes I have inherited?

The genes are not all perfect. Some of the rogue ones I can blame for the type 2 diabetes I have stoically challenged for nearly 25 years. When I watch the nurse struggling to find a vein in my arm to draw blood for testing, I know I can never win against their single-minded determination to instruct murderous molecules to attack the blood vessels in my body.

Philip Larkin was right when he wrote:

They fuck you up, your mum and dad.
They may not mean to, but they do.
They fill you with the faults they had
And add some extra, just for you.

Advances in genetics have side-lined the old idea of nature versus nurture, proposed in 1874 by Francis Galton. But both nature (destiny as encoded in genes) and nurture (freedom, as in our lifestyles and beliefs) shape our lives.

I was a schoolboy when I discovered the motto of the sundial (“I record only hours of sunshine.”) and a saying of Epictetus (“Before we try to control our circumstances we have to control ourselves first; and nothing lies completely in our power except our judgments, desires and goals.”)

These two bright beliefs have positively nurtured my mindset. Would they be enough when I face the surest thing in life: death?

We are now living longer but not necessarily healthier, as ageing cells put the brakes on our mental and physical abilities. In spite of all the medical advances, there is no getting away from the pain that Epicurus, 2,300 years ago, advised us to fear – rather than death.

It’s evolution that inculcates our desire to live. It’s also evolution that cruelly takes this desire away from some of us, for it has failed to equip us with a mechanism to cope with excruciating, continuous pain. When that pain becomes unbearable, it’s understandable – to me at least – that the desire to end life overtakes millions of years of genetic programming designed to perpetuate it.

Ezekiel J. Emanuel, an American oncologist and bioethicist, says that 75 years is all he wants to live: “By the time I reach 75, I will have lived a complete life … I will have pursued my life’s projects and made whatever contributions, important or not, I am going to make.” In The Atlantic magazine he makes a compelling argument that society and families will be better off if nature takes its course swiftly and promptly.

Emanuel, 57 when he wrote the article, says that he won’t be actively ending his life when he turns 75 but won’t try to prolong it, either. He plans to stop getting any regular preventive tests, screenings and interventions: “Today, when the doctor recommends a test or treatment, especially one that will extend our lives … The momentum of medicine and family means we will almost invariably get it.”

The ethical questions I face at 75 are not only about dying with dignity when pain becomes unbearable but also about prolonging life beyond a certain age.

© Surendra Verma 2017