It’s Less Toxic Than Table Salt!

How many times have you heard “it’s less toxic than table salt!” as a reason that something’s safe? But really, how safe is this commonly used ‘benchmark’ for toxicity?

It is widely accepted and proven through many studies that excessive salt intake can contribute to, and often cause, hypertension (high blood pressure). Although a genetic susceptibility to high blood pressure likely also plays a role, high salt intake is widely accepted to be influential. Other studies have also shown positive correlations between high salt intake and strokes and gastric cancer, even in the absence of the high blood pressure that itself contributes to strokes and other medical issues.

Excess sodium in the blood is called hypernatremia in medical terms, and it is characterised by the balance of sodium and water being thrown massively towards the ‘salt’ end. Most of its effects are caused by the shrinking of cells, since water leaves the cells to maintain the balance of salt and water. This causes a multitude of problems, including confusion, seizures, coma, and even death if the situation is not corrected.

Thankfully, we have an inbuilt mechanism for preventing poisoning from excessive salt – we get thirsty. If you eat salty food you get thirsty and increase your fluid intake as well, which allows the cells to keep most of their water inside, protecting you against severe hypernatremia and its unpleasant effects.

LD50 is a commonly used measure of toxicity: the dose of a substance that, when ingested, kills 50% of the test sample (it is usually determined in laboratory animals and extrapolated). The quoted LD50 for table salt is 12,357 mg/kg – in other words, if you fed 100 people 12.357 g of salt for every kilogram of their body weight, around 50 of them would be expected to die as a result.
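The arithmetic behind that claim is just a multiplication. A minimal sketch, using the LD50 figure quoted above (the 70 kg body weight is an assumption for illustration):

```python
# Rough lethal-dose arithmetic from an LD50 value.
# The LD50 figure is the one quoted in the text; real values
# are determined in animal studies and vary between species.

def lethal_dose_grams(ld50_mg_per_kg: float, body_weight_kg: float) -> float:
    """Dose (in grams) at which ~50% of subjects would be expected to die."""
    return ld50_mg_per_kg * body_weight_kg / 1000  # convert mg to g

# For a 70 kg adult, using the table-salt LD50 quoted above:
print(lethal_dose_grams(12_357, 70))  # -> 864.99, i.e. nearly a kilogram of salt
```

Put that way, the scale of the number becomes obvious: you would need to swallow close to a kilogram of salt in one sitting.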

Compare this to restricted-use pesticides: 29–42% of these are ‘less toxic than table salt’ – that is, they have a higher LD50 than the salt we eat without thinking – and one of them is glyphosate. Even acetic acid – found in vinegar – has a lower LD50, and so a worse acute toxicity profile, than table salt.

So, unless you eat a lot of salt, it’s not likely to kill you. Even lower amounts may cause other problems, but a lethal overdose really isn’t that high a risk. So maybe the expression isn’t that wrong, even if it is a little misleading.


The War on GMOs

Google “Genetically modified organism” or “GMO” and you get a lot of results – around 5.5 million, actually. The majority of the first page, apart from the obligatory Wikipedia and news articles, consists of anti-GMO websites. Not really the best way of getting unbiased information.

A quick look through these websites proved that most of them were presenting only a very limited amount of information, all of which was – surprise, surprise – showing how genetic modification is a bad thing, one even going so far as to say that the process “creates unstable combinations of plant, animal, bacteria, and viral genes.”

Sure, in some cases genetic modification does mix genes from different species, such as the modification that inserted viral DNA into papayas and subsequently saved Hawaii’s papaya industry from the ringspot virus, but that’s not always the case. There are a number of cases where plants have been made more resistant to extreme conditions by inserting genes from the same species – really just speeding up the selective breeding that has been going on for as long as we’ve been farming.

Even in cases where genes from different species have been mixed, the result isn’t guaranteed to be a mutated affront to nature. Take Golden Rice, for example: a type of rice transformed with genes from daffodils and the soil bacterium Erwinia uredovora so that it produces beta-carotene. Multiple field tests have shown that it poses no risk to human health; instead it can protect some of the poorest people from the blindness and even death that result from vitamin A deficiency – that is why it was developed, after all. The largest obstacle to actually distributing it has been the anti-GMO groups who have attempted to sabotage progress at every turn.

Admittedly, I’m pro-GMO, within reason. Sure, it needs to be tested thoroughly to make sure there are no unintended consequences, but it’s also a brilliant way of optimising agriculture and improving quality of life for hundreds of thousands of people. If we can make farming more efficient and cheaper, and make crops produce more per harvest and resist adverse conditions better, that could be a massive step towards reducing the number of people without enough food. In addition, we can make the crops we do have more nutritious, as in the case of Golden Rice.

Just because something is different doesn’t mean it’s dangerous. Genes don’t hurt people. Some of the pesticides that can be avoided through genetic modification, though – those are dangerous, and in many cases the amount of pesticide used can be cut through the technology. All in all, the anti-GMO protesters are doing their job well – they’re convincing people that all forms of genetic modification are potentially lethal threats to human life.

And that’s not really very fair, considering all the good that can be done by the technology.

Big Science

Which is worth more: membership of CERN (home of the Large Hadron Collider) or over 300 lifetime academic positions?

It’s questions like this that come up every time big science is discussed – big projects cost big money and, since there’s never unlimited money, that means big decisions.

For some context:

  • The entire Royal Society of New Zealand Marsden Research Fund allocation in 2015 was $59.8 million including GST.
  • Funding for the NZ supercomputer network totals $48.4 million.
  • The Hubble Space Telescope cost over US$2.5 billion for construction alone, with cumulative costs estimated at over US$10 billion.

Compared to these, the rough cost of an academic’s salary over an entire 40-year career is only about $5 million.

Big science has the possibility of big results, but it also requires an absolutely enormous investment without any guarantee of when those results will come, if at all. A major criticism of big science is the drain it imposes on funding allocations. For example, Denmark – a country with a relatively small population, like New Zealand – contributed 1.6% of the money required for CERN to run. That’s about $5.5 per person, or approximately 31.3 million NZD, for just 1.6% of what was needed. That’s over half the government money put into research in New Zealand. Just think what else could be done with that much funding.
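The per-person figure is simple division. A quick sanity check using the NZD contribution quoted above (Denmark’s population of roughly 5.7 million is my assumption):

```python
# Back-of-the-envelope check on the per-person figure quoted above.
# The population estimate (~5.7 million) is an assumption; the NZD
# contribution figure comes from the text.

contribution_nzd = 31_300_000   # Denmark's ~1.6% share of CERN's running cost
population = 5_700_000          # approximate population of Denmark

per_person = contribution_nzd / population
print(round(per_person, 2))  # -> 5.49, close to the $5.5 quoted
```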

Big science has definite opportunities but those opportunities come at both a massive financial cost and an opportunity cost – since the money wasn’t put into other projects, results from those have been delayed or lost completely in favour of big science projects.

Therein lies the controversy over big science.

Do we fund these projects and collaborate with other countries on a worldwide effort, or do we put our money into more, smaller projects which may produce more diverse and New Zealand-centred results?

That’s up for debate.

Scientific Funding

When there’s only so much money to go around, how do you decide who to give it to?

The question remains the same in any situation: unless you have unlimited money (which has never actually happened), you have to prioritise what is most ‘worth’ spending it on.

In science this is done by a group of reviewers who determine which of the proposals they receive are worthy of the funding requested. They have a whole set of criteria they use to rank each proposal, and the top-ranked ones get funding to go ahead. The criteria differ depending on the group but often include things like ‘viability’, ‘demand’, and ‘how much will be achieved for the cost’.
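The ranking process can be thought of as a weighted scoring exercise. A minimal sketch – every criterion name, weight, proposal, and score below is invented purely for illustration:

```python
# Toy version of a funding panel's ranking: each proposal is scored
# against weighted criteria and the list is sorted by total score.
# All names, weights, and scores are invented for illustration.

CRITERIA_WEIGHTS = {"viability": 0.4, "demand": 0.3, "value_for_cost": 0.3}

proposals = {
    "Soil microbiome survey":    {"viability": 8, "demand": 6, "value_for_cost": 7},
    "Orbital telescope upgrade": {"viability": 5, "demand": 9, "value_for_cost": 4},
    "Coastal erosion model":     {"viability": 7, "demand": 7, "value_for_cost": 9},
}

def total_score(scores: dict) -> float:
    """Weighted sum of a proposal's scores across all criteria."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

ranked = sorted(proposals, key=lambda name: total_score(proposals[name]), reverse=True)
print(ranked)  # highest-ranked proposals get funded first
```

Real panels deliberate rather than just summing numbers, of course, but the underlying idea – scoring against agreed criteria and funding from the top down – is the same.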

Governments allocate a certain amount to science funding and so also go through this selection process. Some have now released challenges or focuses for areas that their budget will fund – typically areas of importance or special relevance to the country in question. New Zealand released the National Science Challenges, with 10 announced in May 2013 and an eleventh in September 2014.

These challenges are designed to tackle problems or concerns of special significance to New Zealand and in keeping with our reliance on our ‘clean, green’ image are the Sustainable Seas and Our Land and Water challenges. Others such as High-Value Nutrition, A Better Start, Ageing Well, Healthier Lives, and Building Better Homes, Towns, and Cities, are more socially focussed and intend to improve the way New Zealanders live.

This idea isn’t specific to New Zealand; the UK also has its own set of focuses to which it is allocating funding.

These ‘challenges’ or ‘focus areas’ or whatever you want to call them are intended to provide, well, focus to the funding allocation – instead of spending a lot of the budget on something with very theoretical applications, the government has decided that money is better spent on addressing social issues that will benefit far more people.


Computer Modelling

What do astrophysics, economics, psychology, weather forecasts and noise barriers have in common?

This wonderful thing called Computer Modelling.

Computer models are a great way of getting information on what could happen before it actually does. Climatology uses models to analyse and predict climate change based on current and potential future scenarios. Economists use them to predict changes in stock markets and determine ‘smart investments’. Models are used in the development of things such as noise barriers, roadways, and buildings (among others) in order to ensure that they will perform their function as needed.

When talking about computer modelling, a lot of people think about the picture or the information that comes out that is used to make a decision. That’s not actually the model – that’s the simulation that ran based off the model. The actual model is the collection of equations and code that makes the simulation work. The reliability of the simulation and its results comes from the validity of the model – if the model doesn’t take into account some important factors, the simulation isn’t going to give reliable results.

Take a really common example – driving somewhere. In order to get a result that’s going to be anywhere near accurate about how long it will take to get there, you have to take into account a whole lot of factors:
– Speed limits
– Traffic
– Road closures
– Weather conditions
Among others that would take too long to list. If you miss even one of these variables, the simulation isn’t going to give a reliable result and you will probably end up being late. There are so many different things to take into account if you’re creating a model, and they’re specific to the simulation you want to run.
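The factors listed above can be sketched as a toy model: a collection of equations (here, one function) that a simulation can then run with different inputs. All the speeds and adjustment factors below are invented for illustration:

```python
# Toy travel-time model: start from distance / speed limit, then apply
# adjustments for the factors listed above. All numbers are invented.

def travel_time_minutes(distance_km: float, speed_limit_kmh: float,
                        traffic: float = 1.0,    # 1.0 = free-flowing, 2.0 = heavy
                        detour_km: float = 0.0,  # extra distance from road closures
                        bad_weather: bool = False) -> float:
    """Estimate door-to-door travel time in minutes."""
    base_hours = (distance_km + detour_km) / speed_limit_kmh
    hours = base_hours * traffic
    if bad_weather:
        hours *= 1.2  # assume rain or snow slows you by ~20%
    return hours * 60

# One simulation run of the model: 30 km at 80 km/h, moderate traffic,
# a 5 km detour, in the rain.
print(round(travel_time_minutes(30, 80, traffic=1.3, detour_km=5, bad_weather=True)))
```

The function is the model; each call with particular inputs is a simulation run. Leave out one of the parameters – say, the detour – and the answer changes, which is exactly why an incomplete model gives unreliable results.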

A scenario like global climate modelling is a massively scaled-up version of that. You not only have to take into account past and present data; there are also different scenarios following potential paths that depend on action that may or may not be taken. But the use of these models is to determine what action should be taken, so this added complexity is the whole point of creating the model.

Computer models can be used for absolutely anything you like – from figuring out how long it’ll take you to drive to a friend’s house to determining what the climate might be like in 300 years. They underlie many important areas of life and research although not everybody knows how they work or what they even are.


How to Think

It seems like a bit of a weird concept – teaching someone how to think – but it’s actually a valuable skill to learn. We all think, every day. One of the biggest problems with that is that a lot of these thoughts aren’t clear – they’re tainted by bias, lack of information, prejudice, or sometimes they’re just incomplete. So thinking, although we do it every day, isn’t actually all that simple. That’s where critical thinking comes into it.

Critical thinking is a style of thought that hopes to overcome some of the problems with just thinking. With normal thinking it’s common to select information that confirms a prior belief, rather than looking at broader arguments that may contradict it. And, while that may make you feel good about your ‘knowledge’, it can also even further develop your own bias and prevent you from learning and developing.

Some of the things that seem to separate critical thinking from just thinking are based on emotion. Critical thinkers tend to look for reason and supporting evidence rather than emotion and are more concerned with finding the best explanation than being ‘right’. Similar to this, people who are thinking critically consider their own motives and bias and recognise where their own beliefs may skew any conclusions. There are a lot of other things that are a part of critical thinking – judgement, discipline, open-mindedness – but a lot of these are based off the idea of distancing yourself from your prior beliefs and preventing these from impacting on your conclusions.

Almost by definition, critical thinkers are sceptical. They want to see hard facts to support something, rather than an emotional argument. They will say why and want to know how you know something is true, rather than just accepting that it is. Most importantly, they keep their minds open to new ideas and consider something they encounter rationally rather than emotionally or with prejudice.

But why bother with all that?

Well, critical thinking can keep you from being sucked in by others’ bad reasoning or your own prior beliefs. Consider: someone tells you that there’s just been a massive drop in the number of pigeons at the feeder in their backyard, and therefore something must be done to save the birds from whatever cats, dogs, or miscellaneous other predators in the area must be hunting them. A non-critical thinker (who doesn’t want to see pigeons disappear) may think ‘oh, that’s terrible, we should do something to stop them disappearing completely.’ The benefit of being a critical thinker is not doing that. A critical thinker would stop and, well, think. Instead of immediately following the conclusion given to them, they would consider it for themselves. Did the decrease occur over time, or was it a single-day occurrence that could be due to other circumstances? Are there a lot of predators in the area that would be a threat to the birds? I know for certain that my pet cat wouldn’t dare try to take down a pigeon – she’s not big or strong enough. Sure, a critical thinker may reach the same conclusion (that something has to be done about the problem), but they can be a lot more sure of their reasoning than the non-critical thinker.

There are a lot of other situations where critical thinking is important – science, co-operative reasoning, even just reading the news. Life can be made a lot easier if you learn to think critically and follow the facts, rather than just what’s handed to you.

The Horrors of Communication

What would you generally associate with a ‘scientist’? Personality traits? Skills? Interests?

Well of course there’s science, and a quick Google search gave me some other suggestions of what most people think. Typical personality traits that are associated with scientists – analytical thinking, intellectual honesty, curiosity, and focus came up a lot. Something that I didn’t see at all, however, was communication.

That’s a bit of a problem actually. Science revolves around discoveries but without communication there isn’t a way to get those out to others. Now very few people would argue that scientists aren’t good at communicating with each other, it’s only when the general public and media become part of the equation that things get a bit more scary. Although there are scientists who are perfectly comfortable and good at talking to the media and explaining science to those without their expertise, many more are not.

One big thing that I’ve heard is that scientists are afraid of the terrible headlines that come up every now and then. The media has a reputation for distorting an interview or press release in order to make a more interesting story and for many scientists this can be a reason to avoid talking to them. Sure, some journalists probably do this. Certainly a spin is often put on a story to make it more saleable, but most journalists are perfectly happy to present an accurate, factual story – as long as it’s still interesting. It helps if the journalist is willing to focus on the same angle as the scientist wants, of course, but it’s something that goes both ways. Scientists can make sure that they stick to the implications, and the aspects of their work that will be interesting to everyone who isn’t an expert in the field, and in return hopefully journalists will write a story to show off the research. This type of relationship between science and the media can give some amazing results with neither group having to sacrifice what’s important to them.

Two other problems that seem to arise when scientists talk to the media are overuse of jargon and expecting others to interpret the data. Jargon is great for a technical talk. It gets the point across fast and accurately and is often the correct term to use. It can also isolate you from the people you’re trying to talk to, make you seem pretentious and untrustworthy, and prevent non-experts from understanding the topic. Using jargon can be good, but generally it’s better left for other scientists and replaced with more commonplace language when talking to the rest of the world. Likewise data points can prove fascinating for others with the same very specific interest but the majority of the world doesn’t want to put the time and effort into understanding a graph or raw data. Summaries are your friend. Explaining, rather than simply showing, results is going to get a way better result and probably get a lot more positive attention for the work.

Not all science communication needs journalists though. This, blogging, is a quickly growing method of science communication where it’s possible to write about your own work in your own words and make sure that your meaning comes across. Similarly a rapidly increasing number of scientists are turning to things like Twitter to talk about their findings to the rest of the world. Both of these remove the risk of your words being misinterpreted or twisted for someone else’s purpose and let you get your work out there just as you want to.

So, for all scientists out there, go for it. Communication isn’t that big a deal, it’s just a case of learning how to make it accessible to everyone. The rest of the world isn’t really that scary.

The Art of Codes

Uml dzhfj tc lrxhkaulj.

Can you read that?

If so, then well done – I probably should have used a better code – but if not then it did its job. That’s the whole point of cryptography, to communicate in a secure way that outsiders aren’t able to interpret.

But, what if the message is intended for you? Well that’s where the key comes in. If you have the key to my code then you should be able to decipher it, and suddenly the jumbled nonsense becomes perfectly readable.
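A minimal sketch of the idea, using a simple Caesar shift – far weaker than real encryption, and not necessarily the cipher used for the message above. Here the ‘key’ is just how far each letter is shifted:

```python
# Minimal Caesar cipher: the 'key' is the number of positions each
# letter is shifted around the alphabet. A toy, not real cryptography.

def shift(text: str, key: int) -> str:
    """Shift every letter by `key` places, leaving other characters alone."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)  # spaces and punctuation pass through unchanged
    return "".join(out)

secret = shift("My code is unbreakable.", 7)
print(secret)             # jumbled nonsense to anyone without the key
print(shift(secret, -7))  # with the key, it reads perfectly again
```

Decrypting is just applying the shift in reverse, which is exactly the ‘suddenly the jumbled nonsense becomes perfectly readable’ step: without the key you see gibberish, with it you recover the message.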

The world is encrypted.

Believe it or not, the modern world basically runs on codes – sure, more sophisticated ones than I used, but the same idea. The online world uses encryption to protect identity, sensitive details, and (hopefully) any other information that you give to a website. The last time you entered your credit card number to buy something online, the information was encrypted before being sent and stored, in order to keep it secure and prevent someone misusing it. To access the information again, the key is used, as before, to ‘translate’ it back into a readable form. This protects your information from being stolen.

Even if computers aren’t your thing, cryptography is still relevant to you. For thousands of years ‘classical cryptography’ has been used to protect information. Throughout the history of warfare it has been used to send messages that are safe from interception by the enemy. Even in ancient times (as far back as 1500 BC) there have been examples of people hiding valuable information in code. And let’s not forget all the notes that get passed in class in a ‘secret language’ so that the teacher can’t get you in trouble for whatever you said.

Cryptography isn’t just for the written word though. Any item that is decorated or designed to appear to have a purpose other than its real one can be thought to be in code – its true purpose (much like the true meaning of an encoded sentence) is hidden from viewers who don’t have the ‘key’.

Cryptography is everywhere; it has a hand in pretty much all aspects of the modern world, as well as a strong impact on the past. Without it we wouldn’t have the safety that we do when using online shopping, or really anything online. Wars would have had different outcomes, and who knows what else would be different. And just think, we wouldn’t have those infuriating ‘Code Cracker’ logic puzzles either.

Insightful or Wishful?

Thinking ‘outside the box’ is a trait shared by many of history’s most recognised scientists. The ability to find an explanation that differs from the widely accepted one has led to some of the largest discoveries and advancements in science. From the theory of evolution conflicting with then-prevalent creationist beliefs to the heliocentric model contradicting the geocentric one, these revolutionary ideas have caused massive controversy throughout history. Discoveries like these can be split into two different types, however.

Paradigm shifts are the type of discoveries that change our understanding of how the world works. They do not fit with expectations BUT over time experiments and observations are unable to disprove them and groups eventually reach a resolution (although this may take decades or even longer). Often the scientists are ridiculed for their novel ideas, as Darwin, Galileo, Feigenbaum, and countless others have been over time, but eventually their theories have gained more acceptance.

Similar to paradigm shifts, pathological science also presents a novel idea that causes great controversy and does not fit with accepted explanations. Again, scientists involved often experience ridicule because of this. In this case, however, although the scientists honestly believe that their idea is right (again, similar to paradigm shifts), experiments are not continued in an impartial way and are controlled to confirm the theory rather than to test its accuracy. This can be intentional or can be unconscious due to the bias of those creating the experiment. Discoveries such as Polywater, Mitogenetic rays, water dowsing/divining, and countless others have since been attributed to pathological science – although the scientist/s involved were convinced of their validity, others were unable to replicate the results or actively discredited these theories.

The two types of discovery have a lot in common, which means it can be very hard to tell the difference between them. Irving Langmuir was the first to characterise ‘pathological science’, with six symptoms; Nicholas J. Turro then added three more, and Denis Rousseau added another to the list. Even these, such as the idea that “theories outside the field’s paradigm are suggested”, present similarities to paradigm shifts, which makes it even harder to tell the difference.

Ultimately the only way to tell the difference is experiments and time. Paradigm shifts will last whereas pathological science will eventually be discredited, even if it takes a lot of time to do so. Until a conclusion is reached, all that you can really do is listen to the evidence and think about it, without writing it off as either one of the two. After all, some ideas really are inspired and revolutionary, while others are simply mistaken and the product of wishful thinking.