Liquid crystal layer of water and charge separation at surface boundaries

Here is an extremely interesting talk by Prof. Gerald Pollack about the characteristics of water layers at surface boundaries. He presents experimental results supporting the claim that water forms a relatively thick liquid crystal layer at the boundary, establishing a separation of charge that may be exploited for energy production. Interestingly, this phenomenon is largely promoted by infrared light. The same phenomenon may also be used for efficient and cost-effective water purification.

Posted in Chemistry.

The science of decay: beautiful BBC documentary

Posted in Biology, Chemistry, TV.

Computational chemistry development in research

Imagine you are a professor in organic chemistry. You received financial support for a project, and you are ready to hire a Ph.D. student to make it happen. The project requires the synthesis of a new compound.

Imagine you interview your best candidate. At the whiteboard, you present him with various problems on how to synthesize different products, and you find he is very skilled at planning these strategies. He also knows how to estimate the yields of the products and how much starting reagent is needed. He knows the temperatures and catalysts required, as well as the conditions that may break down the product. He can also write excellent articles. He seems a very promising person, so you hire him.

After a few weeks, he joins the group and enters your lab for the first time. Here you find out the following about him:

  • It is the first, maybe second, time he has ever entered a lab.
  • He does not know the names of lab glassware. He has never heard terms like “flask” or “test tube”. He actually looks at you with a strange face when presented with them, and says that the only time he did an experiment during his course, he used a pan and it was enough.
  • He is unable to keep his bench and glassware clean, and more often than not, he throws away used glassware instead of washing it.
  • He constantly breaks glassware, scattering glass and reagents around, damaging the work of his colleagues who have to clean his mess or just deal with it in the most improbable ways.
  • He is unable to assemble glassware (for example, he cannot build a distillation setup). He puts clamps in the wrong places, does not put silicone grease on the joints, nor any clip to keep these joints sealed and prevent them from “popping” open due to pressure buildup.
  • He has never performed chromatography, nor recorded an infrared spectrum. He has only vaguely heard about them, but his course professor said that they are useless and that you can identify a substance with a sniff, which is enough most of the time.
  • He insists on preparing and using his own very impure reagents, even when high-purity, standard reagents are available from a reagent reseller. He claims that it takes too much time to call them and place the order. Also, he is unable to understand the codes and numbers written on the bottles’ labels.
  • He labels his flasks with cryptic names, and passes them to colleagues who constantly have to ask him what they contain. More often than not, he does not remember and has to check his notebook. When he does remember, it turns out that the product inside is either contaminated or degraded.
  • He disposes of his byproducts down the drain, instead of using the proper disposal units. When confronted about it, he says: “Who cares? It takes less trouble.”
  • When asked about his low-quality work, he claims that it is neither his task nor his core expertise to produce top-quality laboratory work, nor to know how to use laboratory equipment, as long as he manages to get the required product at the end. He also points you to a very efficient synthetic strategy he just devised at the whiteboard, claiming that he is doing excellent work.
  • At the end of his employment, he leaves a notebook containing the process to create the product. The notebook is completely disorganized, the pages are mixed up and not numbered. The handwriting is close to impossible to understand, it is written in four different languages, and the quantities are specified as “a pinch of”. However, the glassware setup he made and left behind, when fed with the contents of an unlabeled reagent bottle you find on a bench downstairs, gives you the product, but only if the humidity in the laboratory is at exactly 75%.

My question for you, the professor of the group, is the following: would you keep this person in your group, or would you dismiss him?

Same story, different chemist

The scenario presented above is considered the norm in a different branch of chemistry: theoretical and computational chemistry. While “white coat” chemists use whiteboard synthesis strategies, a laboratory, glassware, spectroscopic instruments, chromatography and distillation, theoretical and computational chemists use mathematics, computers, editors and software development. Both sets are tools for the job, and in order to practice the discipline, proficiency with them is expected. Yet, in theoretical and computational chemistry, the general level of competence and proficiency with these tools can rightfully be compared one-to-one with the scenario given above for organic chemistry.

Organic chemists are expected to be able to work in a laboratory environment to high quality standards and protocols. By analogy, computational chemists are expected to be able to use a software development environment to high quality standards and protocols. In both cases, to perform their research they should be required to master the basic tools and be proficient in a broad set of specialized, modern ones. It is inexcusable not to be.

What makes the color of things?

Suppose someone gives you the chemical formula of a substance, such as

and asks you what color this substance is expected to have. Is it possible to give an answer? In most cases you may make an educated guess, but an accurate prediction is far from trivial: the color of a substance is decided at various levels, from the basic molecular level up to the macroscopic structure.

The first level: the molecule by itself

The most “trivial” level is the molecule by itself: here, color is decided by the elements the molecule is made of, its geometric structure (the positions of the atoms), and its charges. These parameters have a key impact on how its electrons are distributed in space and how this distribution changes when light enters the scene, a phenomenon which is strongly related to light absorption and thus to color.

When it comes to perception of visible light, white light is a mixture of all the wavelengths of electromagnetic radiation from roughly 400 to 700 nanometers. These wavelengths are perceived by our eyes (and brains) as colors, with the longer 700-nanometer end being almost infrared and the shorter 400-nanometer end being almost ultraviolet.

EM spectrum

In a more simplified rewording, white light is a mixture of all the colors of the rainbow, spanning from red to violet and passing through yellow, green, blue and so on, as beautifully shown by this prism

When you shine white light on a molecule, you basically provide all the colors at once. The electronic setup of the molecule is such that it “prefers” specific wavelengths (hence, specific colors), and this preference results in absorption. This is due to the light promoting an “electronic transition” between a ground state and an excited state: the electronic distribution is rearranged by the interaction between the electrons and the electromagnetic radiation. A simplified picture of this event is an electron “jumping” to a higher, excited level, but in reality it is the whole electronic cloud that changes.

Transition from a ground state (E1) to an excited state (E2) due to absorption of an incoming photon of light (hν)

The accumulated energy is then “quenched” (dispersed) as heat. As a consequence, the molecule removes some colors from the white light, leaving others unscathed, and the resulting color we see is the complementary one. If the molecule absorbs blue, you get red. If it absorbs yellow, you get violet. Absorption, in general, is not all-or-nothing. The intensity of absorption at each wavelength depends on many factors, producing what is called an absorption spectrum, which is unique and characteristic of every molecule or atom. The color of the substance is the complementary result of this spectrum. The uniqueness of the spectrum allows us to infer the composition of our Sun, and of distant stars and planets, through what are commonly known as Fraunhofer lines

Fraunhofer lines are absorption lines of atoms in the Sun’s atmosphere. Some absorption is also performed by the Earth’s atmosphere. They act as fingerprints for a given atomic species.
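The subtractive logic described above (perceived color is roughly white minus the absorbed band) can be sketched in a few lines of Python. The piecewise wavelength-to-RGB fit below is a rough, commonly used approximation, and treating perception as a plain complement is a deliberate simplification: real color appearance depends on the full spectrum and on the response of our eyes.

```python
# A rough sketch of "perceived color = white minus the absorbed band".
# The wavelength->RGB piecewise fit is a common approximation; the plain
# complement is a deliberate simplification of real color perception.

def wavelength_to_rgb(wl):
    """Approximate RGB (each 0..1) for a wavelength in nanometers."""
    if 380 <= wl < 440:
        return (-(wl - 440) / 60, 0.0, 1.0)
    if 440 <= wl < 490:
        return (0.0, (wl - 440) / 50, 1.0)
    if 490 <= wl < 510:
        return (0.0, 1.0, -(wl - 510) / 20)
    if 510 <= wl < 580:
        return ((wl - 510) / 70, 1.0, 0.0)
    if 580 <= wl < 645:
        return (1.0, -(wl - 645) / 65, 0.0)
    if 645 <= wl <= 700:
        return (1.0, 0.0, 0.0)
    return (0.0, 0.0, 0.0)  # outside the visible range

def perceived_color(absorbed_nm):
    """Crude complement: subtract the absorbed color from white light."""
    r, g, b = wavelength_to_rgb(absorbed_nm)
    return (1.0 - r, 1.0 - g, 1.0 - b)

# A substance absorbing around 590 nm (yellow-orange) should look blue:
print(perceived_color(590))
```

With this crude model, absorbing yellow-orange leaves a blue-dominant color, in line with the “absorb yellow, get violet” rule of thumb.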

Electronic transitions, however, are not the only mechanism responsible for absorbing light. A molecule can also absorb light through the excitation of rotations and vibrations (meaning that the molecule spins faster, or vibrates more). One case is water. Water appears transparent, but in reality it is slightly blue. The reason is that some wavelengths in the red make it vibrate more (to be exact, water absorbs in the infrared, which would make no difference to our eyes, but this absorption has a so-called “overtone” that falls in the visible red). As a result, a minimal amount of red is subtracted from white light, and water ends up looking slightly blue.

Can we predict this information? Yes, we can, with relatively good, though not perfect, accuracy. There are many different programs capable of computing this information: the wavelengths where absorption occurs, vibrations, and other parameters that are important in deciding the final spectrum. For atoms and small molecules, accuracy is very good, but as molecular size increases, predictions require more and more computing power. For this reason, quantum chemistry method developers regularly create smart new approximations able to deliver very accurate results at a reduced computational cost. In any case, the required input is just the geometric positions (xyz coordinates) of the atoms, their atomic numbers and masses, and the net charge.
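As a toy illustration of how little input such a prediction needs, here is the classic free-electron (“particle in a box”) model for a conjugated chain. It is a drastic simplification of what real quantum chemistry codes do, and the box length used below for butadiene is an assumed value of about 5.8 Å.

```python
# Toy free-electron ("particle in a box") estimate of the lowest
# absorption wavelength of a conjugated chain. This is a deliberately
# crude stand-in for what real quantum chemistry programs compute.

H = 6.626e-34    # Planck constant, J*s
M_E = 9.109e-31  # electron mass, kg
C = 2.998e8      # speed of light, m/s

def absorption_wavelength_nm(n_pi_electrons, box_length_m):
    """HOMO -> LUMO transition wavelength for N electrons in a 1D box.

    Electrons fill the levels pairwise up to n = N/2, so the transition
    energy is E = h^2 (N + 1) / (8 m L^2), and lambda = h c / E.
    """
    delta_e = H ** 2 * (n_pi_electrons + 1) / (8 * M_E * box_length_m ** 2)
    return H * C / delta_e * 1e9

# Butadiene: 4 pi electrons in a box assumed to be ~5.8 Angstrom long.
print(absorption_wavelength_nm(4, 5.8e-10))  # roughly 220 nm, near the UV
```

Even this one-line model lands in the right ballpark for short polyenes; real codes refine the same idea with far better descriptions of the electrons.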

The second level: molecular interactions and reactions

Molecules are generally not alone. They come close to one another, and eventually have other molecules around, either of the same species or of other species, such as those of a solvent: from simple water, alcohol or acetone to a complex cellular environment. There are no reactions involved, just the proximity of other molecules, with their protons and electrons. These partners alter the electronic setup of the molecule, promoting a slight variation of its electronic and vibrational behavior. Absorption, and thus color, is consequently changed. In general, this change is a shift of the original spectrum either towards longer wavelengths (bathochromic shift) or shorter ones (hypsochromic shift).

Then you have anything that can change the structure of the molecule through a chemical reaction. Take tea, put some lemon into it, and its color becomes lighter. The reason is that with lemon you are increasing the acidity of the water, which means a higher concentration of charged hydrogen ions (H+). The higher concentration of hydrogen ions pushes thearubigins, a class of colored substances found in fermented tea, into a form with the ion attached, which creates a change in the molecular electronic distribution, which in turn changes the absorption and thus the color. For some substances, this effect can be dramatic: from blue to red, from transparent to purple, from yellow to blue. These are the so-called pH indicators

Bromothymol blue in acidic, neutral and alkaline solutions (left to right).

These effects may technically be predictable, but they require considering a complex system of interacting species, with different chemical exchanges, short- and long-range interactions of charges and so on. This may be very difficult, if not impossible, to perform with today’s methods and computational power, although approximations exist that work around the heavy computational weight and provide reasonable results.

The third level: crystal and impurities

A crystal is a solid where the constituent molecules or ions are arranged in space with a well-defined order. For any given substance, the ordering of its atoms or molecules in space is not necessarily unique, a phenomenon known as polymorphism. Depending on the packing, different properties arise, and different colors are the result. Diamond is transparent, graphite is black, and black is also C60 fullerene, but they are all made of the same element: carbon.

Diamond and graphite

For another example, take gold. You may say that it is gold-colored, but if you take a small cluster (say, 100 atoms) of gold, what you see is red, not gold.

When atoms or molecules are ordered in a crystalline structure, the result can absorb light by virtue of this ordered structure. Note that this effect is complementary to the absorption characteristics of the molecule or atom taken by itself. For example, a single atom of carbon may absorb close to nothing in the visible range, but due to its highly ordered crystalline structure, the macroscopic block of graphite you hold in your hands absorbs most of the incoming light, and thus is black. A similar effect occurs with any pigment whose crystalline structure influences its color. At the quantum level, the effect just presented is related to band structure and Bloch wavefunctions. The same physics also explains semiconductors and the conductivity of metals.

These effects are relatively predictable. A large number of computational packages deal with periodic structures in a very efficient way, providing spectroscopic information about the properties of both atomic and molecular crystals.

As an additional twist, crystals can have defects, such as imperfect packing or impurities of foreign elements in the periodic structure. The resulting effect is beautifully shown in diamonds, for example in the Aurora Pyramid of Hope

and in aluminium oxide: pure, it is colorless. Add some chromium, iron, vanadium or titanium, and it may become ruby

or sapphire, which is blue, pink, yellow, orange, purple or green, depending on the crystal structure, and the relative quantities of these impurities.

These effects are generally very hard to compute, as they may require statistically large ensembles of atoms. I am not aware of any computational techniques in this regard.

The fourth level: macroscopic properties

Finally, you have how the substance is structured at the macroscopic level. Take a smooth platinum electrode: it has the typical platinum color. Make it sponge-like (with very tiny bubbles and pits) to increase the surface area, and it appears as black as coal. The reason is that light is scattered and absorbed completely, leading to a black color.

This opens the door to many additional effects concerning matter-light interaction. What is the color of a CD? Is it silver? Is it “rainbow”? What about the color of an oil slick on the road on a rainy day? What about the color of a tiger’s eye, or of an opal?

And what about blue eyes, and the blue color of a spoonful of flour dispersed in water? Both are due to Tyndall scattering. There is no blue pigment in blue eyes, nor in flour, but the scattering of light is frequency-dependent, reflecting blue and transmitting red, leading to a blue color.

This Wikipedia and Wikimedia Commons image is from the user Chris 73 and is freely available under the Creative Commons CC-BY-SA 3.0 license.

As you can see, color is a very particular property, and while you may make an educated guess with quantum mechanical techniques, it is not always easy to infer the color of a substance. And this is just the tip of the iceberg. There are many other phenomena (such as how much light penetrates into the substance, or which macroscopic imperfections are present) which affect both the color and the reflective properties of a substance. Ice is transparent, but if it is full of bubbles it is white. Plastic looks like plastic, and metal looks like metal, depending on how light is scattered and absorbed, which changes the way it is reflected back to the viewer. This affects not only color, but also the general material texture.

What about the opening molecule?

The opening molecule is Indigo, a natural dye found in some plants. Today, it is synthetically produced in large quantities. It is commonly used to dye blue jeans.

Eight molecules that changed the rules of the game: CFC

Rule changed: made safe and easy refrigeration possible. Raised environmental awareness.

Chlorofluorocarbons (CFCs) are a class of compounds, the simplest of which have a structure similar to that of methane: a tetrahedron. A simple representative is the one pictured below, dichlorodifluoromethane. It is a molecule made of one carbon atom (in the center, black), two chlorine atoms (on top, green) and two fluorine atoms (on the bottom, pea green).


Dichlorodifluoromethane, one compound in the CFC class

The class includes slightly more complex molecules, but all of them have one thing in common: they are made of carbon, chlorine and fluorine (and occasionally, hydrogen). In this post, we will briefly examine the history of CFC compounds, why they were so disruptive at the time, why they turned out to be so dangerous, and why their contribution to human knowledge was a strong wake-up call for everybody on this little blue planet.

Why CFCs? A brief history of making cold

Just out of the trees and into the caves, humanity learned how to make heat. Controlling fire was perhaps the first important technological advancement of humanity, as it improved the quality of food (cooked meat is easier to eat and digest, and cooking kills parasites), safety (dangerous animals don’t like fire) and health, by warming the cold, humid cave. Heating things required a very low initial technological level, for a very valid reason: making heat is easy because the chemical reaction between organic matter and oxygen (that is, burning) is relatively easy to start, produces a lot of heat, and requires components that are easy to find. The opposite operation, cooling, took much longer and far more advanced knowledge.

Ice man

Before the invention of refrigerators, there were basically three techniques to cool things down, typically for preservation purposes. The first technique was storing ice and snow in ice houses during the winter. Good insulation and the mass of accumulated snow made it possible to keep low temperatures during the warm months. Later, services were built around the need for cold, cutting and transporting ice blocks from cold regions to serve warm ones. People went around in carts selling ice blocks, typically giving small chunks of ice to children to have fun with. Your parents may recall being among these children. The blocks of ice were put into ice boxes, together with perishables.

Another solution for making cold was mixing common salt and snow to create a frigorific mixture, exploiting its eutectic properties. With this technique, temperatures as low as -21 degrees Celsius can be obtained with ease. The same property is used to melt snow in winter. Like the previous option, it requires solid water to begin with.

Finally, a third option was to dissolve in water some very specific salts, such as sodium, potassium or ammonium nitrate. These salts require heat to dissolve, subtracting it from the environment (the dissolution is said to be endothermic, as opposed to exothermic dissolutions, which produce heat). This concept is successfully used in the instant cold packs you may find at your local sports shop. This option has two advantages, namely that it does not require something already cold to operate, and that the salt can be recovered and reused by evaporating the water.

With a better understanding of thermodynamics and the states of matter, at the beginning of the 19th century the knowledge was available to develop better technology for the production of cold. Through a proper strategy of expansion and compression of well-chosen gases, efficient removal of heat became both feasible and practical, first industrially, then at the consumer level. When the first refrigerators arrived on the market, the choices for the exchange gas were limited to ammonia (highly toxic), sulfur dioxide (also toxic if inhaled in large quantities), and chloromethane (toxic and flammable). Needless to say, leaks occurred, people died, and the general public preferred the old “big ice cube in a box” solution, or kept the refrigerator outside, where a leak would pose no immediate danger. A better solution was needed.

In 1929, Thomas Midgley, Jr. and Charles Franklin Kettering teamed up to tackle the problem, and they found a good solution in CFCs, unaware of the environmental danger of their discovery. Incidentally, from the efforts of these two men also came the gasoline additive tetraethyl lead, another very troublesome compound. It appears they had a special talent for stumbling on ecologically devastating stuff.

In addition to refrigeration, CFCs were found useful for other tasks, such as propellants for aerosols, solvents, and fire-fighting equipment. Their stability, inertness and lack of toxicity were just perfect, or so it appeared.

The ozone cycle

To understand why CFCs are so dangerous, we first need to know the role of ozone in the upper atmosphere. In normal conditions, ozone is a gas whose molecules are made of three oxygen atoms each, O3.



In comparison, the “conventional oxygen” we breathe is a molecule made of just two atoms bonded together, O2. Ozone has a characteristic smell we normally call “the smell of electricity”, being generated in appreciable quantities by electrical discharges. At ground level, it is a dangerous pollutant, because it is highly reactive and irritant, but in the upper atmosphere it is our shield against the intense and carcinogenic ultraviolet radiation emitted by the Sun.

Ozone is produced through a very slow process from molecular oxygen. The molecule is smashed into individual atoms by the Sun’s UV radiation

O2 + UV radiation -> 2 O

and each of these oxygen atoms may attach to other oxygen molecules to form ozone

O2 + O -> O3

a reaction that releases heat via intermediary species. Ozone can then absorb further UV radiation and split back again

O3 + UV radiation -> O2 + O

and start the cycle again. The net effect is that of a catalyst, a substance that eases an interconversion (in this case, of dangerous UV radiation into heat) without being depleted, as a reactant would be. Instead, a catalyst is restored to its active state once the interconversion is over, ready to operate again. Please note: a minimal amount of catalyst is able to promote a huge number of interconversions.

Thanks to this chemistry, ozone degrades large quantities of dangerous UV radiation into innocuous heat, in a cycle known as the ozone-oxygen cycle.

In normal conditions, there are two other important reactions that can occur. Both destroy ozone and restore molecular oxygen

O3 + O -> 2 O2

2 O -> O2

All the reactions given above (creation, catalysis, and destruction, among many others) constantly happen in the upper atmosphere. Their final balance leads to a relatively stable equilibrium concentration of ozone, dependent on solar irradiation, which in turn depends on season, latitude and solar activity.

How CFCs disrupt the ozone cycle

Where do CFCs enter the game? It turns out that the biggest advantage of CFCs, their stability, is also their first major problem. CFCs are heavier than air (and thus tend to sink), but this does not prevent them from reaching the upper atmosphere, helped by their long lifetime. Diffusion and winds mix the atmosphere constantly, creating a relevant concentration of CFCs in the upper atmosphere. Once there, the second major problem arises: when hit by UV radiation, CFCs release a chlorine atom:

CCl3F + UV radiation -> CCl2F. + Cl.

The chlorine atom has a lone electron, and in this configuration it is highly reactive: it combines with ozone, operating as a catalyst for ozone destruction. The reactions are complex and numerous (if you want all the gory details, this online book is a start), but the net effect is a reduction of ozone and the creation of molecular oxygen. Remember, a catalyst emerges unscathed from the reaction it promotes, meaning that a minimal amount of chlorine can promote the destruction of large quantities of ozone, unbalancing the equilibrium previously maintained by the slow creation reaction O2 + UV -> 2 O. The shielding of UV radiation becomes less and less effective, on par with the decreasing concentration of ozone, and the radiation can now reach the surface.
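The catalytic bookkeeping can be illustrated with a toy kinetic simulation. The two reactions and all the rate constants below are a simplified sketch with arbitrary, illustrative values (not real atmospheric ones); the point is only that the total amount of chlorine stays constant while ozone is steadily consumed.

```python
# Toy kinetics of the chlorine catalytic cycle, with arbitrary,
# illustrative rate constants (not real atmospheric values):
#   Cl  + O3 -> ClO + O2   (rate r1)
#   ClO + O  -> Cl  + O2   (rate r2)
# Net: O3 + O -> 2 O2, with the chlorine regenerated at every turn.

def run_cycle(steps=200_000, dt=1e-3, k1=1.0, k2=1.0, o_atoms=1.0):
    cl, clo, o3 = 0.01, 0.0, 1.0  # only 1% chlorine vs. the initial ozone
    for _ in range(steps):
        r1 = k1 * cl * o3
        r2 = k2 * clo * o_atoms
        cl += (r2 - r1) * dt   # chlorine consumed by r1, restored by r2
        clo += (r1 - r2) * dt
        o3 -= r1 * dt          # ozone only ever destroyed
    return cl, clo, o3

cl, clo, o3 = run_cycle()
print(f"total chlorine: {cl + clo:.4f}")  # still 0.0100: the catalyst survives
print(f"ozone left:     {o3:.2f}")        # far below the initial 1.0
```

Even with chlorine at one hundredth of the ozone concentration, most of the ozone is eventually destroyed, because the catalyst is recycled at every turn of the cycle.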

Swift action is called for: the Montreal Protocol

We owe it to James Lovelock, Frank Rowland and Mario Molina, among many others, that our eyes were opened to a dramatic trend. During the 70s, it became clear to the scientific community that CFCs were a source of trouble. Confirmation came in the 80s, when incredibly low concentrations of ozone were found over the south polar region: an “ozone hole” of unquestionable evidence.

The Vienna Convention and the Montreal Protocol, in force since the 1st of January 1989, defined an impressive and immediate worldwide response to the problem, suppressing the industrial production and use of CFCs and other ozone-depleting substances. Without this ban, the result would have been the one simulated by NASA

Ozone layer simulation by NASA

Total destruction of the ozone layer, with no chance of recovery, before 2060. NASA also released a movie of the simulation, compared side by side with the projected situation expected with the ban enforced. You can find it at the NASA page for the simulation, or in this YouTube movie. Without the ozone layer, the amount of UV radiation reaching the surface would be high enough to cause sunburn in minutes, and occurrences of skin cancer would have soared globally. These, of course, would be just the direct effects on humans. The rest of the biosphere would have had an unpleasant time as well.

CFCs have been banned from almost every application, from refrigeration to pharmaceutical nebulizers. Some temporary, highly scrutinized exceptions have been defined for those applications where no substitute could be found, such as some fire-suppression systems. The general idea is to use compounds that are either degraded before reaching the upper atmosphere, or that do not contain chlorine or bromine atoms, and therefore have a reduced impact on the ozone layer. Common substitutes today are R134a (a fluoroethane, also being phased out) and R600a (isobutane, much safer for the environment but highly flammable). There are also strict regulations in force concerning the maintenance, recovery and recycling of existing CFCs (see, for example, the US EPA). Although the dangers have been avoided, the legacy of CFC usage will linger for at least one hundred years.

The aftermath of a potential catastrophe

The Montreal Protocol and the avoided catastrophe of ozone depletion were a huge wake-up call for humanity. We realized that our planet has a fragile ecosystem, whose complexity and interdependency are broad and still being discovered. As humans, we owe ourselves a big pat on the back, but there are still troubles ahead: global warming, oil depletion, overpopulation. It’s time we stop wasting time, because swift action is needed again. With Montreal, we demonstrated that humanity can achieve a common goal and solve a common problem. We need strong leaders, iron-clad reason, proper actions and a global effort to face the common goals and problems of tomorrow. We have only one planet, this one:

Posted in Chemistry, Environment.

Does chamomile really relax?

Chamomile

Nothing says “relax” better than a peaceful evening in front of a steamy cup of chamomile. For thousands of years, humanity has used it as a natural remedy for a wide range of ailments, most notably hypertension and sleeplessness, and to ease a flu-dominated night, as in my case recently.

Moved by curiosity, I spent some time investigating what is scientifically known about the therapeutic effects of chamomile and their mechanisms of action. The results, I must say, are interesting and conflicting. Let’s examine what I was able to gather from around the internet and from a couple of scientific papers.

Chamomile is a class of plants whose main representatives, at least for infusion-making, are Matricaria recutita (German chamomile) and Chamaemelum nobile or Anthemis nobilis (common chamomile). It can be found wild or cultivated. Its “flower” is actually a composite, daisy-like inflorescence. The true flowers are the tiny yellow corollae forming the central bulb.


The pleasant fragrance chamomile flowers produce arises from a large set of compounds (more than 120), including in particular sesquiterpenes such as chamazulene and alpha-bisabolol, flavonoids such as chrysin, and many others. Most of these compounds are not present in free form, but are bound to sugars through fragile bonds that can be broken easily, for example by heating.

Some research on the effects of these compounds has been performed. In particular, chamazulene has been found to have antioxidant properties, together with matricin, alpha-bisabolol, and apigenin. Chrysin appears to show an anxiolytic effect in laboratory rats (see also here and here), but nothing has been established for humans yet.


Other experiments show that chamomile can have a small antibacterial effect on the gut’s bacterial population, both in humans and rats. This finding, however, is a mere hypothesis put forward to explain changes in the excreted substances.

So, it appears that chamomile does indeed have relevant activity, and as far as anxiolytic effects are concerned, some evidence exists. Despite this, the National Institutes of Health page for chamomile reports insufficient evidence for most of the claimed therapeutic benefits of chamomile: the grade is “C: Unclear scientific evidence for this use”, with only one case (“post-operation sore throat”) where a conclusion has been reached, as “D: Fair scientific evidence against this use”. This is a very important example of how the evaluation of pharmacological effectiveness is performed. Even when evidence supports the presence of a therapeutic effect from a given compound or preparation, only a set of rigorous tests performed on human subjects allows one to finally grant recognition of therapeutic effectiveness (or lack of it). In the case of chamomile, tests have mainly been performed on rats and mice. Even a single successful (or unsuccessful) human test is not enough to grant an A (strong positive evidence) or F (strong negative evidence) grade on the Natural Standard grading scale. The grade C refers specifically to:

      1. Evidence of benefit from >1 small randomized trials without adequate size, power, statistical significance, or quality of design by objective criteria, OR
      2. conflicting evidence from multiple randomized trials without a clear majority of the properly conducted trials showing evidence of benefit or ineffectiveness, OR
      3. evidence of benefit from >1 cohort/case-control/non-randomized trials AND without supporting evidence in basic science, animal studies, or theory, OR
      4. evidence of efficacy only from basic science, animal studies, or theory.

Let’s examine the cases one by one.

The first case occurs when tests give positive evidence (it works), but the result is not “statistically significant”, meaning that the test has been performed on too few subjects: for something to be considered working, it must present an effect that occurs with some consistency. If you test a substance Foo on a single sick person, and he recovers, it does not mean that Foo is a cure. That person could have recovered just because he was lucky or strong enough to recover, regardless of Foo. A better test would be: treat 200 sick people with Foo, and also take 200 sick people with no Foo treatment, then compare the recovery rates in the two groups. If 180 people recover in the first group, while only 20 recover in the other, there is definitely a good point in favour of Foo being effective in curing that sickness. Statistical analysis allows you to decide which numbers of people can be considered strong evidence, and which not enough, for Foo being an effective cure.
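The Foo numbers above can be made quantitative with a standard two-proportion z-test; a minimal sketch:

```python
# Two-proportion z-test for the Foo example: 180/200 recover with
# the treatment vs. 20/200 without.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two observed proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

z = two_proportion_z(180, 200, 20, 200)
print(round(z, 1))  # 16.0 -- far beyond ~1.96, the usual p < 0.05 threshold
```

With 1 subject per group instead of 200, the same difference in proportions would give a far smaller z and no significance, which is exactly why the single-patient anecdote proves nothing.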

      The second case is when two or more tests produce conflicting results: for example, laboratory A sees recovery in its patients using Foo, but laboratory B sees none. There could be many reasons for this. Improper testing could be one, and even when all tests are performed properly, there could be additional factors we don’t know about. For example, suppose the people at laboratory B happened to have an unusually strong ability to recover from that sickness (because they were immune for some biological reason): the group without Foo would then recover as well as the group with Foo. The conclusion at laboratory B is that Foo has no effect, while laboratory A says that it does. This is conflicting evidence, and it must be resolved by testing more people, until a clear majority of properly conducted trials allows a unique conclusion to be drawn.

      The third case is when there is evidence of recovery, but no supporting evidence from basic science, animal studies, or theory able to explain the observed phenomenon. This can lead to a scientific breakthrough if a new biological mechanism is found and explained, but until then, nothing can be said about the pharmacological validity. This point also raises the question of the differences between cohort, case-control, randomized, and non-randomized trials. It would be an interesting discussion, but it goes a bit outside my current knowledge, and I intend to learn more about the details in the future.

      The fourth and last case is when evidence exists only because we infer that something should work from what we know today of the human body’s mechanisms, but no actual test has been performed, or tests have been performed only on animals.

      In the case of chamomile, as of today we cannot officially confirm, with strong evidence in humans, that a pharmacological effect exists, because the tests have been performed mostly on animals, and the very few human trials available are still insufficient to draw significant conclusions. This does not mean that the effect does not exist. It could exist, or it could not, and whatever the truth is, we cannot put an approval stamp on it yet, because we haven’t tested enough. In agreement with the scientific method, unless something is demonstrated by evidence to hold, it is assumed not to hold. It’s like the presumption of innocence in criminal trials: someone is assumed innocent until proven guilty by the evidence. The other way around would be disastrous.

      Now, we can probably claim that the mere act of preparing chamomile and enjoying its pleasant fragrance has a relaxing effect, but that would be a psychological effect triggering internal biochemical actions inside our body, finally leading to a relaxed mood. Mozart could have the same effect. The point is, from the pharmaceutical point of view, the correct answer (as of today) to the question “does chamomile really relax?” is “some evidence exists that it does, but it’s still not enough to say for sure.”

      Eight molecules that changed the rules of the game: Bakelite

      Rule changed: it started the world of plastic we live in


      When it comes to materials for making tools, housing, chariots, and dishes, humanity had only one choice for many thousands of years: use what nature provided. Clay, rocks, metals, resins, rubber, and wood were the most common materials directly available for harvesting. As primitive technology improved, materials with new and interesting properties were created, such as glass and concrete, but at that time there was little or no understanding of the “magic” behind the process, the new material’s properties, or how to improve them, except by trial and error. The discovery process improved considerably once the rules of physics and chemistry were rationalized: the understanding gained about existing natural materials made it possible to design similar ones, either partially or completely synthetic, endowed with unusual and interesting properties. One remarkable example of these man-made compounds is plastic: discovered at the end of the 19th century, plastic materials changed, and still change, the world. (more…)

      Posted in Chemistry. Tags: , , . Comments Off »

      A Question/Answers site for Popular Science

      The kind folks behind StackOverflow, a free Question/Answers website for programming questions, recently decided to open new Q/A websites on many additional interesting topics, from wine tasting and cooking to mathematics. The fundamental prerequisite for such new sites to be opened is a rather strict community review and the development of a critical mass of contributors and interested people. The proposals are collected and evaluated by the community.

      I really enjoyed the proposal for a Popular Science Q/A site, and if the site is opened, I will certainly be an active contributor. If you are interested, feel free to click on the link and then click “follow” on the proposal. This will largely increase the chances of such a site being opened. I am also keenly interested in, and enthusiastic about, Q/A sites for Chemistry, Astronomy, Bioinformatics and, as a very old Dungeons and Dragons player, Role-Playing Games.

      Posted in Astronomy, Biology, Chemistry, Dissemination, Websites. Comments Off »

      Craig Venter programs a bacterium from scratch

      As you probably heard in the news, Craig Venter, the American biologist best known for starting up Celera Genomics and sequencing the human genome, achieved another big success. He created a fully working new bacterium, programming its DNA from scratch.

      Like a computer having hardware and software, a bacterium has a set of components that execute the software written in the DNA to create proteins. For quite some time, the strategy has been to insert small pieces of new DNA into full genomes, so as to add a new piece of genetic code that synthesizes a new protein, typically a pharmaceutical drug. For example, people with diabetes must periodically take insulin, a small protein normally produced by a fully functional pancreas. If the pancreas does not produce insulin, diabetes arises. A solution is to inject insulin from outside, but this small protein must be produced somehow. The technique used to produce it is recombinant DNA: a small piece of DNA encoding insulin was inserted into a normal bacterium (Escherichia coli, the same that lives in your gut). The altered bacterium duplicates, and millions and millions of daughter cells now produce the proteins their genetic code specifies, as if they were small chemical laboratories. Since the specification for insulin has been introduced into their DNA, these millions of cells also produce the precious insulin, which is then extracted, purified, and sold for diabetes treatment.
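The “DNA as software” analogy can be made concrete with a toy sketch of what the cell’s machinery actually does: it reads the DNA three bases at a time (codons) and maps each codon to an amino acid. Only a handful of the 64 real codons are included here, and the mini “gene” at the end is made up purely for illustration:

```python
# A tiny subset of the standard genetic code (codon -> amino acid).
CODON_TABLE = {
    "ATG": "Met",  # also the start codon
    "TTT": "Phe", "GGC": "Gly", "ATT": "Ile", "GTG": "Val",
    "GAA": "Glu", "CAG": "Gln", "TGC": "Cys",
    "TAA": "Stop", "TAG": "Stop", "TGA": "Stop",
}

def translate(dna):
    """Translate a DNA coding sequence into a chain of amino acids."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE[dna[i:i + 3]]
        if amino_acid == "Stop":  # stop codon: release the protein
            break
        protein.append(amino_acid)
    return "-".join(protein)

# A made-up mini "gene": start codon, a few residues, stop codon.
print(translate("ATGTTTGGCGTGTAA"))  # Met-Phe-Gly-Val
```

Change the input string and you get a different protein: that, in essence, is what recombinant DNA does to E. coli, and what rewriting the entire genome does on a much grander scale.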

      At the Craig Venter Institute, they went further. They didn’t add something new to a bacterium: they took all the DNA contained in one, threw it in the dumpster, put in another genome completely designed on a computer, and let it go. This had been done some time before, but that artificial bacterium was unable to reproduce, until now. Yesterday the paper was finally published in Science: “Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome“, marking the fact that the hardware (the mechanism that synthesizes proteins) can be programmed at will by totally replacing the software (the DNA). In a supreme display of programming skill, the JCVI now controls a chemical computer. Venter walks the path of Wöhler, once again demonstrating that there is nothing magic about life: it is just a chemical system obeying the rules of chemistry, a fascinating self-sustaining, self-replicating system made of order and chaos.

      What are the consequences of this achievement? What can we do with a totally programmable, self-reproducing chemical laboratory? Well, it’s not that easy. This achievement is great, a milestone that will probably earn Craig Venter the Nobel Prize very soon, but practical uses for humankind are still some distance away, although not that far. Such control will open up so many possibilities that they are almost difficult to imagine in their completeness right now, but we can start from:

      1. production of currently expensive proteins to cure diseases, similar to the insulin case, reducing the cost and increasing the effectiveness of therapies;
      2. production of bacterial species able to consume and transform substances that are toxic to us;
      3. bacterial species able to deliver a pharmacological payload near the source of a disease. Today we inject drugs into our body, for example to kill cancer, but we poison every single cell, even the good ones. What if a bacterial species could detect and attach to a tumor, and then start producing an anticancer drug right there?
      4. production of electricity from biological sources: take wood or sugar, let bacteria digest it, and harvest the electricity (a so-called microbial fuel cell). It’s clean, renewable, and easy to control. We already do something similar with biosensors to evaluate the amount of glucose in blood;
      5. production of biofuels from garbage or pollutants, like used plastic;
      6. understanding how a simple system like a bacterium works, which will give us the chance to understand more complex systems.

      Yes, some will probably be scared at the idea of such an insane level of control: biological weapons, superbugs… danger! danger!… But if you really stop for a moment, check some history, and think deeply, you realize that biological warfare is nothing new: people in the Middle Ages threw plague-ridden corpses over castle walls to kill their opponents. Humanity does not need to create a powerful bacterium as a weapon: plenty of them are already available in nature, ready to be harvested, and they could go straight on the tip of some rocket! The fear that this new technology could be used by mad, aggressive people as a weapon misses the point. Again: the biological weapon has been out there since the very beginning. This is why biological weapon stockpiling and production have been banned since 1972 (the Biological Weapons Convention), and only defensive research is allowed and pursued.

      In fact, if you think about it, understanding how bacteria work is actually the only way to find effective protection, and not only from human madness. There is a bigger menace out there to be worried about: pure, crystalline natural cruelty, which has been wiping out thousands and thousands of species without a blink of compassion for 4 billion years. In 1918, the so-called Spanish flu wiped out 6% of the world population of that time. Six percent. We humans do not accept this harsh treatment from cruel nature, so we found a way to understand its mechanisms and use them to our own advantage. Our life today is twice as long and many times safer than the life of our ancestors just 100 years ago: think about living in a world with no anesthesia, no penicillin, no anticancer drugs, no social security or medical assistance, with “remedies” like skull trepanning, bloodletting, or hirudotherapy.

      Are you really scared of the 21st century? I’m not.

      Posted in Bioethics, Biology, Chemistry. Comments Off »

      Eight molecules that changed the rules of the game: Benzene

      Rule changed: stimulated research to explain electronic resonance.



      Except for its nice regular hexagonal shape, benzene is not a nice compound. It is toxic, carcinogenic, and highly flammable, it burns with a very dirty and smoky flame, and, as if that were not enough, it made chemists go crazy for one hundred years. This last point is the interesting one for our discussion. Why, you may ask? It has to do with benzene’s structure and a bunch of data that didn’t add up for quite a long time, starting from its discovery in 1825. It took more than one hundred years to finally understand what was going on, and it required the development of a whole new scientific discipline: quantum chemistry.


      Posted in Chemistry. Tags: , . Comments Off »