# The Structure and Entropy of Ice

## The Structure of Ice

The title of this section is misleading, because ice, understood as the solid phase of water, does not have a unique structure. In fact, there are 12 known phases of crystalline ice.

The ice you all know is technically known as Ice Ih, or hexagonal ice. The underlying crystal structure of its oxygen atoms is known as wurtzite, the general name given to this particular type of hexagonal crystal.

## The Structure of Ice: Macroscopic

If we forget about the H atoms in ice, the oxygens arrange in this structure, shown as the grey and yellow atoms in the figure. It is the oxygen arrangement that gives Ice Ih its hexagonal symmetry.

Does the number 6 make you think about ice in any way?

Indeed, I'm sure you have observed many times the hexagonal symmetry of ice grown on top of a surface, like:

• a car window
• a house window
• a plane window (one of my favorite things to do on a plane is to follow the growth of ice on the window soon after we take off)
• also the snowflakes! The best hexagonal crystals are in the powder falling in the Rockies, I've never seen flakes that perfect!!

## The Structure of Ice: Hexagonal crystals

To understand why snowflakes can have such different, though similar, structures, you need to understand the atomic arrangement of the crystals and the differences in energy between different surfaces. Once ice nucleates from the supercooled liquid (more on this later), it will not grow at the same rate in all directions. Some surfaces are favored over others.

This means that ice growth is not isotropic; in other words, its growth rate depends on the direction in which it grows. To understand this better, let's look at the crystal structure of ice from a purely geometric perspective.

This is the unit cell of hexagonal ice. You can move the structure around and zoom in and out using the right click on your mouse (see the end of the page for instructions).

One thing you might notice is that all the hydrogens here seem to be ordered. And indeed they are. This crystal structure is built by repeating the unit cell twice in each direction. A macroscopic single crystal of Ice Ih, or hexagonal ice, like this is very difficult to obtain. Let's try to build it ourselves.

We are going to use the well-known “Random Method”, which involves the use of a Random Number Generator. Let's see how good this class is as one. We will use the clickers to record your answers.

From the experiment we did last week, and the experiment we will repeat today, you already know that it is very hard, if not impossible, to create an ordered ice structure if the hydrogens are placed at random. The next section will deal with the exact rules of the game.

## The Structure of Ice: Ice rules

### So what happens to the Hydrogen atoms?

Well, that is a tricky question. Let's do an experiment. I'm going to give you a wurtzite atomic model. This one is tiny, only 21 atoms (budget was low, sorry!). And I'm going to ask each of you to add a single H atom to the structure, making sure that you satisfy the ice rules:

• Each O atom must have two H atoms as first neighbors
• Each O atom must have two H atoms as second neighbors
• There must be one and only one H atom on the line joining two O atoms.
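These rules can be explored by brute force. The sketch below is mine, not part of the class model: it uses "square ice" on a tiny 2×2 periodic lattice as a stand-in for the real hexagonal lattice (the counting logic is the same). Each O-O bond holds one H, sitting nearer one of its two oxygens, and we count the arrangements in which every O ends up with exactly two near H atoms:

```python
from itertools import product

# Toy stand-in for the hexagonal lattice: "square ice" on a 2x2 periodic
# lattice. Each vertex is an O atom with 4 O-O bonds; wrapping around the
# torus gives double bonds between neighboring vertices.
bonds = [(0, 1), (0, 1),   # horizontal bonds (with periodic wrap)
         (2, 3), (2, 3),
         (0, 2), (0, 2),   # vertical bonds (with periodic wrap)
         (1, 3), (1, 3)]

def satisfies_ice_rules(orientation):
    """orientation[i] == 0: the H on bond i sits near u; == 1: near v.
    The ice rules hold when every O has exactly two near H atoms."""
    near = [0, 0, 0, 0]
    for (u, v), x in zip(bonds, orientation):
        near[v if x else u] += 1
    return all(n == 2 for n in near)

valid = sum(satisfies_ice_rules(o) for o in product((0, 1), repeat=len(bonds)))
print(valid, "of", 2 ** len(bonds), "H arrangements satisfy the ice rules")
```

Only 18 of the 256 possible H placements survive the rules, which is why placing hydrogens at random almost never produces a valid structure. (On such a tiny wrapped lattice the exact count differs from Pauling's large-crystal estimate, which appears later in these notes.)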

## The Structure of Ice: Disordered protons

Well, this is what you should produce!! This is called disordered ice. You can move the structure around, and if you right click you have different visualization options. You can also make measurements.

Try this!!

Click the right mouse button with the cursor over the image–>

Render –> Schemes –> CPK Spacefill

Click on the left mouse button and rotate the ice structure.

Do you see the open spaces in ice? They are formed at low temperatures, when the water molecules form many stable hydrogen bonds.

To measure a distance

Double click on one atom, then drag to the second atom and double click again.

To measure angles

Double click on the first atom, click once on the middle atom, and twice on the third atom.

## The (configurational) Entropy of Ice

Note: I'll be using the word proton to refer to the hydrogen atoms all through this section. By now you should know why.

As you have seen, it is not easy at all to build a model of ordered ice unless you know the unit cell rule and repeat it periodically. When I asked you to put the hydrogen atoms in place on the wurtzite structure of oxygen atoms, you were not able to do it, because we were somehow forcing the structure to be disordered. The class indeed acted as an excellent random number generator.

If we repeated the experiment, chances are that you would end up with a structure totally different from the previous one. And if we repeated it again, you would have a third candidate, and so forth. The fact is that there are many solutions to the problem of giving two protons (or hydrogen atoms) to each oxygen atom in the hexagonal lattice. Indeed, we can count how many different ways of ordering the protons there are, if we know the number of oxygen atoms in the lattice. And this brings us to the concept of entropy.

## A crash course on Entropy

### What is Entropy?

Let's start with the second law of thermodynamics, which places limits on the physically possible thermodynamic processes above and beyond conservation of energy (the first law).

One of the earliest statements of the second law, due to Rudolf Clausius, is that:

Heat cannot spontaneously flow from a cold object to a hot one, whereas the reverse, a spontaneous flow of heat from a hot object to a cold one, is possible. This is intuitively logical, but intuition is simply telling us what can be expressed in much more powerful terms:

The entropy (S) of the Universe tends to a maximum.

Here we introduce a quantity, entropy, which we can use to build a deeper understanding of the second law. We are going to see that entropy relates to the disorder of a system, and is also a measure of the thermal energy available for a process to occur.

## Entropy as a state variable

A state variable is a variable that describes the current state of a dynamical system. Heat is not a state variable: it depends on the path taken to get the system to its state. Entropy, on the other hand, is a state variable; the change in entropy required to take a system from one state to another via a reversible process is independent of the path taken.

## Consequence of entropy being a state variable

We can calculate the entropy change for a reversible process using

$dS=\frac{dQ}{T}$
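As a worked example (with approximate textbook values for water, chosen here for illustration): melting ice reversibly at its melting point happens at constant temperature, so the integral of $dQ/T$ reduces to $Q/T$:

```python
# Worked example (illustrative numbers): entropy change when 1 kg of ice
# melts reversibly at its melting point. At constant temperature, T can
# be pulled out of the integral, so Delta S = Q / T.
m = 1.0          # kg of ice
L = 3.34e5       # J/kg, latent heat of fusion of water (approximate)
T = 273.15       # K, melting point

Q = m * L                    # heat absorbed by the ice
delta_S = Q / T              # J/K
print(f"Delta S = {delta_S:.0f} J/K")
```

About 1223 J/K for one kilogram of ice.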

This equation is only valid for reversible processes. However, if we want to find the entropy change for an irreversible process (i.e., any real process) that goes from state A to state B, we can calculate the entropy change for a reversible process that goes from A to B, and the entropy change will be the same for the two processes.

Any reversible cycle can be represented as a series of Carnot cycles, with each Carnot cycle contributing no increase of entropy; therefore any reversible cycle results in no increase of entropy.

A reversible process will have equal and opposite entropy change if it is reversed.

We will now look at the entropy change for some irreversible processes.

## Entropy change for mixing

We can consider the entropy change when two objects transfer heat from one to another, for example when we mix water at two different temperatures. When the temperature difference is fairly small we can approximate the entropy change for each process as

$\Delta S=\frac{Q}{\overline{T}}$

$\overline{T}$ is an average temperature for the process.

In the example of mixing equal quantities of water at different temperatures $T_{H}$ and $T_{L}$, an amount of heat $Q$ will be transferred from the hot water to the cold water until both reach the final temperature $T_{M}=\frac{T_{H}+T_{L}}{2}$. For the water which is cooling, the entropy change will be negative

$\Delta S_{cooling}=-\frac{Q}{T_{cooling}}$

where $T_{H}>T_{cooling}>T_{M}$

For the water whose temperature is increasing the change in entropy is

$\Delta S_{heating}=\frac{Q}{T_{heating}}$

where $T_{L}<T_{heating}<T_{M}$

Since $T_{heating}<T_{cooling}$, the gain $\frac{Q}{T_{heating}}$ is larger in magnitude than the loss $\frac{Q}{T_{cooling}}$, so the total entropy change of the system is positive:

$\Delta S_{cooling}+\Delta S_{heating}>0$
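A quick numerical check of this, with illustrative values I picked for the example (1 kg of water on each side, the specific heat of water):

```python
import math

# Mixing equal masses of water at T_H and T_L (illustrative numbers).
# Exact result from integrating dS = m c dT / T for each portion,
# compared with the Q / T_bar approximation from the text.
m, c = 1.0, 4186.0           # kg, J/(kg K), specific heat of water
T_H, T_L = 350.0, 290.0      # K
T_M = (T_H + T_L) / 2        # final temperature for equal masses

# Exact: Delta S = m c ln(T_f / T_i) for each portion
dS_exact = m * c * (math.log(T_M / T_H) + math.log(T_M / T_L))

# Approximate: Q transferred, divided by each portion's average temperature
Q = m * c * (T_H - T_M)
dS_approx = -Q / ((T_H + T_M) / 2) + Q / ((T_L + T_M) / 2)

print(f"exact  = {dS_exact:.2f} J/K")
print(f"approx = {dS_approx:.2f} J/K")
```

The approximation lands within a fraction of a percent of the exact result, and both are positive, as the second law demands.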

### Transfer of entropy to the environment

In many cases, when considering the total entropy change in a system, we also need to consider the entropy change of the environment. For example, take an object which cools by losing heat to the environment through a quasistatic reversible process, with $dQ=mC\,dT$, where $C$ is the specific heat. Integrating $dS=dQ/T$ gives the object's entropy change, $\Delta S_{object}=mC\ln\frac{T_f}{T_i}$, which is negative as the object cools, while the environment absorbs the same heat at its own lower temperature and therefore gains more entropy than the object loses.
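A numerical sketch of this (the mass and temperatures are illustrative, not from the text): an object cooling quasistatically to the temperature of its environment loses entropy, while the environment, absorbing the same heat at a lower temperature, gains more:

```python
import math

# Illustrative numbers: 1 kg of water cooling from T_i to the room
# temperature T_env. The object loses entropy; the environment gains more.
m, C = 1.0, 4186.0           # kg, J/(kg K)
T_i, T_env = 370.0, 293.0    # K; the object ends at room temperature

# Object: integrate dS = m C dT / T from T_i down to T_env
dS_object = m * C * math.log(T_env / T_i)        # negative

# Environment: absorbs Q = m C (T_i - T_env) at the constant T_env
Q = m * C * (T_i - T_env)
dS_env = Q / T_env                               # positive

print(f"object: {dS_object:.1f} J/K, environment: {dS_env:.1f} J/K, "
      f"total: {dS_object + dS_env:.1f} J/K")
```

The total is positive, as it must be for a process that is irreversible overall.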

## Second law of thermodynamics from an entropy viewpoint

The examples we discussed fit with our expectations of how things work. If we mix hot and cold water, they will equilibrate to the same temperature, and they won't spontaneously separate again into hot and cold water (which would decrease entropy). If we have a hot object in a cooler environment, it will transfer heat to the environment; the reverse, which would decrease entropy, won't happen.

We can express the second law of thermodynamics in terms of entropy:

The entropy of an isolated system never decreases; it either stays constant (for a reversible process) or increases (for an irreversible process). As all real processes are irreversible, the total entropy of a system and its environment increases as a result of any natural process.

While we can decrease the entropy of part of the universe, some other part's entropy will be increased by a greater amount, leading to a continual overall increase of the universe's entropy.

## Order, disorder and availability of energy

Entropy can be seen as a measure of the order of a system.

For example, when we have hot and cold fluids we have a form of order, which is lost when we mix them together. We also lose the capacity to use them for work: while they were separated we could have used them to drive a heat engine, which requires a temperature gradient to do work; once they are mixed we cannot get work from them, even though no energy has been lost.

We can view the continual change of order to disorder as a gradual heating of the universe to a uniform temperature (the expansion of the universe would complicate this, as it would result in a lower final temperature, which might eventually tend to absolute zero). The long-term consequence of the universe acquiring a uniform temperature and a maximal entropy state would be its eventual heat death, in which all mechanical energy would be lost.

## Entropy and Statistics

So far we have considered entropy along the lines it was first proposed by Clausius.

Boltzmann was responsible for giving entropy a statistical basis.

So far, when we have considered the state of the system, we have been referring to its macrostate, i.e., its pressure, volume, temperature, etc. For each of these macrostates there is some set of microstates which give rise to the macrostate.

If we consider a gas, knowing its microstate would imply that we know the velocity and position of each and every molecule. This is impossible to know, but to determine the probability of a given macrostate we don't need the details of the microstates, simply how many microstates correspond to that macrostate. The macrostates which have the greatest number of microstates have the greatest probability of occurrence. Boltzmann expressed entropy in terms of the number of microstates; the entropy of a given macrostate is

$S=k\ln W$

where $W$ is the number of microstates corresponding to that macrostate.
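To make the formula concrete, here is a toy count (a standard textbook model, not specific to ice): N distinguishable gas molecules, each in the left or right half of a box. The macrostate "n molecules on the left" has $W(n)=\binom{N}{n}$ microstates:

```python
import math

# Count microstates for a toy macrostate: N distinguishable gas molecules,
# n of them in the left half of a box. W(n) = C(N, n), and S = k ln W.
k_B = 1.380649e-23  # J/K, Boltzmann constant

def entropy(N, n):
    return k_B * math.log(math.comb(N, n))

N = 10
W = [math.comb(N, n) for n in range(N + 1)]
print(W)  # peaked at n = N/2: the evenly split macrostate has the most microstates
print(f"S(all on one side) = {entropy(N, 0):.3e} J/K")   # W = 1, so S = 0
print(f"S(evenly split)    = {entropy(N, 5):.3e} J/K")
```

Even for N = 10 the evenly split macrostate has 252 microstates while "all on one side" has just 1; for realistic N the imbalance is astronomically larger.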

## Second law as a consequence of statistics

$S=k\ln W$

We can now see that the second law of thermodynamics is simply a statement that a change of a system will be towards a state that is more probable. For example, it is extremely unlikely that all the molecules of gas in a room will arrange themselves neatly on one side of the room (this would be a single microstate), whereas there are a very large number of microstates which obey the Maxwell distribution, in which the molecules are evenly distributed and move randomly.

However, this viewpoint actually leads us to the conclusion that the second law may not be as rigid as we have so far presented it. Processes which decrease entropy are not strictly forbidden; they are just very unlikely to occur, and over time the occurrence of a few statistically unlikely events will have little effect on the overall direction.

This has not stopped the harnessing of such events from being proposed as a means of faster-than-light travel.

## Third law of thermodynamics

The statistical definition of entropy leads to the third law of thermodynamics which defines the absolute value of entropy.

As $S=k\ln W$

we can see that the entropy is equal to zero when there is a single microstate for the system, or $W=1$, which corresponds to a perfectly ordered state occurring at $T=0\mathrm{K}$.

## Microstates in Ice-1H

So now you know it. The configurational entropy of ice can be “easily” computed by counting the number of different ways of ordering the protons on the crystal lattice of oxygen atoms while satisfying the ice rules. We call this the configurational entropy, to distinguish it from the vibrational entropy (which we will see in a future lecture).

Entropy is an extensive property, i.e., it scales with the size of the system. Obviously, the more oxygen atoms we have, the more ways there are of ordering the protons. Indeed, for the two oxygen atoms in the unit cell you have only two choices; let's see them:

To see the animation, right click and choose Animation–> Play.

To see it repeatedly: Animation–>Animation Mode–> Palindrome. And then again Animation–>Play.

There are only two structures: one where the net dipole moment points up and another where it points down. Both correspond to ordered proton structures, and indeed they are the basis of the structure known as Ice XI. Ice XI is the proton-ordered phase of hexagonal ice (Ice Ih).

## Counting Microstates

Linus Pauling was the first person to compute the configurational entropy of ice. Indeed, computing this entropy accurately is one of the most interesting unsolved problems in statistical physics. But the number produced by Pauling is accurate enough for our purposes. You should note that $W$ includes all possible arrangements, even those that exhibit macroscopic order. However, as usually occurs in the statistical mechanics of large numbers of particles, the result is completely dominated by the arrangements that appear completely disordered.

First of all, let's think about the number of possible orientations of an H$_2$O molecule at a given site on the crystal lattice. As you can see, there are 6 possible orientations.
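That count is just "choose which 2 of the 4 tetrahedral bond directions carry the covalently bonded H atoms":

```python
from itertools import combinations

# Each O in ice has 4 tetrahedral O-O bond directions; a water molecule
# points its 2 covalently bonded H atoms along 2 of them, giving
# "4 choose 2" possible orientations.
orientations = list(combinations(range(4), 2))
print(len(orientations), orientations)
```

The 6 pairs correspond to the 6 orientations in the figure.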

Now consider a crystal of N molecules, with N large enough that the surfaces can be ignored. Another way of saying this is that the surface-to-volume ratio is almost zero.

Now, each water molecule in the lattice can be placed in one of the six possible orientations shown above.

Let's for the moment ignore the ice rule that tells us there should be a single proton on each O-O bond line. Then the total number of arrangements for the N molecules is $6^N$. This is the number of permutations with repetition: we have 6 choices to make for each of the N molecules, and order matters.

Now, according to what we have done, each O-O bond will be in one of four equally probable states (H means an H atom is present, - means it is absent):

HH (bond is wrong), H- (bond is correct), -H (bond is correct), - - (bond is wrong).

This implies that the probability of a bond being correctly formed is $P=\frac{2}{4}=\frac{1}{2}$.

Now, each of the N molecules in the crystal can only form 2 O-O bonds with its own H atoms. So if there are N molecules and each molecule contributes 2 bonds, we have a total of 2N bonds.

If the probability of each bond being correctly formed is $\frac{1}{2}$, then the fraction of the $6^N$ possible arrangements in which all 2N bonds are correctly formed is $(\frac{1}{2})^{2N}$.

Then the total number of acceptable configurations in the crystal is: $W=6^N(\frac{1}{2})^{2N}=(\frac{3}{2})^{N}$

And the configurational entropy is: $S=k_B \ln\left(\left(\frac{3}{2}\right)^{N}\right)=N k_B \ln\left(\frac{3}{2}\right)$.

As you see, it is an extensive quantity (it scales linearly with N). We usually give instead the molar entropy, replacing $k_B$ by $k_B N_A=R$. For one mole of ice we have: $S=R\ln(\frac{3}{2})=3.37\ \mathrm{J\,K^{-1}\,mol^{-1}}$. Not too bad if you compare to the experimental value, $S=3.41\ \mathrm{J\,K^{-1}\,mol^{-1}}$!
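Pauling's number is a one-line computation:

```python
import math

# Pauling's estimate: W = 6^N * (1/2)^(2N) = (3/2)^N, so the molar
# configurational entropy of ice is S = R ln(3/2).
R = 8.314462            # J/(K mol), gas constant
S_pauling = R * math.log(1.5)
print(f"Pauling: {S_pauling:.2f} J/(K mol)")   # vs. experiment: about 3.41
```

Most of the small remaining discrepancy with experiment disappears in more careful counts that include the correlations between bonds which Pauling's estimate ignores.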