Thursday, July 30, 2009

Immersive video yet to make its mark

When Google first released its StreetView add-on to Google Maps it immediately caught the attention of millions, despite being quite rudimentary. The concept was not new. I remember viewing a very similar application for Miami a few years earlier. I can't find a link to that site but it was very impressive, with much smoother transitions between the scenes and better image quality. Yet that StreetView predecessor barely registered on the Internet.

Not many may know that Google was also experimenting with a video equivalent of what is now StreetView. It was one of those "jaw dropping" technologies that so far have not made it to the big league. The 360 degree navigation of streaming video is quite impressive, yet Google decided to abandon this capability. I suspect there were at least two reasons: the volume of data that would be required for a movie-like StreetView of the entire world, and potential difficulties in dealing with privacy issues (ie. the lack of capability to efficiently blur faces and number plates, and to block individual scenes, which is much easier with static images). And then there is also the possibility of adding more functionality to interact with static images (street names and symbolic directions, merging StreetView with other imagery on the Internet, etc.) but I think that was more an afterthought than a planned end game from day one.

[video courtesy of immersivemedia.com; click and scroll the mouse to look in any direction!]

Not all "exciting new technologies" make it big, whether because of an overly cautious approach by their creators, lack of practical use, not enough luck with investors, strategic mistakes or something else. Potential users of those technologies are not rational either, as we all know from the outcome of the battle between the VHS and Beta video systems, so "the best one" does not always win. It will be interesting to watch whether immersive video technology can make it into the mainstream of consumer electronics (eg. cameras capable of capturing 360 degree video) and consumer applications (eg. online sightseeing, real estate demonstrations, etc.). For now though, the battle is brewing between Microsoft's Photosynth and Google's StreetView for online interactive imagery viewing capabilities.

Wednesday, July 29, 2009

Google has Photosynth in sight

Google engineers have just added new functionality to StreetView that will take some shine off Microsoft's Photosynth. You now have the option to preview user-uploaded images from Panoramio in StreetView mode AND navigate those images in a Photosynth-like manner (ie. footprints and the relative perspective of related images are displayed on the background image as a white, transparent overlay; double-clicking on the overlay loads a new image into the viewer). Try it for yourself: go to Google Maps and type "Sydney Opera House" in the search box, then select the StreetView option and click on "user photos". When you move the mouse around, white overlays of related imagery will start to appear on the screen.


It is all still a bit raw but, with more photos depicting a given landscape, it should become a much better experience. I believe the next step will be the integration of user-added images with actual StreetView scenes. The same "double-click zoom on white overlay" functionality is already implemented in the StreetView viewer so it may be only a matter of "merging" the two together.

I must admit that Google's implementation is much closer to the original concept behind Photosynth than the product ultimately delivered by Microsoft. That is, the starting point was to organise millions of photos available on the Internet into a seamless, 3-dimensional view of the world. Yet Microsoft abandoned that path and released Photosynth as a personal image viewer (and a stand-alone product for corporate clients). It looks like Google may win yet another battle for dominance on the Internet.

Tuesday, July 28, 2009

The Diet-Heart Hypothesis: Subdividing Lipoproteins

Two posts ago, we made the rounds of the commonly measured blood lipids (total cholesterol, LDL, HDL, triglycerides) and how they associate with cardiac risk.

Lipoproteins Can be Subdivided into Several Subcategories

In the continual search for better measures of cardiac risk, researchers in the 1980s decided to break down lipoprotein particles into sub-categories. One of these researchers is Dr. Ronald M. Krauss. Krauss published extensively on the association between lipoprotein size and cardiac risk, eventually concluding (source):
The plasma lipoprotein profile accompanying a preponderance of small, dense LDL particles (specifically LDL-III) is associated with up to a threefold increase in the susceptibility of developing [coronary artery disease]. This has been demonstrated in case-control studies of myocardial infarction and angiographically documented coronary disease.
Krauss found that small, dense LDL (sdLDL) doesn't travel alone: it typically comes along with low HDL and high triglycerides*. He called this combination of factors "lipoprotein pattern B"; its opposite is "lipoprotein pattern A": large, buoyant LDL, high HDL and low triglycerides. Incidentally, low HDL and high triglycerides are hallmarks of the metabolic syndrome, the quintessential modern metabolic disorder.

Krauss and his colleagues went on to hypothesize that sdLDL promotes atherosclerosis because of its ability to penetrate the artery wall more easily than large LDL. He and others subsequently showed that sdLDL are also more prone to oxidation than large LDL (1, 2).

Diet Affects LDL Subcategories

The next step in Krauss's research was to see how diet affects lipoprotein patterns. In 1994, he published a study comparing the effects of a low-fat (24%), high-carbohydrate (56%) diet to a "high-fat" (46%), "low-carbohydrate" (34%) diet on lipoprotein patterns. The high-fat diet also happened to be high in saturated fat-- 18% of calories. He found that (source):
Out of the 87 men with pattern A on the high-fat diet, 36 converted to pattern B on the low-fat diet... Taken together, these results indicate that in the majority of men, the reduction in LDL cholesterol seen on a low-fat, high-carbohydrate diet is mainly because of a shift from larger, more cholesterol-enriched LDL to smaller, cholesterol-depleted LDL [sdLDL].
In other words, in the majority of people, high-carbohydrate diets lower LDL cholesterol not by decreasing LDL particle count (which might be good), but by decreasing LDL size and increasing sdLDL (probably not good). This has been shown repeatedly, including with a 10% fat diet and in children. However, in people who already exhibit pattern B, reducing fat does reduce LDL particle number. Keep in mind that the majority of carbohydrate in modern America comes from refined wheat and sugar; a diet of unrefined carbohydrate may not have these effects.

Krauss then specifically explored the effect of saturated fat on LDL size (free full text). He re-analyzed the data from the study above, and found that:
In summary, the present study showed that changes in dietary saturated fat are associated with changes in LDL subclasses in healthy men. An increase in saturated fat, and in particular, myristic acid [as well as palmitic acid], was associated with increases in larger LDL particles (and decreases in smaller LDL particles). LDL particle diameter and peak flotation rate [density] were also positively associated with saturated fat, indicating shifts in LDL-particle distribution toward larger, cholesterol-enriched LDL.
Participants who ate the most saturated fat had the largest LDL, and vice versa. Kudos to Dr. Krauss for publishing these provocative data. It's not an isolated finding. He noted in 1994 that:
Cross-sectional population analyses have suggested an association between reduced LDL particle size and relatively reduced dietary animal-fat intake, and increased consumption of carbohydrates.
Diet Affects HDL Subcategories

Krauss also tested the effect of his dietary intervention on HDL. Several studies have found that the largest HDL particles, HDL2b, associate most strongly with HDL's protective effects (more HDL2b = fewer heart attacks). Compared to the diet high in total fat and saturated fat, the low-fat diet decreased HDL2b significantly. A separate study found that the effect persists at one year. Berglund et al. independently confirmed the finding using the low-fat American Heart Association diet in men and women of diverse racial backgrounds. Here's what they had to say about it:

The results indicate that dietary changes suggested to be prudent for a large segment of the population will primarily affect [i.e., reduce] the concentrations of the most prominent antiatherogenic [anti-heart attack] HDL subpopulation.
Saturated and omega-3 fats selectively increase large HDL. Dr. B. G. of Animal Pharm has written about this a number of times.

Wrapping it Up

Contrary to the simplistic idea that saturated fat increases LDL and thus cardiac risk, total fat and saturated fat have a complex influence on blood lipids, the net effect of which is unclear. These blood lipid changes persist for at least one year, so they may represent a long-term effect. It's important to remember that the primary sources of carbohydrate in the modern Western diet are refined wheat and sugar.  Healthier sources of carbohydrate have different effects on blood lipids.

* This is why you may read that small, dense LDL is not an "independent predictor" of heart attack risk. Since it travels along with a particular pattern of HDL and triglycerides, in most studies it does not give information on cardiac risk beyond what you can get by measuring other lipoproteins.

Saturday, July 25, 2009

MRFIT Mortality

The Multiple Risk Factor Intervention Trial (MRFIT) was a very large controlled diet trial conducted in the 1980s. It involved an initial phase in which investigators screened over 350,000 men aged 35-57 for cardiovascular risk factors, including total blood cholesterol. 12,866 participants with major cardiovascular risk factors were selected for the diet intervention trial, while the rest were followed for six years. I discussed the intervention trial here.

During the six years of the observational arm of MRFIT, investigators kept track of deaths in the patients they had screened. They compared the occurrence of deaths from multiple causes to the blood cholesterol values they had measured at the beginning of the study. Here's a graph of the results (source):


Click on the graph for a larger image. Coronary heart disease does indeed rise with increasing total cholesterol in American men of this age group. But total mortality is nearly as high at low cholesterol levels as at high cholesterol levels. What accounts for the increase in mortality at low cholesterol levels, if not coronary heart disease? Stroke is part of the explanation. It was twice as prevalent in the lowest-cholesterol group as it was in other participants. But that hardly explains the large increase in mortality.

Possible explanations from other studies include higher infection rates and higher rates of accidents and suicide. But the study didn't provide those statistics so I'm only guessing.

The MRFIT study cannot be replicated today, because it was conducted at a time when fewer people were taking cholesterol-lowering drugs. In 2009, a 50-year-old whose doctor discovers he has high cholesterol will likely be prescribed a statin, after which he will probably no longer have high cholesterol. This will confound studies examining the association between blood cholesterol and disease outcomes.

Friday, July 24, 2009

New features on Google Map for mobile v3.2

The latest upgrade of Google Maps for mobile phones extends the functionality of the application and allows simultaneous viewing of multiple information layers on top of the background map. The layers include traffic, local search results for business listings, Latitude friend locations and points-of-interest descriptions from Wikipedia. Google also enabled viewing user-created My Maps content as a layer, and there were significant improvements in the presentation of local search results on the map. For a full description of the new features please see the Google Mobile blog.



Google Maps for mobile works on most phones including:
  • Android
  • iPhone (pre-installed)
  • All color BlackBerry devices
  • Most Java-enabled (J2ME) mobile phones
  • Windows Mobile devices with Windows Mobile 5.0 and above
  • Symbian S60 3rd Edition (most new Nokia smartphones and all 3G Symbian devices)
  • Palm devices with Palm OS 5 and above
As with many “things Google”, there are different options to view maps on your mobile device and there is extensive functionality available for each of these options, but you have to discover it all for yourself. The documentation can be confusing. For clarity, maps for mobile are different from the recently upgraded Google Maps v3 available for Internet browsers (including those on mobile devices!).

Thursday, July 23, 2009

The Diet-Heart Hypothesis: A Little Perspective

Now that we've discussed the first half of the diet-heart hypothesis, that saturated fat elevates total and LDL cholesterol, let's take a look at the second half. This is the idea that elevated serum cholesterol causes cardiovascular disease, also called the "lipid hypothesis".

Heart Attack Mortality vs. Total Mortality

We've been warned that high serum cholesterol leads to heart attacks and that it should be reduced by any means necessary, including powerful cholesterol-lowering drugs. We've been assailed by scientific articles and media reports showing associations between cholesterol and heart disease. What I'm going to show you is a single graph that puts this whole issue into perspective.

The following is drawn from the Framingham Heart study (via the book Prevention of Coronary Heart Disease, by Dr. Harumi Okuyama et al.), which is one of the longest-running observational studies ever conducted. The study subjects are fairly representative of the general population, although less racially diverse (largely Caucasian). The graph is of total mortality (vertical axis) by total cholesterol level (horizontal axis), for different age groups:

If you're 80 or older, and you have low cholesterol, it's time to get your affairs in order. Between the ages of 50 and 80, when most heart attacks occur, there's no association between cholesterol level and total mortality. At age 50 and below, men with higher cholesterol die more often. In the youngest age group, the percent increase in mortality between low and high cholesterol is fairly large, but the absolute risk of death at that age is still low. There is no positive association between total cholesterol and mortality in women at any age, only a negative association in the oldest age group.

Here's more data from the Framingham study, this time heart attack deaths rather than total mortality (from the book Prevention of Coronary Heart Disease, by Dr. Harumi Okuyama et al.):

Up to age 47, men with higher cholesterol have more heart attacks. At ages above 47, cholesterol does not associate with heart attacks or total mortality. Since the frequency of heart attacks and total mortality are low before the age of 47, it follows that total cholesterol isn't a great predictor of heart attacks in the general population.

These findings are consistent with other studies that looked at the relationship between total cholesterol and heart attacks in Western populations. For example, the observational arm of the massive MRFIT study found that higher cholesterol predicted a higher risk of heart attack in men age 35-57, but total mortality was highest both at low and high cholesterol levels. The "ideal" cholesterol range for total mortality was between 140 and 260 mg/dL (reference). Quite a range. That encompasses the large majority of the American public.

The Association Between Blood Cholesterol and Heart Attacks is Not Universal

The association between total cholesterol and heart attacks has generally not been observed in Japanese studies that did not pre-select for participants with cardiovascular risk factors (Prevention of Coronary Heart Disease, by Dr. Harumi Okuyama et al.). This suggests that total blood cholesterol as a marker of heart attack risk is not universal. It would not necessarily apply to someone eating a non-Western diet.

Subdividing Cholesterol into Different Lipoprotein Particles Improves its Predictive Value

So far, this probably hasn't shocked anyone. Most people agree that total cholesterol isn't a great marker. Researchers long ago sliced up total cholesterol into several more specific categories, the most discussed being low-density lipoprotein (LDL) and high-density lipoprotein (HDL). These are tiny fatty droplets (lipoproteins) containing fats, cholesterol and proteins. They transport cholesterol, fats, and fat-soluble vitamins between tissues via the blood.

The LDL and HDL numbers you get back from the doctor's office typically refer to the amount of cholesterol contained in LDL or HDL per unit blood serum, but you can get the actual particle number measured as well.
One can also measure the level of triglyceride (a type of fat) in the blood. Triglycerides are absorbed from the digestive tract and manufactured by the liver in response to carbohydrate, then sent to other organs via lipoproteins.

The level of LDL in the blood gives a better approximation of heart attack risk than total cholesterol. If you're living the average Western lifestyle and you have high LDL, your risk of heart attack is substantially higher than someone who has low LDL. LDL particle number has more predictive value than LDL cholesterol concentration. The latter is what's typically measured at the doctor's office. For example, in the EPIC-Norfolk study (free full text), patients with high LDL cholesterol concentration had a 73% higher risk of heart attack than patients with low LDL. Participants with high LDL particle number had exactly twice the risk of those with low LDL number. We'll get back to this observation in a future post.

In the same study, participants with low HDL had twice the heart attack risk of participants with high HDL. That's why HDL is called "good cholesterol". This finding is fairly consistent throughout the medical literature. HDL is probably the main reason why total cholesterol doesn't associate very tightly with heart attack risk. High total cholesterol doesn't tell you if you have high LDL, high HDL or both (LDL and HDL are the predominant cholesterol-carrying lipoproteins).

Together, this suggests that the commonly measured lipoprotein pattern that associates most tightly with heart attack risk in typical Western populations is some combination of high LDL (particularly LDL particle number), low HDL, and high triglycerides.
In the next post, I'll slice up the lipoproteins even further and comment on their association with cardiovascular disease. I'll also begin to delve into how diet affects the lipoproteins.

“Spatial” dilemma

Since I used the word “spatial” in the name of my blog, it is appropriate that I explain my views on the connotations of this word, especially since its meaning is rather vague for most people.

The Free Dictionary defines “spatial” (adjective) as “pertaining to or involving or having the nature of space” or “relating to size, area, or position: spatial dimensions”. The word has an alternative spelling, “spacial” (although it shows up as an error in MS Word), and is closely related to the noun “space”, with its multitude of meanings.

With my “spatial industry” background, the associated meaning of the word is relatively straightforward: locations + maps + GIS = spatial. I am also interested in true 3-dimensional visualisation and modelling (with or without geographic references) and the word “spatial” is very relevant in that context too. In addition, the capacity of Google Maps – currently the main focus of my online experimentation – to show the sky, satellite imagery, the Moon and Mars brings an extra-terrestrial dimension to the meaning of “spatial”. All in all, I favour the broadest possible meaning of “spatial” to define the focus of this blog.

However, I am the first to admit that this broad definition is also a problem in creating a clear identity for the blog. The name “All-things-spatial” will have some meaning for those who have anything to do with the spatial industry, maps or GIS, but will probably mean nothing to everyone else.

In marketing, there is a concept of “positioning” – a clear, unique and advantageous perception of a brand in the customer’s mind. Just to illustrate: if you ask about “fast food”, most people would associate it with McDonald’s or KFC or maybe a handful of other brands. But if you say “spatial”, the most likely response would be: “Eh?”. And that’s the real problem in trying to position an offering using references to “spatial”… Disambiguation and clear segmentation of capabilities are needed in order to make the product or service well defined in the customer’s mind. So, “All-things-spatial” is simply about tools and concepts that add a unique perspective and context to a place and its immediate surroundings.

I am not the only one facing challenges associated with the ambiguity of the word “spatial”, but by focusing on a well defined range of topics in my blog I can draw a clear boundary to define its meaning. It is a much tougher task for all those who try to create an identity and sell the benefits of the “spatial industry” to the rest of the business community and to governments…

The challenge is that what is referred to as the “spatial industry” encompasses a large array of professions and solutions that cannot be defined with a simple, common descriptor. The differences between surveyors, cartographers or remote sensing scientists, or CAD, GIS, image processing and GPS navigation software creators and operators, seem to be larger than the commonalities. They are all operating in the “spatial domain”, yet in different dimensions…

At the same time there are many other professions that utilise “spatial” skills. For example, in IT, database administrators take on management of spatial data, programmers code “spatial functionality” and add maps to non-spatial applications, graphic designers create maps, farmers integrate GPS and remotely sensed imagery for crop management, geologists and geoscientists in general create maps and model 3D structures to help in analysis and presentation of their research, game designers create and use digital elevation models for flight simulations... And the list can go on and on.

So, who is a “spatial professional”? In reality, because the word “spatial” can have so many associations, it could be anyone who has anything to do with broadly defined “space”, including astronauts, astronomers, landscape architects, etc., but many of them would never associate with that description. (By analogy, the fact that someone uses spreadsheets in their day-to-day work does not mean they would call themselves a "spreadsheet professional".)

Trying to draw a boundary between what is “spatial” and what is not can be quite elusive. Let me elaborate on this with the following example. If you have a list of cinemas with a location attribute like “Sydney” or “Melbourne”, some would argue this is a valid spatial dataset. Yet the database that stores that data does not have to be spatial, and querying the data by this attribute does not require any “spatial skills” (it’s a simple SQL statement: select from… where attribute equals ’Sydney’). Even if you add geographic coordinates to identify the locations of individual cinemas, there is still no need for a spatial database, and queries like “select within a bounding box” or “select within 5km radius from a point” can easily be done with simple formulas, as in the sketch below. It is only with complex relationships that spatial skills and tools offer a distinct advantage (eg. queries involving intersections, adjacency, buffering on irregular objects, etc.).
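
To illustrate, here is a minimal sketch in plain Python (the cinema names and coordinates are made up for the example) showing that both a bounding-box query and a “within 5km of a point” query can be answered with nothing more than a loop and the haversine formula:

```python
import math

# Hypothetical dataset: (name, latitude, longitude) for a few cinemas.
cinemas = [
    ("Cinema A", -33.8708, 151.2073),  # Sydney CBD
    ("Cinema B", -33.8915, 151.2767),  # Bondi
    ("Cinema C", -37.8136, 144.9631),  # Melbourne
]

def within_bounding_box(lat, lng, south, west, north, east):
    """'Select within a bounding box' is just four comparisons."""
    return south <= lat <= north and west <= lng <= east

def haversine_km(lat1, lng1, lat2, lng2):
    """Great-circle distance between two points, in kilometres."""
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lng2 - lng1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

# "Select within a bounding box" covering greater Sydney (approximate bounds).
in_sydney = [name for name, lat, lng in cinemas
             if within_bounding_box(lat, lng, -34.1, 150.9, -33.6, 151.4)]
print(in_sydney)  # ['Cinema A', 'Cinema B']

# "Select within 5km radius from a point" (a point near the Sydney Opera House).
origin_lat, origin_lng = -33.8568, 151.2153
nearby = [name for name, lat, lng in cinemas
          if haversine_km(origin_lat, origin_lng, lat, lng) <= 5.0]
print(nearby)  # ['Cinema A']
```

No spatial database or GIS package is involved; it is only when the queries involve irregular geometries (intersections, buffers, adjacency) that dedicated spatial tools start to earn their keep.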

There are many organisations and associations that attempt to capitalise on “spatial” branding and try to bring diverse professions under one common label. In Australia there is the Surveying and Spatial Sciences Institute (created recently by the amalgamation of the Institution of Surveyors and the Spatial Sciences Institute), the Spatial Industries Business Association (renamed from the Australian Spatial Information Business Association), the Geospatial Information & Technology Association, and the Commonwealth Government policy body, the Office of Spatial Data Management. There is also the Spatial Journal published by the Mapping Sciences Institute of Australia, as well as numerous other initiatives with "spatial" references in the title: the Cooperative Research Centre for Spatial Information, the Australian Spatial Data Directory, the Australian Spatial Data Infrastructure, etc. These names have special meaning for people from within a narrowly defined “spatial industry” but remain mostly a mystery to the community at large (ie. the target market for the constituents of those organisations).

Fortunately, individual companies seem to recognise the limitation of focusing too much on “spatial” and are reaching into a diverse range of industries, providing either highly specialised expertise (that can be easily defined and marketed) or a broad range of capabilities (where spatial components are only one aspect of the overall arsenal required to meet a wide range of customer needs).

Wednesday, July 22, 2009

Map basics: datum, coordinate system, projection

Map projections, coordinate systems and geodetic datums are not the most exciting topics to discuss in reference to maps but any user of geographic information should have at least a basic understanding of the concepts.

In a nutshell, the Earth is an oblate spheroid, and in order to represent its surface as a flat map, complex mathematical transformations are required.

[oblate spheroid - image courtesy of Wikipedia]


A geodetic datum defines reference points on the Earth's surface against which position measurements are made. Central to this concept is an associated model of the shape of the Earth (that is, a reference spheroid) used to define a coordinate system.

Map coordinates are usually shown in one of two ways: as geographic coordinates (ie. latitude and longitude values, in degrees) or as grid coordinates (ie. easting and northing values, in metres).

A map projection is a method of representing the surface of a sphere on a plane. By definition, all map projections show a distorted representation of the Earth's surface; therefore different map projections exist in order to preserve some properties of the sphere-like body (ie. area, shape, direction, bearing, distance and/or scale) at the expense of other properties.

What does it all mean? The key point is that commonly quoted “geographic coordinates” (eg. Sydney Opera House: lat, lng) are only meaningful in reference to a specific datum (eg. that point is shifted approximately 200m on the AGD66 datum as compared to the GDA94 datum). And to represent that point properly on a map you will also need to know the projection of the map, otherwise the point may be depicted in the wrong place.


[example of mismatch resulting from source data being in geographic projection and the underlying map in Mercator projection]


More examples to illustrate the point. Satellite-based navigation systems (the Global Positioning System, or GPS) are becoming more and more popular in Australia, so users should be aware that GPS coordinates are based on the WGS84 datum, which is different from the official Australian datum, GDA94 (different spheroid definitions were used). All current official maps and data in Australia are based on GDA94. Luckily, however, the difference between WGS84 and GDA94 is negligible and for most common uses both datums can be used interchangeably.
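
As a concrete illustration, here is a minimal sketch using the pyproj Python library (my choice for the example – any tool that understands EPSG codes would do) to re-express a WGS84 point in GDA94 and in the older AGD66 datum. The exact size of the AGD66 shift depends on which transformation pyproj selects, but it is on the order of the 200m mentioned above:

```python
# pip install pyproj
from pyproj import Transformer

# A point near the Sydney Opera House, in WGS84 (EPSG:4326), as lon/lat.
lon, lat = 151.2153, -33.8568

# WGS84 -> GDA94 (EPSG:4283): the change is negligible for everyday use.
to_gda94 = Transformer.from_crs("EPSG:4326", "EPSG:4283", always_xy=True)
print(to_gda94.transform(lon, lat))

# WGS84 -> AGD66 (EPSG:4202): the same point lands roughly 200 m away.
to_agd66 = Transformer.from_crs("EPSG:4326", "EPSG:4202", always_xy=True)
print(to_agd66.transform(lon, lat))
```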

As to map projections, when you have a map showing just a few streets, projections don’t really matter. Similarly, if you view small scale maps in atlases or on wall posters, projection rarely comes to mind, simply because it has already been chosen by the author to best represent the phenomenon and also to “look nice”. However, when you deal with raw geographic data (whether vector – points, lines and polygons – or raster – images) and need to compile them into a map, the projections of the source data and the final map are of utmost importance. Similarly, if you need to take a reliable measure of distance or area on a map, you have to know which projections preserve those properties and are therefore the most appropriate to use.

The following is an illustration of the “distortions” in the representation of shapes on a map due to different projections. The first image was generated by applying a common projection to the input data and the second shows the result of not applying any projection at all.

[image courtesy of Statistics Canada]


Projections that you are most likely to encounter in Australia are:

1. Geographic/Equirectangular projection: a de facto standard for computer applications because of the direct connection between an image pixel and its geographic position (see the code sketch after this list).

The following are the international reference codes pointing to precise definitions of the transformation: EPSG:4326 (WGS84 datum) and the Australia-specific EPSG:4283 (GDA94 datum) – for all common purposes they are interchangeable, unless you require sub-metre accuracy.

[image courtesy of spatialreference.org]


2. Transverse Mercator projection, used with the Universal Transverse Mercator coordinate system (UTM zones 49 to 56) - suitable for measuring distances and areas; mostly used for medium scale printed maps.


3. Lambert Conformal Conic projection (EPSG:3112) - approximates the distance between two points well; often used for aeronautical charts, small scale maps or road maps.

[image courtesy of spatialreference.org]


4. Mercator projection (EPSG:3395) - used by Google Maps, Virtual Earth/Bing Maps and all tile-based online maps – the distance between two points on the map is distorted, increasingly so the further you move from the equator (see the sketch below).
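
Two of the points above lend themselves to a few lines of plain Python (a sketch with made-up image dimensions, not any particular product’s API): the pixel-to-coordinate mapping that makes the equirectangular projection so convenient (item 1), and the Mercator scale factor behind the distortion mentioned in item 4:

```python
import math

# Item 1: in an equirectangular world image, pixel <-> lon/lat is a linear mapping.
# Assume a hypothetical 3600 x 1800 pixel image covering the whole globe.
WIDTH, HEIGHT = 3600, 1800

def pixel_to_lonlat(px, py):
    lon = px / WIDTH * 360.0 - 180.0   # 0 .. WIDTH  maps to -180 .. +180
    lat = 90.0 - py / HEIGHT * 180.0   # 0 .. HEIGHT maps to  +90 .. -90
    return lon, lat

print(pixel_to_lonlat(1800, 900))  # (0.0, 0.0) - the image centre is lat/lng (0, 0)

# Item 4: on a Mercator map the local scale factor grows as 1/cos(latitude).
for lat in (0, 30, 60, 80):
    scale = 1.0 / math.cos(math.radians(lat))
    print(f"latitude {lat:2d}: lengths drawn ~{scale:.2f}x larger than at the equator")
```

At 60 degrees latitude the scale factor is already 2, which is why Greenland looks so oversized on tile-based web maps.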


Individual States in Australia also define their own local projections for a variety of purposes, so it is always wise to check the metadata before putting the data to any use.

Monday, July 20, 2009

The Diet-Heart Hypothesis: Stuck at the Starting Gate?

The diet-heart hypothesis is the idea that (1) dietary saturated fat and, in some versions, dietary cholesterol raise blood cholesterol in humans, and (2) thereby contribute to the risk of heart attack.

I'm not going to spend a lot of time on the theory in relation to dietary cholesterol because the evidence that typical dietary amounts cause heart disease in humans is weak.  Here's a graph from the Framingham Heart study (via the book Prevention of Coronary Heart Disease, by Dr. Harumi Okuyama et al.) to drive home the point. Eggs are the main source of cholesterol in the American diet. In this graph, the "low" group ate 0-2 eggs per week, the "medium" group ate 3-7, and the "high" group ate 7-14 eggs per week (click for larger image):

The distribution of blood cholesterol levels between the three groups was virtually identical. The study also found no association between egg consumption and heart attack risk. Dietary cholesterol does not have a large impact on serum cholesterol in the long term, perhaps because humans are adapted to eating cholesterol. Most people are able to adjust their own cholesterol metabolism to compensate when the amount in the diet increases. Rabbits don't have that feedback mechanism because their natural diet doesn't include cholesterol, so feeding them dietary cholesterol increases blood cholesterol and causes vascular pathology.

The first half of the diet-heart hypothesis states that eating saturated fat raises blood cholesterol. This has been accepted without much challenge by diet-health authorities for nearly half a century. In 1957, Dr. Ancel Keys proposed a formula (Lancet 2:1959. 1957) to predict changes in total cholesterol based on the amount of saturated and polyunsaturated fat in the diet. This formula, based primarily on short-term trials from the 1950s, stated that saturated fat is the primary dietary influence on blood cholesterol.
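
For reference, the commonly cited later form of the Keys equation (as reproduced in secondary sources; the coefficients in the original 1957 paper differ, so treat this as an illustration of the formula's structure rather than an exact quote) is:

\[
\Delta \mathrm{Chol} \approx 1.35\,(2\,\Delta S - \Delta P) + 1.5\,\Delta Z,
\qquad Z = \sqrt{\text{dietary cholesterol (mg) per 1000 kcal}}
\]

where ΔChol is the predicted change in serum cholesterol (mg/dL), and ΔS and ΔP are the changes in the percentage of calories from saturated and polyunsaturated fat. Note the built-in assumption: saturated fat is weighted to raise cholesterol twice as strongly as polyunsaturated fat lowers it.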

According to Keys' interpretation of the trials, saturated fat raised, and to a lesser extent polyunsaturated fat lowered, blood cholesterol.
But there were significant flaws in the data from the very beginning, which were pointed out in this critical 1973 literature review in the American Journal of Clinical Nutrition (free full text).

The main problem is that the controlled trials typically compared saturated fats to omega-6 linoleic acid (LA)-rich vegetable oils, and when serum cholesterol was higher in the saturated fat group, this was most often attributed to the saturated fat raising blood cholesterol rather than the LA lowering it. When a diet high in saturated fat was compared to the basal diet without changing LA, often no significant increase in blood cholesterol was observed. Studies claiming to show a cholesterol-raising effect of saturated fat often introduced it after an induction period rich in LA. Thus, the effect sometimes had more to do with LA lowering blood cholesterol than saturated fat raising it. This is not at all what I was expecting to find when I began looking through these trials.


Reading through the short-term controlled trials, I was surprised by the variability and lack of agreement between them. Some of this was probably due to a lack of control over variables and non-optimal study design. But if saturated fat has a dominant effect on serum cholesterol in the short term, it should be readily and consistently demonstrable.  

The long-term data are not kind to the diet-heart hypothesis. Reducing saturated fat while greatly increasing LA certainly does lower blood cholesterol substantially. This was the finding in the well-controlled Minnesota Coronary Survey trial, for example (14% reduction). But in other cases where LA intake changed less, such as MRFIT, the Women's Health Initiative Diet Modification trial and the Lyon Diet-Heart trial, reducing saturated fat intake had little or no effect on total cholesterol or LDL (0-3% reduction).  The small changes that did occur could have been due to other factors, such as increased fiber and phytosterols, since these were multiple-factor interventions.

Another blow to the idea that saturated fat raises cholesterol in the long term comes from observational studies. Here's a graph of data from the Health Professionals Follow-up study, which followed 43,757 health professionals for 6 years (via the book Prevention of Coronary Heart Disease by Dr. Harumi Okuyama et al.):

What this graph shows is that at a relatively constant LA intake, neither saturated fat intake nor the ratio of LA to saturated fat were related to blood cholesterol in freely living subjects. This was true across a wide range of saturated fat intakes (7-15%).

There's more. If saturated fat were important in determining the amount of blood cholesterol in the long term, you'd expect populations who eat the most saturated fat to have high blood cholesterol levels. But that's not the case. The Masai traditionally get a high proportion of their calories from milk fat, half of which is saturated. In 1964, Dr. George V. Mann published a paper showing that traditional Masai warriors eating practically nothing but very fatty milk, blood and meat had an average cholesterol of 115 mg/dL in the 20-24 year age group. For comparison, he published values for American men in the same age range: 198 mg/dL (J. Atherosclerosis Res. 4:289. 1964). Apparently, eating three times the saturated animal fat and several times the cholesterol of the average American wasn't enough to elevate their blood cholesterol. What does elevate the cholesterol of a Masai man? Junk food.

Now let's swim over to the island of Tokelau, where the traditional diet includes nearly 50% of calories from saturated fat from coconut. This is the highest saturated fat intake of any population I'm aware of. How's their cholesterol? Men in the age group 20-24 had a concentration of 168 mg/dL in 1976, which was lower than Americans in the same age group despite a four-fold higher saturated fat intake.
Tokelauans who migrated to New Zealand, eating half the saturated fat of their island relatives, had a total cholesterol of 191 mg/dL in the same age group and time period, and substantially higher LDL (J. Chron. Dis. 34:45. 1981). Sucrose consumption was 2% on Tokelau and 13% in New Zealand. Saturated fat seems to take a backseat to some other diet/lifestyle factor(s).  Body fatness and excess calorie intake are good candidates, since they influence circulating lipoproteins.

Does dietary saturated fat influence total cholesterol and LDL over the long term? I don't have the answers, but I do think it's interesting that the evidence is much less consistent than it's made out to be. It may be that if dietary saturated fat influences total cholesterol or LDL concentration in the long term, the effect is secondary to other factors. That being said, it's clear that linoleic acid, in large amounts, reduces circulating total cholesterol and LDL.

Free GPS navigation tool for iPhone

A new entrant into the Australian GPS navigation market, NAVIGON AG, has just released a free application for the iPhone that allows users to display their current position on a map as well as search for nearby points of interest (POIs). MobileNavigator LITE is a scaled-down version of NAVIGON’s flagship product, which will soon be launched in Australia and will transform any iPhone 3G equipped with the new Apple OS 3.0 operating system, and any iPhone 3G S, into a complete navigation device. The LITE version also works on iPod Touch devices with OS 3.0.

[image courtesy of www.sync-blog.com]

Although the LITE version has no active route guidance, it comes with free NAVTEQ® maps which are saved directly on the device, eliminating the need for a paid subscription to a map service or for downloading the maps as a data feed. The downside is a massive initial download (reported to be more than 1GB!).

Thursday, July 9, 2009

The Finnish Mental Hospital Trial

This diet trial was conducted between 1959 and 1971 in two psychiatric hospitals near Helsinki, Finland. One hospital served typical fare, including full-fat milk and butter, while the other served "filled milk", margarine and polyunsaturated vegetable oils. Filled milk has had its fat removed and replaced by an emulsion of vegetable oil. As a result, the diet of the patients in the latter hospital was low in saturated fat and cholesterol, and high in polyunsaturated fat compared to the former hospital. At the end of six years, the hospitals switched diets. This is known as a "crossover" design.

The results were originally published in 1972 in the Lancet (ref), and a subset of the data were re-published in 1979 in the International Journal of Epidemiology (ref). They found that during the periods that patients were eating the diet low in saturated fat and cholesterol, and high in vegetable oil, male participants (but not females) had roughly half the incidence of heart attack deaths. There were no significant differences in total mortality in either men or women. The female data were omitted in the 1979 report.

This study is often cited as support for the idea that saturated fat increases the risk of heart attack. The reason it's cited so often is that it's one of a minority of trials that came to that conclusion. The only other controlled trial I'm aware of that replaced animal fat with polyunsaturated vegetable oil (without changing other variables at the same time) and found a statistically significant decrease in cardiovascular deaths was the Los Angeles Veterans' Administration study. However, there was no difference in total mortality, and there were significantly more heavy smokers in the control group. The difference in heart attack deaths in the V.A. trial was 18%, far less than the difference seen in the Finnish trial.

I can cite three controlled trials that came to the opposite conclusion, that switching saturated fat for vegetable oil increases cardiovascular mortality and/or total mortality: the Anti-Coronary Club Trial (4 years), the Rose et al. corn oil trial (2 years), and the Sydney Diet-Heart trial (5 years). Other controlled trials found no difference in total mortality or heart attack mortality from this intervention, including the National Diet-Heart Study (2 years) and the Medical Research Council study (7 years). Thus, the Finnish trial is an outlier whose findings have never been replicated by better-conducted trials.

I have three main bones to pick with the Finnish trial. The first two are pretty bad, but the third is simply fatal to its use as support for the idea that saturated fat contributes to cardiovascular risk:

1) A "crossover" study design is not an appropriate way to study a disease with a long incubation period. How do you know that the heart attacks you're observing came from the present diet and not the one the patients were eating for the six years before that? The Finnish trial was the only trial of its nature ever to use a crossover design.

2) The study wasn't blinded. When one wants to eliminate bias in diagnosis for these types of studies, one designs the study so that the physician doesn't know which group the patients came from. That way he can't influence the results, consciously or unconsciously. Obviously there was no way to blind the physicians in this study, because they knew what the patients in each hospital were eating. I think it's interesting that the only outcome not susceptible to diagnostic bias, total mortality, showed no significant changes in either men or women.

3) The Finnish Mental Hospital trial was not actually a controlled trial. In an editorial in the November 1972 issue of the Lancet, Drs. John Rivers and John Yudkin pointed out, among other things, that the amount of sugar varied by almost 50% between diet periods. In the December 30th issue, the lead author of the study responded:
In view of the design of the experiment the variations in sugar intake were, of course, regrettable. They were due to the fact that, aside from the fatty-acid composition and the cholesterol content of the diets, the hospitals, for practical reasons, had to be granted certain freedom in dietary matters.
In other words, the diets of the two hospitals differed significantly in ways other than their fat composition. Sugar was one difference. Carbohydrate intake varied by as much as 17% and total fat intake by as much as 26% between diet periods (on average, carbohydrate was lower and total fat was higher in the polyunsaturated fat group). The use of psychiatric drugs with known cardiovascular side effects differed substantially between groups and could have accounted for some of the difference in cardiovascular events.  

The definition of a controlled trial is an experiment in which all variables are kept reasonably constant except the one being evaluated. Therefore, the Finnish trial cannot rightfully be called a controlled trial. The fact that the result has never been replicated casts further doubt on the study.
I could continue listing other problems with the study, such as the fact that the hospital population included in the analysis had a high turnover rate (variable, but as high as 40%), and patients were included in the analysis even if they were at the hospital for as little as 50% of the time between first admission and final discharge (i.e., they came and went). But what's the use in beating a dead horse?


Tuesday, July 7, 2009

Google threat to real estate listing duopoly in Oz

On Monday Google announced new functionality for its online map application: real estate listings search (rentals and properties for sale). Current data is supplied by the Real Estate Institute of Western Australia and homehound.com.au, but inevitably the arrangements will be extended to include more service providers. The most obvious candidates are the big real estate franchises, because they can only gain from listing on Google Maps. That is, they can capitalise on the opportunity to generate extra traffic directly to their sites and, at the same time, reduce their expensive reliance on the local market leaders – realestate.com.au and domain.com.au.


So far Google's entry into the real estate segment of the online market has not been taken too seriously by industry insiders. Simon Baker, former CEO of realestate.com.au, reportedly said that “…Google Maps has a lot of work to do before it can obtain market dominance”. And Peter Ricci, from the real estate industry news website Business2, “agreed […] that Google Maps will have to cover a lot of ground before it is considered a major player”. As yet, there have been no official comments from companies operating in the industry.

The market and investors in companies like realestate.com.au also seem not to be perturbed at all by the announcement (REA Group shares stayed relatively flat on Monday and Tuesday). The reality of competing directly with Google has not yet sunk in with market analysts and investors. But the coming weeks may be interesting.

It is uncertain whether there are any financial arrangements between the data suppliers and Google, but if this is just a freebie for both sides then it won’t be long before the market for online real estate listings in Australia starts to transform dramatically. And even if this is not a totally free arrangement, as one could suspect based on Google's recent refocusing on money-generating deals, smaller real estate listing companies will potentially be the big winners. Unlike for local business listings, there is at present no way for agents to list directly with Google, so they will still have to sign up with Google's data suppliers (and pay) to appear on Google Maps. For a share of that revenue Google will drive traffic to those sites, and away from the majors.

REA Group (realestate.com.au) and Fairfax (domain.com.au) have enjoyed dominance in Australia for quite some time and they have a very lucrative revenue model. But no one can ignore “the Google effect”. You only have to recall what happened to the dominance of Telstra's whereis.com mapping site. Google now commands a hefty 75% of the market for online maps, and at least one aspiring online map provider has closed in Australia since Google entered that market (I will not mention the site as it is now just a link farm). The fact is that Google has more than 9.5 million unique visitors a month, compared to realestate.com.au and domain.com.au's combined 6 million. And that is a very strong competitive position to start with.

It’s all about traffic. In the short term, there will probably be not much harm to the existing operators because of their revenue model (ie. fat fees from real estate agents), but when users start to desert the sites, agents will begin to question the premiums currently charged. Take away the traffic to those sites and agents will leave for where the prospects are more abundant and cheaper to reach. And Google has a powerful weapon for directing the traffic – its search engine. Not that the company will do anything malicious, oh no! It will be enough if it simply starts to present its own listings at the top of relevant search results. All others have to pay hefty prices to get the same exposure. According to alexa.com, both realestate.com.au and domain.com.au get 15% of their traffic from search engines (read: Google!). Considering the size of their respective traffic (3.5 million and 1.6 million unique visitors, 5.1 million combined), that 15% amounts to a massive 750,000 unique visitors a month. Although REA Group is diversifying from a pure listing provider to a broader business services category, the same cannot be said about domain.com.au. Either way, both companies are very vulnerable to changes in online traffic.


Unique visitors trends (courtesy of Google Trends)
homehound.com.au is a minnow for now, with only 210,000 uniques a month, but it has managed to sign up almost 4,000 real estate agents (compared to realestate.com.au's 9,300), so the extent of its listings should be pretty comprehensive. No wonder Google picked them as a partner. To prove the point, I did a quick comparison of what is listed on the three above-mentioned sites for Mosman, Sydney. Here are the figures for houses:

realestate.com.au: 92
domain.com.au: 128
homehound.com.au: 59

I didn’t check specifically for overlaps, but considering major agents list on all three sites, the overlap should be significant (especially for high-end properties).

The Google Maps interface may not be the most attractive or intuitive to use, but the company has one more ace in this game - a horde of independent developers who will be very keen to use Google’s free real estate search service once it is released (online thematic map provider aus-emaps.com will be one of them!). It is not difficult to imagine a multitude of attractive applications being developed overnight, which will further fragment the market.

There is one more factor in Google's favour. The company has the means and the desire to dominate the mobile services market. A string of recent investments – its own mobile phone handset, the mobile-friendly Google Maps v3 and now also the real estate listing search service – is a good example of the company's ambitions. And the independent developer community can bring many new handy tools to help Google win the battle in this space.

If REA Group and Fairfax have smart strategists who can act decisively, they will join Google just to slow down its advance and capitalise on the extra traffic that can come from this source (Telstra eventually came to its senses and got into bed with Google to salvage what is left of its local listing service, yellowpages.com.au). There is also substantial information value in the listing inventory, so granting free access to that resource to the developer community may buy them some more time and some market share. Smart partnerships with Telstra’s whereis.com map and its mobile services, or Microsoft’s Bing search engine and Virtual Earth map, may also help.

If Telstra had acted faster to gain a "first mover advantage" with its maps and had released a free API well in advance of Google, rather than charging $100 for a single-point map, it might still be ahead of Google in the online map market in Australia. This shows how vulnerable Australian companies are in a truly global market, even ones that are large by local standards.

If the companies dominating the real estate listing market do not act promptly and decisively, they have already lost the battle, as maintaining the status quo is no longer an option. In my humble opinion a shake-up and long-term structural changes in the industry are inevitable.

By way of disclosure: I have no direct investments in the companies mentioned in this post, but I do have a vested interest in getting access to that whole listing directory! Counting on you again, Google!

* * * * * * * * * * * *
Update [9/7/2009]

Further details are emerging about the arrangements for real estate listings on Google Maps. New Zealand’s scoop.co.nz quotes Google Product Manager Andrew Foster saying that “…The new mapping service allows real estate agents, franchises and websites to upload their house listings directly onto Google Maps free of charge.”

Although there are no obvious pointers on the Google site to this functionality, Simon Baker names Google Base as a likely tool. This could potentially get very messy if multiple websites (plus individual sellers themselves!) start listing the same property on Google multiple times…

Formal responses from online listing companies are also starting to circulate in the media. And how drastically the views oppose each other! While the Australian market leader realestate.com.au plays down the threat and will not engage with Google at this time, its New Zealand counterpart realestate.co.nz has enthusiastically embraced the opportunity of the new partnership. For now Fairfax will not be taking up Google's offer either.

It will be interesting to review the winners and losers in 12 months' time…

Interesting discussions:
crikey.com.au
mumbrella.com.au
www.realestate.co.nz

Monday, July 6, 2009

Unrefined vs. Refined Carbohydrates and Dental Cavities

There's a definite association between the consumption of refined carbohydrates and dental cavities. Dr. Weston Price pointed this out in a number of transitioning societies in his epic work Nutrition and Physical Degeneration. Many other anthropologists and dentists have observed the same thing.

I believe, based on a large body of anthropological and medical data, that it's not just an association-- sugar and flour cause cavities. But why? Is it that they lack micronutrients-- the explanation favored by Price-- or do they harm teeth by feeding the bacteria that participate in cavity formation? Or both?

I recently found an interesting article when I was perusing an old copy of the Journal of Dental Research: "A Comparison of Crude and Refined Sugar and Cereals in Their Ability to Produce in vitro Decalcification of Teeth", published in 1937 by Dr. T. W. B. Osborn et al. (free full text). I love old papers. They're so free of preconceptions, and they ask big questions. The authors begin with the observation that the South African Bantu, similar to certain cultures Dr. Price visited, had a low prevalence of tooth decay when eating their native diet high in unrefined carbohydrate foods. However, their decay rate increased rapidly as modern foods such as white flour and refined sugar became available.

To test whether refined carbohydrates have a unique ability to cause tooth decay, the investigators took pieces of teeth that had been extracted for reasons other than decay (for example, crowding), and incubated them with a mixture of human saliva and several different carbohydrate foods:
  • crude cane juice
  • refined cane sugar
  • whole wheat flour
  • white wheat flour
  • whole corn meal
  • refined corn meal
After incubating teeth in the solutions for 2-8 weeks at 37 C (human body temperature), they had trained dentists evaluate them for signs of decalcification. Decalcification is a loss of minerals that is part of the process of tooth decay. Teeth, like bones, are mineralized primarily with calcium and phosphorus, and there is a dynamic equilibrium between minerals leaching out of the teeth and minerals entering them.

The researchers used teeth incubated in saline solution as the reference. The dentists were "blinded", meaning they didn't know which solution each tooth came from. This is a method of reducing bias. Here are some of the results. Cane juice vs. refined sugar:

Unrefined cane juice was not very effective at causing decalcification, compared to refined sugar. This was a surprise to me. Here is the result for wheat:

Note that the scale is different on this graph. Wheat, and particularly refined wheat, is very good at decalcifying teeth in vitro. Corn:

Refined corn is much more effective at decalcifying teeth than whole meal corn. Next, the investigators performed an experiment where they compared the three types of refined carbohydrate to one another:
As one would predict from the graphs above, refined wheat is worse than refined corn, which in turn is worse than refined sugar. This is really at odds with conventional wisdom.

It's important to keep in mind that these results are not necessarily directly applicable to a living human being, who wouldn't let a mouthful of wheat porridge sit in his mouth for five weeks. But it does show that refining carbohydrates may increase their ability to cause cavities due to a direct effect on the teeth (rather than by affecting whole-body nutritional status, which they do as well).

The authors tested the acidity of the different solutions, and found no consistent differences between them (they were all at pH 4-5 within 24 hours), so acid production by bacteria didn't account for the results. They speculated that the mineral content of the unrefined carbohydrates may have prevented the bacterial acids from leaching minerals out of the teeth. Fortunately for us, they went on to test that speculation in a series of further investigations.

In another paper, Dr. T. W. B. Osborn and his group showed that they could greatly curb the decalcification process by adding organic calcium and phosphorus salts to the solution. This again points to a dynamic equilibrium, where minerals are constantly leaving and entering the tooth structure. The amounts of calcium and phosphorus required to inhibit calcification were similar to the amounts found in unrefined cane sugar, wheat and corn. This suggests the straightforward explanation that refined sugar and grains cause decay at least in part because most of the minerals are removed during the refining process.

However, we're still left with the puzzling fact that wheat and corn flour decalcify teeth in vitro more effectively than cane juice. I suspect that has to do with the phytic acid content of the grains, which binds the minerals and makes them partially unavailable to diffusion into the teeth. Cane juice contains minerals, but no phytic acid, so it may have a higher mineral availability. This explanation may not be able to account for the fact that refined sugar was also less effective at decalcifying teeth than refined wheat and corn flour. Perhaps the residual phytic acid in the refined grains actually drew minerals out of the teeth?

No, I'm not saying you can eat sugar with impunity if it's unrefined. There isn't a lot of research on the effects of refined vs. unrefined sugar, but I suspect too much sugar in any form isn't good. But this does suggest that refined carbohydrates may be particularly effective at promoting cavities, due to a direct demineralizing effect on teeth subsequent to bacterial acid production. It also supports Dr. Price's contention that a food's micronutrient content is the primary determinant of its effect on dental health.

Reversing Tooth Decay
Preventing Tooth Decay
Dental Anecdotes

Friday, July 3, 2009

Mobile map giveaway from Nokia

Nokia, the world’s largest mobile handset supplier, has just released the full production version of its free Ovi Maps. The 3.0 version includes 3-D views (and 3-D landmarks for over 200 cities), weather information, walking and driving directions, and POI (point of interest) data from Lonely Planet, Michelin and Wcities. Ovi Maps 3.0 can be downloaded from Nokia's Web site and will work on any phone compatible with the Symbian S60 3rd and 5th Edition operating systems (see the list of supported handsets).



[image courtesy of www.informationweek.com]


The launch of the first version of Ovi Maps coincided with Nokia's acquisition of Navteq for US$8.1 billion in 2007. The market considered the purchase price excessive. As one analyst put it, “…we think [it] is more about long-term control and inhibiting competition than about a financial investment in a growing asset”. But for Nokia, the acquisition was a logical step in a transformation from a world-leading handset vendor into a provider of Internet-based services and applications over a broad range of Nokia-branded devices.

As the competition for the mobile and online services market intensifies, more and more companies will be prepared to bundle functional widgets with their core offerings just to maintain their position. It means there will be more and more freebies for consumers, but also that marginal industries, like mobile and online mapping technology vendors, will be devalued (that is, a few lucky ones will be bought out and the rest will struggle to compete with freebies). This is another example of how online/mobile maps are becoming just a commodity.