Tuesday, October 27, 2009

Perils of online services

The online world, reliant on distributed services, is such a frustrating place. The events of the last few days have totally drained my enthusiasm for “adding value” and “mashing up” all that free content available on the Internet. I feel like I am running in circles, constantly adjusting old applications to keep up with endless changes to browsers and data services, rather than moving forward. A slap of reality, I guess… And don’t get me started on the advice that I should keep my applications “future proof” and stick to “standards”! I try to, but others don’t follow the same rules!

How can you possibly future-proof for IE8 not being able to handle screen clicks (and hence requiring yet another, IE8-specific, hack to make Google Maps work correctly)? How can you make your JavaScript code read the CSS “background” property consistently in all browsers (as it happens, Firefox, Internet Explorer and Opera handle it totally differently!)? There are millions of small things that collectively make the life of an online application developer pretty miserable… unless someone else is paying handsomely for her/his time!
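To give a flavour of the kind of hack I mean, here is a minimal sketch (not the exact code from my applications) of reading an element’s background colour in a way that tolerates the browser differences – Firefox and Opera expose computed styles via getComputedStyle, older Internet Explorer via currentStyle, and the shorthand “background” property is often empty in both, so it is safer to ask for “background-color” explicitly:

// Minimal cross-browser sketch - not the exact hack from my code.
function getBackgroundColor(el) {
  if (window.getComputedStyle) {
    // W3C way (Firefox, Opera, Safari, Chrome)
    return window.getComputedStyle(el, null).getPropertyValue("background-color");
  } else if (el.currentStyle) {
    // IE way - camelCase property name; may return a named colour or a hex value
    return el.currentStyle.backgroundColor;
  }
  return el.style.backgroundColor; // last resort: inline style only
}

// Usage: var colour = getBackgroundColor(document.getElementById("map"));

And even then the two branches can return the same colour in different notations, which is exactly the sort of inconsistency that keeps generating maintenance work.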

Not even the largest Internet websites do it all by the book. Just open your JavaScript console and see how many problems are reported by your browser when you visit popular pages. So, I am not the only one struggling to make it all work, somehow and despite it all…

However, the biggest cause of frustration for me is data/service providers. My entire website is built with third-party services and content. That is what I set out to do from the start – a big experiment to determine whether it is a viable model for online existence. There are many sources of free information available for mashing up, but there is a big cost in continuous maintenance to keep up with the latest changes.

Just this Monday I found out that all my weather information pages were not working because the Bureau of Meteorology services stopped responding! I am a legitimate subscriber, not a site scraper (although that second option crossed my mind many times, as scraping would be more reliable!). I am yet to hear about the causes and whether it is just a temporary disruption or something more permanent… And to add to my frustration, YouTube also happened to update their URL format for playlists, which means I will have to put more work into the Online Video Player application (the old URLs still work but users cannot play any newly created playlists). Then there are several GeoRSS feeds I use in my Hazard Monitor that also changed format, so the information is no longer showing up on my pages…

I will risk the statement that the examples quoted above are proof that interoperability on a global scale does not work! That is, as long as suppliers have only their own interests in mind (ie. the “caveat emptor” principle – take my feeds/services, but we will keep changing them as we see fit) and as long as information and services are supplied on an “as is” basis, there cannot be any viable online interoperable environments. Because there is no guarantee that the services will be there when you need them (as I experienced during the Victorian bushfires in February 2009, when fire hotspot data services were unable to cope with the demand. And who says that Yahoo, Microsoft or Google cannot stop serving maps, emails etc.? Too big to fail?) Yes, it all seems to work OK most of the time, but only thanks to myriads of hacks, billions of dollars and uncounted hours spent on maintenance. We have learnt to live with less than optimal arrangements, but that is not proof that “all is fine”…

Let me quote another example. OGC web map standards were developed in the early 2000s, but we are yet to see globally consistent deployments in a fully interoperable fashion – with proper and interoperable data discoverability portals (and metadata!), service delivery undertakings from suppliers, and authoritative and comprehensive information sources. Don’t get me wrong: the US Government is doing it, the EU is doing it, and in Australia the SLIP and AuScope projects are good examples of where OGC standards were put to good use. But these only work because they are implemented in tightly controlled environments (ie. end-to-end implementations, from access to source data, through to cataloguing and dissemination)! These are not collections of random service nodes, but only nodes that comply with that particular environment’s standards. My recent post summarising Ed Parsons’ thoughts on Spatial Data Infrastructure has more on the issue if you care to read on.

Anyway, enough of grievances for one day. Back to hacking my way through the problems!

Thursday, October 22, 2009

Free Address Validation Tool

Today I am announcing the release of another freebie from aus-emaps.com – the Address Validation Tool. It is an online application for geocoding and validating address information in a semi-automated fashion. It is built with the Google Maps API and the Google geocoding engine, and is suitable for handling small to medium volumes of data.

Geocoded geographic coordinates of points of interest can be adjusted manually by repositioning the location marker on the map (latitude and longitude will be updated from the corresponding map coordinates). Address and accuracy code details can also be edited manually before saving the record. All saved records can be processed into CSV, KML or GeoRSS output format on completion. Individual records in the input data are identified with a sequence number which is maintained throughout the entire process to facilitate easy reconciliation of the output file with the original information.

Geocoded information is classified according to accuracy, eg. “address level accuracy”, “street level accuracy”, “town/city level accuracy” etc. Counts of records in each accuracy level are maintained during the process, and all saved points can be previewed on the map at any time.
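For the technically curious, the sketch below shows the general approach – geocoding a single address with the Google Maps API geocoder and letting the user fine-tune the result by dragging the marker before saving. It is a simplified illustration only, not the tool’s actual source; it assumes the version 3 geocoder and the function and field names are mine:

// Simplified sketch only - not the tool's actual source. Assumes Google Maps API v3.
var geocoder = new google.maps.Geocoder();

// Geocode one address and place a draggable marker at the best match.
function geocodeOne(address, map, callback) {
  geocoder.geocode({ address: address }, function (results, status) {
    if (status !== google.maps.GeocoderStatus.OK) {
      callback(null, status); // record the failure and move on to the next address
      return;
    }
    var best = results[0];
    var marker = new google.maps.Marker({
      map: map,
      position: best.geometry.location,
      draggable: true // allows manual repositioning of the geocoded point
    });
    callback({ result: best, marker: marker }, status);
  });
}

// When "Save" is pressed, read the (possibly dragged) marker position
// along with any manually edited address and accuracy details.
function saveRecord(id, record) {
  return {
    id: id, // sequence number carried through to the output file
    address: record.result.formatted_address,
    accuracy: record.result.geometry.location_type, // eg. ROOFTOP, APPROXIMATE
    lat: record.marker.getPosition().lat(),
    lng: record.marker.getPosition().lng()
  };
}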


Address validation is a 3 step process:

Step 1. Paste the list of addresses or locations to be geocoded and validated into the text area in the “Input” tab and click the “Press to Start/Reset!” button to initiate the process.


Step 2. Edit geocoded information in the “Edit” tab and save the results (one record at a time). The “Save” button saves the current record and geocodes the next one from the input list. Any text and accuracy code edits will be saved as well. Use the “Next” button to skip to the next record on the input list without saving (skipped records will not be included in the final output file).



Step 3. Generate output from the saved records to reconcile with the original information. CSV is the most obvious choice for updating the original dataset. Although the KML and GeoRSS outputs generated by the tool can be used with Google Maps or Google Earth without further edits, it is recommended that you update the content of at least the "title" and "description" elements to improve presentation of the information.
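To give an idea of what the output step does behind the scenes, here is a rough sketch of turning saved records into CSV and minimal KML strings. The field names are illustrative only, not the tool’s actual code:

// Rough sketch only - field names are illustrative, not the tool's actual code.
function toCsv(records) {
  var rows = ["id,address,accuracy,latitude,longitude"];
  for (var i = 0; i < records.length; i++) {
    var r = records[i];
    // Address goes out as a single quoted text field (see the tips below)
    rows.push(r.id + ',"' + r.address.replace(/"/g, '""') + '",' +
              r.accuracy + "," + r.lat + "," + r.lng);
  }
  return rows.join("\n");
}

function toKml(records) {
  var placemarks = "";
  for (var i = 0; i < records.length; i++) {
    var r = records[i];
    placemarks += "<Placemark><name>" + r.id + "</name>" +
                  "<description>" + r.address + "</description>" +
                  // KML expects longitude,latitude order
                  "<Point><coordinates>" + r.lng + "," + r.lat + ",0</coordinates></Point>" +
                  "</Placemark>";
  }
  return '<?xml version="1.0" encoding="UTF-8"?>' +
         '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>' +
         placemarks + "</Document></kml>";
}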



Useful tips:
  • Include a “country” field in the input data to improve geocoding accuracy if you are getting too many results from incorrect parts of the globe.
  • You can preview saved locations and make final positional adjustments by selecting any of the options from the “Show saved records by accuracy:” pull-down menu in the “Edit” tab. Please note: all markers displayed on the map can be moved; however, any changes in latitude and longitude coordinates will be saved automatically and cannot be undone.
  • The composition of address details will differ depending on the geocoding accuracy level. For ease of further processing, avoid mixing various accuracy levels in the final output file if you intend to split address details into components.
  • Geocoded address information is written into the CSV file as a single text field, but it can be split further using the spreadsheet’s “Data/Text to Columns” function if you require individual address components as separate fields.

The Address Validation Tool is a replacement for my earlier application – Bulk Geocoder – which was also built with the Google geocoding engine. Since Google’s terms of use changed earlier this year, it is now prohibited to run fully automated batch geocoding using the free Google service. To comply with those restrictions, this new tool geocodes only one point at a time. And if I interpret the wording correctly, the geocoded information itself can only be used with Google applications.

I have submitted this application as my second entry in the MashupAustralia contest (the first one was Postcode Finder). I hope that it will be a handy resource to help improve the spatial accuracy of data released for this competition and beyond. Any comments, feedback and suggestions are greatly appreciated!

Wednesday, October 21, 2009

Butter vs. Margarine

I came across an interesting study the other day, courtesy of Dr. John Briffa's blog. It's titled "Margarine Intake and Subsequent Coronary Heart Disease in Men", by Dr. William P. Castelli's group. It followed participants of the Framingham Heart study for 20 years, and recorded heart attack incidence*. Keep in mind that 20 years is an unusually long follow-up period.

The really cool thing about this study is that they also tracked butter consumption. Here's a graph of the overall results, by teaspoons of butter or margarine eaten per day:

Heart attack incidence increased with increasing margarine consumption (statistically significant) and decreased slightly with increasing butter consumption (not statistically significant). 

It gets more interesting. Let's have a look at some of the participant characteristics, broken down by margarine consumption:

People who ate the least margarine had the highest prevalence of glucose intolerance (pre-diabetes), smoked the most cigarettes, drank the most alcohol, and ate the most saturated fat and butter. These were the people who cared the least about their health. Yet they had the fewest heart attacks. The investigators corrected for the factors listed above in their assessment of margarine's contribution to disease risk; however, the fact remains that the group eating the least margarine was the least health conscious. This affects disease risk in many ways, measurable or not. I've written about that before, here and here.

The investigators broke down the data into two halves: the first ten years, and the second ten. In the first ten years, there was no significant association between margarine intake and heart attack incidence. In the second ten, the group eating the most margarine had 77% more heart attacks than the group eating none:

So it appears that margarine takes a while to work its magic.

They didn't publish a breakdown of heart attack incidence with butter consumption over the two periods. The Framingham study fits in perfectly with most other observational studies showing that full-fat dairy intake is not associated with heart attack and stroke risk. 


It's worth mentioning that this study was conducted from the late 1960s until the late 1980s. Artificial trans fat labeling laws were still decades away in the U.S., and margarine contained more trans fat than it does today. Currently, margarine can contain up to 0.5 grams of trans fat per serving and still be labeled "0 g trans fat" in the U.S. The high trans fat content of the older margarines probably had something to do with the result of this study.

That does not make today's margarine healthy, however. Margarine remains an industrially processed pseudo-food. I'm just waiting for the next study showing that some ingredient in the new margarines (plant sterols? dihydro vitamin K1?) is the new trans fat.

Butter, Margarine and Heart Disease
The Coronary Heart Disease Epidemic


* More precisely, "coronary heart disease events", which includes infarction, sudden cardiac death, angina, and coronary insufficiency.

Sunday, October 18, 2009

A Little Hiatus

I'm going to a conference next week, followed by a little vacation. I've written two posts that will publish automatically while I'm gone. I may or may not respond to comments for the next two weeks. I probably won't respond to e-mails. I'll resume the malocclusion series when I get back.

Friday, October 16, 2009

MashupAustralia contest update

The MashupAustralia contest I mentioned in my earlier post has been running for a week and a bit now. There are only five entries so far (in descending order, from the newest to the oldest):

Your Victoria Online: a Google Maps based application to locate the nearest Emergency Services, Government Services, Schools, Public Internet access etc.


Victorian Schools Locator: a Google Maps based application created with the mapspread.com map maker, showing the locations of over 2,000 schools in Victoria.



Broadband Locator: the site uses Google Maps and Street View to display address information – visitors can enter their address and the application will show what broadband services are available in their area.

Geocoded List of Medicare Office Locations: a geocoded ATOM feed of Medicare offices.

Postcode Finder: my first entry into the contest – with postcode and suburb boundaries and Victorian police stations as points of interest (POI). I am planning to add more POI eventually. Unfortunately, the data supplied for the contest is all over the place and cannot just be “plugged in” without major rework (that is, to show consistent information with reasonable spatial accuracy).



Judging by the number of visitors coming to my page from the mashupaustralia.org site and the number of entries to date, the contest is not as widely embraced as many may have hoped, but these are still early days. Hopefully, my blog can bring a bit of extra publicity for this contest. It is, after all, a very worthy cause.

The closing time for lodging entries has been extended to 4pm Friday, 13th November 2009, so there is plenty of time for building complex applications. There will also be a number of mashup events over the next few weeks which should bring plenty of exciting new entries.


I can already claim one “consolation prize” in this contest – being the first entrant into the competition! It does not come with any formal accolades nor a cheque, but that will do me just fine. I am not really in contention for any prizes. Just wait till you see what is cooking in garages around the country and what the master chefs – the cream of the Australian GIS industry – will soon start to serve!

Thursday, October 15, 2009

Mark Scott ABC MD on Future of Media

As a postscript to my earlier post on The Great Media Evolution, here are a few reflections on key points raised by Mark Scott, Managing Director of the Australian Broadcasting Corporation, in his recent presentation. I draw my conclusions from secondary sources – namely an online article published by the ABC – “summarised and interpreted” material, but still quite an insightful read.

Mr Scott put forward a view that “…the power is now in the hands of audiences, and only those who realise the rules have changed will survive”. This supports my earlier comment that “… the convergence of media and resulting greater choice of access platforms also mean greater competition between different media channels…. News is becoming a commodity. For example, Internet gives … the ability to watch English version of French news, listen to the American radio for the latest updates, get news headlines from Google and if the event is important enough, it will be faster on Twitter than in any other media channel.”

Further, he adds (quoting after the ABC): “…We have to come to terms with the undeniable fact that for the scoop on many news events, we cannot hope to compete with the audience." And then: “…We need to team up with them, they have the time, the opportunity and particularly now with that powerful instant publishing double act, Twitter and Twitpic, they have the numbers." But the most interesting is Mr Scott’s statement that “… the future lies not in owning everything but in being a part of something”. He compares “… public broadcasters [to] the town square in which the community meets to discuss its affairs.” These views certainly coincide with my earlier observations and conclusions.

I don’t have much insight into how the media business works, but the latest developments are quite intriguing – partly because, by committing to a presence on the web with this blog and my site, I found myself inadvertently in the online publishing game. Therefore, I am trying to understand what motivates people to visit “this and not the other” site, and how to provide content without plagiarising or infringing other people’s intellectual property. As a side note, I cannot quite understand the gossipy side of the media business. Gossip, especially of a personal nature, seems to have such “great value” for many people (for example, photos of celebrities are worth a fortune and the latest details of their private lives attract perving crowds willing to pay for the information). Where do we draw the line between “privileged information” that you can claim exclusive ownership over and what is in the public domain? Or with the news, where does the story begin… with the event itself, or when someone writes or tells about it? Can you really have “exclusive” news, or does it only count who is first to market with the information? What counts most in the media: merely describing facts and events, or interpretation and insightful commentary, cross-referenced with other news items, to bring a wider perspective and insight? I do not have answers to these questions right now, but I sense they could be helpful in understanding how to prosper in the online publishing game… So, on with the goss and commentary – to cover all angles :-).

Mr Scott’s presentation also contained a few salvos directed at Mr Murdoch, and especially his attempts to introduce fees for content generated by News Limited. According to the ABC journalist, “…Mr Scott compared the News Limited boss to a ‘frantic emperor’ who is trying to control the media as he always has, unaware that his power is long gone.” Pretty strong words, which highlight that Mr Scott and Mr Murdoch are at opposing ends in terms of strategies to win and maintain media market dominance.

I concluded my earlier post with the thought that “…The dominant ‘new age’ media player may not be the News or Fairfax but just an aggregator of content or an organisation like Australian Broadcasting Corporation that covers many channels (subsidised by taxpayers and free to the public).” With a visionary leader like Mr Scott at the helm of the organisation, this may prove to be the case after all… Time will tell whether there is room for both business models to coexist in the market without major structural shake-ups, or whether only one will come out on top to the demise of the opposition.

Acknowledgement: the humorous images used in this post are collages of anonymous images found on the Internet.

Wednesday, October 14, 2009

Malocclusion: Disease of Civilization, Part IV

There are three periods during the development of the face and jaws that are uniquely sensitive to environmental influences such as nutrition and muscle activity patterns.

1: Prenatal Period

The major structures of the human face and jaws develop during the first trimester of pregnancy. The maxilla (upper jaw) takes form between the 7th and 10th week after conception. The mandible (lower jaw) begins two weeks earlier. The nasal septum, which is the piece of cartilage that forms the structure of the nose and divides the nostrils, appears at week seven and grows most rapidly from weeks 8 to 11. Any disturbance of this developmental window can have major consequences for later occlusion.

2: Early Postnatal Period

The largest postnatal increment in face and jaw growth occurs from birth until age 4. During this period, the deciduous (baby) teeth erupt, and the activity patterns of the jaw and tongue influence the size and shape of the maxilla and the mandible as they grow. The relationship of the jaws to one another is mostly determined during this period, although it can still change later in development.

During this period, the dental arch widens from its center, called the midpalatal suture. This ensures that the jaws are the correct size and shape to eventually accept the permanent teeth without crowding them.

3: Adolescence

The third major developmental period occurs between ages 11 and 16, depending on the gender and individual, and happens roughly at the same time as the growth spurt in height. The dental arch continues to widen, reaching its final size and shape. Under ideal circumstances, at the end of this period the arch should be large enough to accommodate all teeth, including the third molars (wisdom teeth), without crowding. Narrow dental arches cause malocclusion and third molar crowding.

Growth of the Dental Arch Over Time

The following graph shows the widening of the dental arch over time*. The dotted line represents arch growth while the solid line represents growth in body height. You can see that arch development slows down after 6 years old, resumes around 11, and finally ends at about 18 years. This graph represents the average of many children, so not all children will see these changes at the age indicated. The numbers are in millimeters per year, but keep in mind that the difference between a narrow arch and a broad one is only a few millimeters.

In the next few posts, I'll describe the factors that I believe influence jaw and face structure during the three critical periods of development.


* These data represent many years of measurements collected by Dr. Arne Bjork, who used metallic implants in the maxilla to make precise measurements of arch growth over time in Danish youths. The graph is reproduced from the book A Synopsis of Craniofacial Growth, by Dr. Don M. Ranly. Data come from Dr. Bjork's findings published in the book Postnatal Growth and Development of the Maxillary Complex. You can see some of Dr. Bjork's data in the paper "Sutural Growth of the Upper Face Studied by the Implant Method" (free full text).

Mapping Stimulus Projects in Oz

Last month, in my post on Google tools for the public sector, I provided a few examples of how Australian government departments and organisations are using Google Maps to present various information. Today, another interesting example: a map showing where, and on what projects, the billions of dollars committed by the government in the economic stimulus package are being spent. Information is available for six different expenditure categories: education, community infrastructure, road and rail, housing, insulation and solar. Zoom to your local area to find out what is actually happening in your neighbourhood with the allocated money.

Some of the information available on the Nation Building - Economic Stimulus Plan site has also been released under a Creative Commons - Attribution 2.5 Australia (CC-BY) licence and can be freely used for various mashups and analyses. In particular, you can access information on all current community infrastructure and road and rail projects across Australia. And if you have a great idea on how to use this data, you can enter the MashupAustralia contest for great prizes. It is run by the Government 2.0 Taskforce for a limited time.

Tuesday, October 13, 2009

Ed Parsons on Spatial Data Infrastructure

I recently attended the Surveying & Spatial Sciences Institute Biennial International Conference in Adelaide and was privileged to see Ed Parsons’ presentation. For those who don’t know Ed, his bio describes him as “… the Geospatial Technologist of Google, with responsibility for evangelising Google's mission to organise the world's information using geography, and tools including Google Earth, Google Maps and Google Maps for Mobile.” He delivered a very enlightening and truly evangelistic presentation outlining his views on the best approach to building Spatial Data Infrastructures. The following paragraphs summarise the key, thought-provoking points from the presentation – with some comments from my perspective.

The essence of Ed’s position is that the currently favoured approach of building highly structured, complex-to-the-nth-degree “digital libraries” to manage spatial information is very inefficient and simply does not work. There is a much better framework to use – the web – which is readily available and can deliver exactly what the community needs, in a gradual and evolutionary fashion rather than as a pre-designed and rigid solution.

I could quote many examples of failed or less-than-optimal SDI implementations in support of Ed’s views. There is no doubt that there are many problems with the current approach. New initiatives are continuously launched to overcome the limitations of previous attempts to catalogue collections of spatial information, and it is more than likely that none of the implementations is compatible with the others. The problem is that metadata standards are too complex and inflexible, and data cataloguing software is not intelligent enough to work with less-than-perfectly categorised information. I recently had first-hand experience with this. I tried to use approved metadata standards for my map catalogue project, hoping it would make the task easier and the application fully interoperable, but in the end I reverted to adding my own “interpretations and extensions” (proving, at least to myself, that a “one-size-fits-all” approach is almost impossible). I will not even mention the software issues…

Ed argued that most SDI initiatives are public-sector driven and, since solution providers are primarily interested in “selling the product”, by default it all centres on the data management aspect of the projects. In other words, the focus is on producers rather than users, on Service Oriented Architecture (SOA) rather than on “discoverability” of relevant information. All in all, current SDI solutions are built on the classic concept of a library, where information about the data (metadata) is separated from the actual data. Exactly as in a local library, where you have an electronic or card-based catalogue with book titles and respective index numbers, and rows of shelves with books organised according to those catalogue index numbers. For small, static collections of spatial data this approach may work, but not in the truly digital age, where new datasets are produced in terabytes, with a myriad of versions (eg. temporal datasets), formats and derivations. And this is why most SDI initiatives do not deliver what is expected of them at the start of the project.

Ed made the point that it is much better to follow an evolutionary approach (similar to how the web developed over time) rather than the strict, “documentation driven” process used in most current SDI projects. The simple reason is that you don’t have to understand everything up-front to build your SDI. The capabilities may evolve as needs expand, and you can adjust your “definitions” as you discover more and more about the data you deal with – in an evolutionary rather than prescriptive way. It is a very valid argument, since it is very, very hard to categorise data according to strict rules, especially if you cannot predict how the data will evolve over time.

[source: Ed Parsons, Google Geospatial Technologist]

The above table contrasts the two approaches. On one side you have traditional SDIs, with strict OGC/ISO metadata standards and web portals with search functionality – all built on Service Oriented Architecture (SOA) principles and with SOAP (Simple Object Access Protocol) services as the main conduit of information. Actually, the whole set-up is much more complex because, in order to work properly, it requires a formalised “discovery” module – a registry that follows the Universal Description, Discovery and Integration (UDDI) protocol – and a “common language” for describing available services (that is, Web Service Description Language, or WSDL for short). And IF you can access the data (a big “if”, because most public-access SDI projects do not go that far), it will most likely be in the “heavy duty” Geography Markup Language (GML) format (conceived over a decade ago but still mostly misunderstood by software vendors as well as potential users). No wonder that building an SDI on such complex principles poses a major challenge. And even in this day and age, the performance of such an SDI may not be up to scratch, as it involves very inefficient processes (“live” multidimensional queries, multiple round trips of packets of data, etc.).

On the other side you have the best of the web, developed in an evolutionary fashion over the last 15 years: unstructured text search capabilities delivered by Google and other search engines (dynamically indexed and heavily optimised for performance), simple yet efficient RESTful services (according to Ed Parsons, not many are choosing to use SOAP these days) and simpler, lighter data delivery formats like KML, GeoRSS or GeoJSON (which have a major advantage – the content can be indexed by search engines, making the datasets discoverable!). As this is a much simpler set-up, it is gaining widespread popularity amongst “lesser geeks”. The US government portal data.gov is the best example of where this approach is proving its worth.
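To illustrate how low the barrier to entry is on the “web” side of that comparison, here is a rough sketch of consuming a RESTful GeoJSON feed in plain JavaScript – one HTTP GET and a standard JSON parse. The URL and property names are made up for this example:

// Illustrative sketch only - the URL and property names are invented.
function loadGeoJson(url, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return;
    if (xhr.status === 200) {
      var collection = JSON.parse(xhr.responseText); // a GeoJSON FeatureCollection
      callback(collection.features);
    }
  };
  xhr.send(null);
}

// Usage: list point features from a (hypothetical) hazards feed
loadGeoJson("http://example.com/hazards.geojson", function (features) {
  for (var i = 0; i < features.length; i++) {
    var f = features[i];
    if (f.geometry.type === "Point") {
      // GeoJSON coordinates are [longitude, latitude]
      console.log(f.properties.title + ": " + f.geometry.coordinates.join(", "));
    }
  }
});

Compare that with the stack of UDDI, WSDL, SOAP and GML needed to do the equivalent “by the book” and the appeal of the lightweight approach becomes obvious.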

The key lesson is: if you want to get it working, keep it simple, and do not separate metadata from your data, so that the information remains easy to discover. And let the community of interest define what is important, rather than prescribing a rigid solution upfront. The bottom line is that Google’s strength is in making sense of the chaos that is cyberspace, so it should be no surprise that Ed is advocating a similar approach to dealing with the chaos of spatial data. But can the solution really be so simple?

The key issue is that most of us, especially scientists, would like a definite answer when we search for the right information. That is: “There are 3 datasets matching your search criteria”, rather than: “There are 30,352 datasets found, the first 100 closest matches are listed below…” (ie. the “Google way”). There is always that uncertainty: “Is there something better or more appropriate out there, or should I accept what Google is serving as the top search result? What if I choose an incomplete or outdated version of the dataset?”… So the need for a highly structured approach to classifying and managing spatial information is understandable, but it comes at a heavy cost (both time and money) and, in the end, it can serve only the needs of a small and well-defined group of users. “The web” approach can certainly bring quick results and open up otherwise inaccessible stores of spatial information to the masses, but I doubt it can easily address the issue of “the most authoritative source” that is so important with spatial information. In the end, the optimal solution will probably be a hybrid of the two approaches, but one thing is certain: we will arrive at that optimal solution by evolution and not by design!

Saturday, October 10, 2009

Malocclusion: Disease of Civilization, Part III

Normal Human Occlusion

In 1967, a team of geneticists and anthropologists published an extensive study of a population of Brazilian hunter-gatherers called the Xavante (1). They made a large number of physical measurements, including of the skull and jaws. Of 146 Xavante examined, 95% had "ideal" occlusion, while the 5% with malocclusion had nothing more than mild crowding of the incisors (front teeth). The authors wrote:
Characteristically, the Xavante adults exhibited broad dental arches, almost perfectly aligned teeth, end-to-end bite, and extensive dental attrition [tooth wear].
In the same paper, the author presents occlusion statistics for three other cultures. According to the papers he cites, in Japan, the prevalence of malocclusion was 59%, and in the US (Utah), it was 64%. He also mentions another native group living near the Xavante, part of the Bakairi tribe, living at a government post and presumably eating processed food. The prevalence of malocclusion was 45% in this group.

In 1998, Dr. Brian Palmer (DDS) published a paper describing some of the collections of historical skulls he had examined over the years (2):
...I reviewed an additional twenty prehistoric skulls, some dated at 70,000 years old and stored in the Anthropology Department at the University of Kansas. Those skulls also exhibited positive [good] occlusions, minimal decay, broad hard palates, and "U-shaped" arches.

The final evaluations were of 370 skulls preserved at the Smithsonian Institution in Washington, D.C. The skulls were those of prehistoric North American plains Indians and more contemporary American skulls dating from the 1920s to 1940s. The prehistoric skulls exhibited the same features as mentioned above, whereas a significant destruction and collapse of the oral cavity were evident in the collection of the more recent skulls. Many of these more recent skulls revealed severe periodontal disease, malocclusions, missing teeth, and some dentures. This was not the case in the skulls from the prehistoric periods...
The arch is the part of the upper jaw inside the "U" formed by the teeth. Narrow dental arches are a characteristic feature of malocclusion-prone societies. The importance of arch development is something that I'll be coming back to repeatedly. Dr. Palmer's paper includes the following example of prehistoric (L) and modern (R) arches:


Dr. Palmer used an extreme example of a modern arch to illustrate his point; however, arches of this width are not uncommon today. Milder forms of this narrowing affect the majority of the population in industrial nations.

In 1962, Dr. D.H. Goose published a study of 403 British skulls from four historical periods: Romano-British, Saxon, medieval and modern (3). He found that the arches of modern skulls were less broad than at any previous time in history. This followed an earlier study showing that modern British skulls had more frequent malocclusion than historical skulls (4). Goose stated that:
Although irregularities of the teeth can occur in earlier populations, for example in the Saxon skulls studied by Smyth (1934), the narrowing of the palate seems to have occurred in too short a period to be an evolutionary change. Hooton (1946) thinks it is a speeding up of an already long standing change under conditions of city life.
Dr. Robert Corruccini published several papers documenting narrowed arches in one generation of dietary change, or in genetically similar populations living rural or urban lifestyles (reviewed in reference #5). One was a study of Caucasians in Kentucky, in which a change from a traditional subsistence diet to modern industrial food habits accompanied a marked narrowing of arches and increase in malocclusion in one generation. Another study examined older and younger generations of Pima Native Americans, which again showed a reduction in arch width in one generation. A third compared rural and urban Indians living in the vicinity of Chandigarh, showing marked differences in arch breadth and the prevalence of malocclusion between the two genetically similar populations. Corruccini states:
In Chandigarh, processed food predominates, while in the country coarse millet and locally grown vegetables are staples. Raw sugar cane is widely chewed for enjoyment rurally [interestingly, the rural group had the lowest incidence of tooth decay], and in the country dental care is lacking, being replaced by chewing on acacia boughs which clean the teeth and are considered medicinal.
Dr. Weston Price came to the same conclusion examining prehistoric skulls from South America, Australia and New Zealand, as well as their living counterparts throughout the world that had adhered to traditional cultures and foodways. From Nutrition and Physical Degeneration:
In a study of several hundred skulls taken from the burial mounds of southern Florida, the incidence of tooth decay was so low as to constitute an immunity of apparently one hundred per cent, since in several hundred skulls not a single tooth was found to have been attacked by tooth decay. Dental arch deformity and the typical change in facial form due to an inadequate nutrition were also completely absent, all dental arches having a form and interdental relationship [occlusion] such as to bring them into the classification of normal.
Price found that the modern descendants of this culture, eating processed food, suffered from malocclusion and narrow arches, while another group from the same culture living traditionally did not. Here's one of Dr. Price's images from Nutrition and Physical Degeneration (p. 212). This skull is from a prehistoric New Zealand Maori hunter-gatherer:


Note the well-formed third molars (wisdom teeth) in both of the prehistoric skulls I've posted. These people had ample room for them in their broad arches. Third molar crowding is a mild form of modern face/jaw deformity, and affects the majority of modern populations. It's the reason people have their wisdom teeth removed. Urban Nigerians in Lagos have 10 times more third molar crowding than rural Nigerians in the same state (10.7% of molars vs. 1.1%, reference #6).

Straight teeth and good occlusion are the human evolutionary norm. They're also accompanied by a wide dental arch and ample room for third molars in many traditionally-living cultures. The combination of narrow arches, malocclusion, third molar crowding, small or absent sinuses, and a characteristic underdevelopment of the middle third of the face, are part of a developmental syndrome that predominantly afflicts industrially living cultures.


(1) Am. J. Hum. Genet. 19(4):543. 1967. (free full text)
(2) J. Hum. Lact. 14(2):93. 1998
(3) Arch. Oral Biol. 7:343. 1962
(4) Brash, J.C.: The Aetiology of Irregularity and Malocclusion of the Teeth. Dental Board of the United Kingdom, London, 1929.
(5) Am J. Orthod. 86(5):419
(6) Odonto-Stomatologie Tropicale. 90:25. (free full text)

Saturday, October 3, 2009

Malocclusion: Disease of Civilization, Part II

The Nature of the Problem

In 1973, the US Centers for Disease Control and Prevention (CDC) published the results of a National Health Survey in which it examined the dental health of American youths nationwide. The following description was published in a special issue of the journal Pediatric Dentistry (1):
The 1973 National Health Survey reported 75% of children, ages 6 to 11 years, and 89% of youths, ages 12 to 17 years, have some degree of occlusal disharmony [malocclusion]; 8.7% of children and 13% of youth had what was considered a severe handicapping malocclusion for which treatment was highly desirable and 5.5% of children and 16% of youth had a severe handicapping malocclusion that required mandatory treatment.
89% of youths had some degree of malocclusion, and 29% had a severe handicapping malocclusion for which treatment was either highly desirable or mandatory. Fortunately, many of these received orthodontics so the malocclusion didn't persist into adulthood.

This is consistent with another survey conducted in 1977, in which 38% of American youths showed definite or severe malocclusion. 46% had occlusion that the authors deemed "ideal or acceptable" (2).

The trend continues. The CDC National Health and Nutrition Examination Survey III (NHANES III) found in 1988-1991 that approximately three fourths of Americans age 12 to 50 years had some degree of malocclusion (3).

The same holds true for Caucasian-Americans, African-Americans and Native Americans in the US, as well as other industrial nations around the world. Typically, only 1/3 to 1/2 of the population shows good (but not necessarily perfect) occlusion (4-8).

In the next post, I'll review some of the data from non-industrial and transitioning populations.


Malocclusion: Disease of Civilization


1. Pediatr. Dent. 17(6):1-6. 1995-1996
2. USPHS Vital and Health Statistics Ser. 11, no 162. 1977
3. J. Dent. Res. Special issue. 75:706. 1996. Pubmed link.
4. The Evaluation of Canadian Dental Health. 1959. Describes Canadian occlusion.
5. The Effects of Inbreeding on Japanese Children. 1965. Contains data on Japanese occlusion.
6. J. Dent. Res. 35:115. 1956. Contains data on both industrial and non-industrial cultures (Pukapuka, Fiji, New Guinea, U.S.A. and New Zealand).
7. J. Dent. Res. 44:947. 1965 (free full text). Contains data on Caucasian-Americans and African-Americans living in several U.S. regions, as well as data from two regions of Germany. Only includes data on Angle classifications, not other types of malocclusion such as crossbite and open bite (i.e., the data underestimate the total prevalence of malocclusion).
8. J. Dent. Res. 47:302. 1968 (free full text). Contains data on Chippewa Native Americans in the U.S., whose occlusion was particularly bad, especially when compared to previous generations.

Friday, October 2, 2009

Mashup Australia Contest

A few days ago the Australian Government 2.0 Taskforce announced an open invitation to any "able and willing body" to create mashups with nominated datasets from various Federal and State jurisdictions in Australia. It is a contest, so there will be prizes for winning entries:
* $10,000 for Excellence in Mashing category
* $5,000 for Highly Commendable Mashups
* $2,500 for Notable Mashing Achievements
* $2,000 for the People’s Choice Mashup prize
* $2,000 for the Best Student entry
* $1,000 bonuses for the Transformation Challenge

Anyone in the world is eligible to enter, but prizes will only be awarded to individuals from Australia or teams where at least one member has Australian credentials. The contest is open from 10am October 7 to 4pm November 6, 2009 (Australian Eastern Standard Time - GMT+11.00).

I will be entering at least two of my applications that have been running on the aus-emaps.com site for the last couple of years and are already used by quite a few people. These are: the bushfire incidents map (part of my larger Natural Hazards Monitor concept) and the Postcode Finder, with links to Australian demographic information from the Australian Bureau of Statistics. If you have an application that is suitable for presenting the information nominated in the contest rules and need an Australian representative on the team, or would like to access some of the data from my site, I invite you to partner with me in this competition.

Region shaken by tragic events


Several terrible tragedies struck our region in the last few days. I was travelling for most of the week, so only now am I able to put together a few words regarding the events. There were deadly tsunamis on the islands of Samoa, Tonga and American Samoa – caused by an earthquake in the early morning of 30 September – and later that day two devastating earthquakes struck Indonesia, killing thousands. It is very sad news… Despite all the scientific advancements and sophisticated technology to monitor and predict such events, as well as significant infrastructure investment in warning systems, humanity is still very vulnerable to natural disasters.


My thoughts go out to the families of those killed, and to all injured and affected by these events.