I taught a course in Data Mining at KCGI the first half of November 2007. I arrived back home in time to celebrate Thanksgiving Day with family and friends.
Thanksgiving Day is a traditional North American holiday to give thanks for the things that one has at the conclusion of the harvest season. Thanksgiving is celebrated on the fourth Thursday of November in the United States and on the second Monday of October in Canada. [from Wikipedia]
This is a holiday of many facets, but for all of us it is a time to "give thanks", meaning: to think of the things that make our lives good and to reflect on how these things came to be.
Of course, we are all thankful for family and friends and health. I am particularly thankful to the many inventors, including engineers and businessmen, who have developed what we know as the Information Age. The tremendous growth in computing power has changed our lives for the better so rapidly that we hardly see it happening.
We can summarize this whole phenomenon as "Moore's Law." Gordon Moore, a co-founder of Intel, noticed that the number of transistors on integrated circuit chips approximately doubles every two years. All the great growth of computing power seems to follow similar exponential growth rates: computing speed, memory size, disk sizes and speeds, and data communication speed, as well as the frequent introduction of wonderful novel components.
We can view this growth from another perspective: the continually decreasing costs of computing. The English slang for this is "more bang for the buck." (Bang is the value, and a buck is a dollar.)
A few years ago, it was unthinkable to try to hold a high-resolution image in a computer memory. Now movies are created inside computers.
As part of my Thanksgiving celebration, I located a catalog for computer enthusiasts printed in 1987, 20 years ago. One computer offered for $3,500 had 512 kilobytes of memory, a 15 megabyte hard disk, and a 20 MHz processor.
Moore's Law predicts that the transistor density doubles ten times in those twenty years, multiplying it by approximately 1,000 (2^10 = 1,024). If we naively multiply the specifications of the 1987 computer by 1,000, we get 0.5 GB of memory (a little small), a 15 GB hard disk (very small), and a processing speed of 20 GHz. We don't really have this computing speed in today's personal computers, but for the $3,500 we can now buy ten PCs that run at 2 GHz each (and if we take inflation into account, we should get about 18 computers).
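The doubling arithmetic above can be sketched in a few lines of Python. This is just an illustration of the calculation, not anything official; the function name `moores_law_factor` is my own invention.

```python
def moores_law_factor(years, doubling_period=2):
    """Cumulative growth factor after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

factor = moores_law_factor(20)      # ten doublings over twenty years
print(factor)                       # 1024.0, roughly 1,000x

# Scaling the 1987 catalog specifications by that factor:
disk_gb = 15 * factor / 1024        # 15 MB hard disk -> about 15 GB
clock_ghz = 20 * factor / 1000      # 20 MHz processor -> about 20 GHz
print(disk_gb, clock_ghz)           # 15.0 20.48
```

The "approximately 1,000" in the text is exactly this 2^10 = 1,024 factor rounded down.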
The course I taught in November, Data Mining, would have been unthinkable 20 years ago. We "mine" the data because now we have so much of it.
It is also fun to consider and thank many of the other actors besides the inventors and entrepreneurs. Because of the tremendous demand for computing power, businesses compete madly to meet it, offering ever more powerful and less expensive devices. One of the biggest demands comes from people simply using information technology to have fun: they play games, watch movies, trade photos, and so on. The result of all this interplay among inventors, consumers, and businesses is the superb, useful, and interesting technology that has been delivered to us.
I have lots of fun with computers: I learn and teach interesting algorithms, programming languages, and data mining. Moore's Law translates immediately into a world where we all have fun. And for this I am deeply thankful.
To KCG and KCGI: I am especially thankful for the opportunity to visit you and teach the students there.