Pondering the infinite is an activity usually relegated to undergraduate philosophy students, particularly in their sophomore year. Physicists often spend their time reducing physical phenomena that are, for all practical purposes, infinite, like the size of the universe, to comprehensible descriptions. Mathematicians are perhaps the most facile at dealing with and manipulating concepts of infinity. For a mathematician, it is a simple matter to specify a mathematical surface that is infinite in area but encloses a finite volume. In other words, mathematicians can conceive of a shape that one could fill with paint but could never paint. It is only recently that people in the computer sciences have considered quantities and qualities which, while formally finite, may prove to be practically infinite.
In 1965, Gordon Moore, one of the founders of Intel, extrapolated from the fact that the number of transistors on an integrated circuit had grown from one in 1959, to 32 in 1964, to 64 in 1965, and concluded that transistor density was doubling every 18 to 24 months. This is the narrow statement of Moore’s Law. The more general statement of Moore’s Law is that computing power doubles every 18 to 24 months.
The latter formulation of Moore’s Law has been given more depth by the MIT-educated computer scientist, entrepreneur, and writer Ray Kurzweil. He has traced the growth of computing power from the electromechanical punch-card tabulators used in the 1890 census to Pentium 4 processors with 42 million transistors. Kurzweil foresees increases in computing power accelerating past the physical limits of silicon-based devices as manufacturers employ more exotic bio-chemical technologies.
Much thought has been given to whether Moore’s Law can really persist beyond the limits posed by silicon-based technology and the ever-increasing capital costs required to construct chip-manufacturing plants. Additional consideration has been given to what this increased computing power can be used for. Kurzweil is not shy about predicting a future with machines that are more intelligent than humans and computer implants interfaced with human minds. While increases in computer capacity have proven more persistent than anyone has a right to expect, predictions about the future abilities of artificial intelligence have a notorious record of overoptimism.
What has not received much thought is the rapid increase in data storage. Writing in American Scientist, Brian Hayes explains how recent changes in technology are actually increasing the rate of growth of disk storage. A large disk on a personal computer today holds about 120 GBytes. Technologies now in the laboratory achieve storage densities equivalent to disks holding 400 GBytes.
At the present rate of increase, personal computer disks will reach 120 TBytes (120,000 GBytes) in ten years. Even if the growth rate decreases by 60 percent, the 120-TByte level will be reached in 15 years. What are we to do with this storage capacity? Is natural American acquisitiveness sufficiently great to make use of this space?
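As a rough sanity check on those figures, here is a minimal back-of-envelope sketch in Python. It is not a calculation from Hayes’s article; the only inputs are today’s 120-GByte disk and the 120-TByte target, and the 12- and 18-month doubling periods are assumptions of mine chosen to illustrate the ten- and fifteen-year horizons.

```python
import math

current_gb = 120        # today's large PC disk
target_gb = 120_000     # 120 TBytes

# Growing by a factor of 1,000 takes about ten doublings (2^10 = 1,024).
doublings = math.log2(target_gb / current_gb)

# Assumed doubling periods (mine, not Hayes's): roughly 12 months at the
# current pace, stretching toward 18 months if growth slows.
for months_per_doubling in (12, 18):
    years = doublings * months_per_doubling / 12
    print(f"{months_per_doubling} months per doubling: about {years:.0f} years")
    # 12 months per doubling: about 10 years
    # 18 months per doubling: about 15 years
```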
Recently, MP3 digital music files have been filling disks, especially in college dorms. However, as Hayes points out, if you stored enough music to listen to different songs 24 hours a day for an 80-year lifetime, you would barely fill a third of a 120-TByte disk. Even this assumes that storage technology remains fixed over that 80-year lifetime.
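Hayes’s music figure is easy to check with simple arithmetic. Here is a minimal sketch; the rate of roughly 1 MByte per minute of MP3 audio (about 128 kbit/s) is my assumption, not a figure quoted from the article.

```python
MB_PER_MINUTE = 1.0                    # assumed MP3 rate, ~128 kbit/s
minutes = 80 * 365.25 * 24 * 60        # continuous listening for 80 years

music_tbytes = minutes * MB_PER_MINUTE / 1_000_000   # MBytes -> TBytes
print(f"{music_tbytes:.0f} TBytes, {music_tbytes / 120:.0%} of a 120-TByte disk")
# roughly 42 TBytes -- about a third of the disk
```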
Digital photographs are another new source of data filling up disks. Assuming each photograph requires 1 MByte of storage, and assuming an itchy shutter finger produces 100 photographs a day, certainly a well-documented life, less than 3% of the 120 TBytes would be filled.
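The photograph estimate works out the same way, using the 1 MByte-per-image and 100-images-per-day figures above and assuming the same 80-year span as the music example:

```python
MB_PER_PHOTO = 1.0
PHOTOS_PER_DAY = 100
days = 80 * 365.25                     # assumed 80-year, well-documented life

photo_tbytes = days * PHOTOS_PER_DAY * MB_PER_PHOTO / 1_000_000
print(f"{photo_tbytes:.1f} TBytes, {photo_tbytes / 120:.1%} of a 120-TByte disk")
# roughly 2.9 TBytes -- under 3% of the disk
```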
Fundamentally, video is the only data source likely to fill 120-TByte disks. Even so, with growth beyond 120 TBytes over our lifetimes, we likely face the prospect of being able to store more data than we possess. It is roughly comparable to having an attic that is growing so fast that we cannot fill it fast enough.
It seems that if we are having problems filling up new disks over a lifetime, the only solution is to increase lifetimes.
- Fixmer, Rob, “Internet Insight, Moore’s Law and Order,” Eweek, April 15, 2002.
- Hayes, Brian, “Terabyte Territory,” American Scientist, 90, 212-216, May-June, 2002.
Letting Standards Fall
Sunday, May 5th, 2002

One bit of conventional wisdom holds that an academic degree from Harvard is a ticket to a life of affluence and ease. Regardless of the accuracy of that observation, it appears that admission to Harvard as an undergraduate is now virtually a guarantee of excellent grades. Students overwhelmingly accumulate A’s and B’s, and a remarkable 90% graduate with honors. The honors citation has increasingly become a way of singling out the few poor students who do not receive it rather than a means of focusing particular tribute upon outstanding students.
Patrick Healy of the Boston Globe interviewed Trevor Cox in his senior year at Harvard. Only in his last year was Cox finally challenged, by the work on his senior thesis. Cox explained, “I’ve coasted on far higher grades than I deserve… It’s scandalous. You can get very good grades, and earn honors, without ever producing quality work.”
A few professors at Harvard have attempted to maintain an island of integrity in the onrushing torrent of easy A’s. Professor Harvey C. (“C-minus”) Mansfield, who teaches Government 1061, was a notoriously hard grader in comparison to his colleagues. Actually, his grading policy had remained constant over time while policies loosened around him. Students were torn: if they took Mansfield’s course, they might be challenged, but only at the cost of hurting themselves in the class-rank competition with other students.
In an effort to strike a compromise, Mansfield now awards two grades. The official grade for the transcript is in keeping with the easy grading policies of his colleagues. The second grade tells students what they truly deserve. For the official record, only 27% of his students received a B or lower. The overwhelming majority were awarded a B-plus or above. For the second grade, only 15% of his students earned a grade higher than a B.
There are many reasons for grade inflation in academia, particularly at Ivy League schools. Part of it began during the Vietnam era when high grades helped students remain in school and retain an academic deferment from the draft. Interestingly, the largest jump in grades occurred when the average SAT scores dropped.
In 1969, Harvard made a bold effort to admit additional minority students. African-American enrollment in the freshman class doubled from 60 to 120. SAT scores for entering freshmen dropped, yet the fraction of grades of B or higher increased 10%. Not only were professors making allowances for a new set of less academically prepared students; out of fairness, they made it easier for other students as well.
The University of California system is preparing to embark on a similar reduction of standards in the service of attempts to change the demographics of enrollment. Richard Atkinson, the president of the university system, released a report arguing that the SAT tests should no longer be required for admission.
Atkinson frets that when he visited a private school he observed, “students studying verbal analogies in anticipation of the SAT.” Some observers might be heartened by diligent students improving their verbal skills. Atkinson, who is more astute in these matters than most of us, concluded, “America’s overemphasis on the SAT is compromising our educational system.” Obviously, we must free students from the oppressive yoke of verbal analogies.
How about this for a verbal analogy? Atkinson is to academic excellence what rust is to metal. Atkinson is a corrosive force that, if left unchecked, can undermine the academic integrity of the University of California system.
There really are two not-so-attractive reasons for eliminating SAT scores as a requirement. The first is to allow the University of California system to make admissions decisions based on skin color and ethnic background without the interference of academic preparedness as an inconvenient constraint. Ironically, the second reason some schools have eliminated the SAT as a requirement is to increase the apparent (though not actual) academic selectiveness of the university. When SATs are optional, the better students tend to be the only ones who submit them, so the average SAT score (among those reporting, of course) will increase. When various college ranking services rank colleges by the SAT scores of incoming freshmen, schools that eliminate the SAT as a requirement are at an advantage. Dickinson College reportedly saw a 60-point increase in the average SAT score of incoming freshmen when the test was made optional.
The truth is that Atkinson of the University of California and Harvard University need to recognize that no amount of fudging the results of the educational process at the tail end is sufficient. Indeed, such efforts are probably counterproductive. There is a real and urgent problem with educational opportunity for minority students. The sooner we grant such students the means to escape failed public school systems, the sooner the University of California system, Harvard University, and other schools can return to the celebration of academic excellence rather than avoiding its consequences.