The date was March 1998. The Internet was at a critical decision point, as the U.S. government considered what infrastructure should be privatized, how to share or cede responsibility to other nations, and how to transition to an e-commerce-based future over the following decades.
The U.S. Government helped create the Internet
I felt pride, as a U.S. citizen, yes, call it patriotism (it is NOT a sin, you know!) as I read this:
"The Internet would not exist if it were not for the U.S. Government. It helped to create the Internet, and has been an excellent steward for over 25 years. It funded the necessary research, made sure the community had responsibility for its operation, and insulated it from bureaucratic obstacles and commercial matters so that it could evolve dynamically... The U.S. Government has enabled an enormous industry to be created and to grow, such that a large part of our economic base can be attributed to the Internet... this situation is likely to occur in other countries around the world."
Apparently, the issue of domain names was quite a pressing concern. There was a Green Paper that proposed a specific approach, but that only provoked more suggestions, none of which could be agreed upon. The situation had become urgent:
"Certain steps could be taken to neutralize temporarily the legal risks and economic concerns associated with the current Internet structure and its transition to "whatever comes next". One such step would be to institute a temporary immunity time-zone [to work out these matters, and avoid taking] preemptive action to ward off law suits pertaining directly to the current Internet infrastructure management. Another would be to create an interim period of competition for the "registrar" functions while existing registries are operated on a cost-recovery basis."
Robert Kahn and Vinton Cerf
Robert Kahn and Vinton Cerf were central to the development of the Internet during the 1970s and 1980s.
"The key technical contribution which enabled a "network of networks" to become the Internet was an architecture of gateways (routers) placed between the networks, and a protocol, now known as TCP/IP, used by the computers and the routers. [Robert Kahn and Vinton Cerf, then at Stanford University, are widely credited with development of TCP/IP.] It was presented in September 1973 in Sussex, England and published by IEEE in May 1974."
Overall management of the Internet was handled by DARPA through Kahn and Cerf.
"Robert E. Kahn was President and CEO of the Corporation for National Research Initiatives (CNRI), a not-for-profit scientific research organization established in 1986, in Reston, Virginia. CNRI hosted the Internet Service Providers group known as iops.org..."
Cerf was with DARPA from 1976 to 1982. Vinton Cerf joined Google in 2005. Robert Kahn did not.
IP addresses versus domain names
The critical thing needed by the Internet Service Providers and all Internet applications programs is for IP addresses to work reliably. Domain names are a simple way of using names instead of numbers and, while they have become tightly associated with use of the Internet, they are not a fundamental requirement for operation of the Internet.
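The distinction is easy to see in a few lines of Python: an application resolves a name to an IP address once, as a convenience, and everything after that point is addressed purely by IP. A minimal sketch (using only the local loopback name, so no network access is assumed):

```python
import socket

# Resolve a domain name to an IP address. The name is only a convenience
# layer: once resolved, every packet is addressed by IP, and a connection
# would work identically if we supplied the literal address ourselves.
ip = socket.gethostbyname("localhost")
print(ip)  # typically 127.0.0.1
```

This is why Kahn could argue that IP address reliability, not domain name management, is the fundamental requirement.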
IANA handled policy for both domain names and IP addresses. Kahn recommended that domain name management be separated from IP addresses.
The American Registry for Internet Numbers (ARIN) had recently been set up. Kahn suggested that separate global registries be formed, each with responsibility for IP address reliability. IP addresses are crucial for Internet functionality, unlike domain names. Kahn wanted measures taken to ensure that IP addresses were:
"insulated as much as possible from bureaucratic, commercial and political wrangling"
Brief history of IANA in the early days
One of the decisions we made during that period was to delegate responsibility for maintaining information about key Internet parameters to Jon Postel, a researcher at the University of Southern California who had been carrying out similar functions for the ARPANET. While DARPA retained the ultimate authority for decisions about policy and procedures, Jon Postel assumed primary responsibility. During that period, there was no need to second guess his decisions.
This function, performed by Jon Postel under USC's contract with DARPA, eventually became known as the Internet Assigned Numbers Authority (IANA), and included policy for domain names as well as IP addresses and protocol parameters.
When the Domain Name System (DNS) was first proposed in the 1980s by Paul Mockapetris (also from USC), most sites could be characterized as
- educational (EDU),
- US government (GOV & MIL),
- network (NET),
- organization (ORG),
- commercial sites with research labs (COM), and
- special cases e.g. testing or global experiments (ARPA and INT).
With DARPA's permission, Jon delegated certain clerical and operational functions to SRI International, while retaining other functions. Among the former were maintaining a database which mapped Internet names to Internet addresses, and making this resource available on the Internet.
Initially, the number of domain names was so small that it was trivial to download the entire database from SRI on a daily basis.
SRI should have been very grateful to Jon Postel. He gave them a wonderful opportunity.
The end of the ARPANET
"The ARPANET was phased out in 1990 and replaced by a higher-speed backbone built by IBM, MCI and Merit under a National Science Foundation (NSF) award. With help from DARPA, NSF took over responsibility for maintaining most of the Internet management infrastructure from DoD, and recompeted the contract that the DoD had with SRI International."
SRI should have done whatever was necessary, spared no expense, nor balked at any requirements, in order to keep that contract! Instead, Network Solutions, Inc. (NSI) won the competition for providing the domain name registration services. There were a few exceptions, most notably, country codes.
"...two letter country codes were domain names that could be managed by individual countries according to policies developed by the countries themselves... IANA made the determination of who in a given country would be responsible for that countries domain, but gave deference to the legitimate government of the country if it chose to weigh in."
Robert Kahn goes to Washington
I knew that I was going to like Robert Kahn's testimony, after reading this:
"The US is still in a position to insure that a stable management structure for the Internet is put in place without the need for government involvement in its day-to-day operations... In many other countries of the world, telecommunication providers are closely associated with governments, if not actually run by governments. Thus, it is likely that any privatization approach will bring other governments into the picture either directly or indirectly. A reasoned plan for how the Internet can run, that takes into account the international dimension along with the commercial dimension is critical. I believe the U.S. Government has a responsibility to do the right thing, not the most expedient thing or the most politically acceptable solution of the moment, even if it takes time to discover what it might be."
Emphasis mine. This was Kahn's modest, but very realistic estimate:
"...with several million domain names in existence and the potential for many more in the future, the annual revenue derived from domain name registrations could easily exceed $100,000,000 per year... Although the fee for individual domain name registrations has been $50 per year (it has since been announced that the fees will be reduced somewhat), many individuals and organizations have expressed strong feelings that the existing fee structure and organizational arrangements are untenable in the long term and should be rectified."
Throughout the 1990s, the National Science Foundation (NSF) had been subsidizing NSI's domain name registration service. At some point, the NSF stopped subsidizing NSI, and put them on a pay-as-you-go basis. The NSF contract with NSI was due to expire later that year, in 1998.
This was the crux of the matter! How to assign responsibility for domain name registration going forward: privatized, with regulatory oversight, or with some amount of government involvement, and if the latter, then which governments? And what of the future of IANA?
Postscript: About Digital Object Identifiers
CNRI also provided registry services for an alternative identifier system (known as Digital Object Identifiers, or DOIs), with U.S. and European publishers. DOIs were developed with support from DARPA and were used by the Department of Defense, the Library of Congress and digital library research. They are still used today.
DOI.org was run by Esther Dyson. I thought it was her primary contribution to the Internet. I was wrong. DOIs are very useful. As Robert Kahn said, they are a registry that is a single logical entity, distributed in multiple locations, supporting open interfaces. The handle identifiers are part of this system.
One of my most admired librarians, though he is more of a superuser, or "meta-librarian", is Micah Altman, PhD, of the Massachusetts Institute of Technology, who has led recent work with DOIs since 2007. He is also a non-resident Brookings Institution Fellow. That does not mean that he spends rough nights sleeping between the stacks at the library, while Brookings tries to find him housing. I asked.
Well-capitalized start-up seeks extremely talented C/C++/Unix developers to help pioneer commerce on the Internet. You must have experience designing and building large and complex (yet maintainable) systems, and you should be able to do so in about one-third the time that most competent people think possible. You should have a BS, MS or PhD in Computer Science or the equivalent. Top-notch communication skills are essential. Familiarity with web servers and HTML would be helpful but is not necessary.
Expect talented, motivated, intense, and interesting co-workers. Must be willing to relocate to the Seattle area (we will help cover moving costs).
Your compensation will include meaningful equity ownership.
Send resume and cover letter to Jeff Bezos:
US mail: Cadabra, Inc.
10704 N.E. 28th St.
Bellevue, WA 98004
We are an equal opportunity employer.
"It's easier to invent the future than to predict it." -- Alan Kay
Programmers cross swords
I caught this very amusing programmer contretemps, a Twitter pas de deux, a few days ago (maybe it was a little longer than that... hmmm, scary, I hope I am not losing track of time). The central players were Zed Shaw and Ted Dziuba, ummm... I had to check how to spell his name. Some poor soul, the @centipedefarmer, got caught up in the middle of it, which is what I have tried to embed below.
@jamesiry good times tweetlibrary.com/18438475549889…
— Jerry Chen (@jcsalterego) March 26, 2012
Disturbance in the Wikiverse
I have been wasting way too much time editing Wikipedia. I have listened, been BOLD. Among other things, I started tidying up the Stuxnet Wikipedia article, given that one of its primary means of propagation expired on June 24, 2012 per Mikko H. of F-Secure. I will ALWAYS hold F-Secure in high esteem, by the way. Yet I was pulled, of my own free will and curiosity, into the related Industrial Control Systems (ICS) article, which was quite neglected.
I am actually somewhat qualified to edit an Industrial Control Systems (ICS) article, based on my undergraduate and graduate school education, and my early work experience. No boasting here, just typical female under-confidence and need to justify. Or not "female" per se, just me maybe.
My dilemma: How much information to include?
I really wish there was someone with whom I could speak, get some guidance. I don't want to link to, nor quote what I am guessing are documents that don't need to be part of Wikipedia. I mean, I found them, but there were warnings not to download or copy them.
Since Stuxnet first appeared on the public scene, or at least the publicly accessible infosec community scene, in July 2010, on Brian Krebs' Krebs on Security website (well, he wrote about the findings of VirusBlokAda, a Belarusian security software company), there has been progress. Specifically, progress in creating better standards for Industrial Control Systems security practices and protocols.
Contrary to what SO many of the LOUDEST but least informed seem to think, the people who run nuclear power plants, hydro-electric generation facilities a.k.a. dams, like my favorite Hoover Dam, water treatment plants, oil drilling platforms and heavy equipment manufacturing facilities are NOT incompetent idiots. Nor are they fragmented and hostile, even if in different countries! They have been working together, quietly, slowly, but steadily, with productive output since the fall of 2010.
I am hesitant to link to, or even provide details of the trail of breadcrumbs to, the carefully drafted documents for more secure ICS in this "post-Stuxnet era", and link to them on Wikipedia. My common sense indicates caution, but perhaps it would be just fine; I simply do not know.
Genuine and trustworthy expert insight sought, to no avail
I wanted to ask Eugene Spafford, Ph.D., a.k.a. @TheRealSpaf of Purdue University, a renowned cyber security pioneer, for input. Gene Spafford should be trustworthy and well-informed, probably more so than anyone else I have stumbled over on Twitter. Unfortunately, I am fairly certain that he is quite ill with a serious medical condition, so he is not in the best mood. He definitely doesn't have patience or time for me, and I can't blame him. I tried to signal to him on "open Twitter" that I needed his guidance, but merely succeeded in irritating him, by unintentionally appearing smarmy-obnoxious.
Who else is there to ask? No one is active in such subject matter on Wikipedia anymore e.g. Scadateer hasn't been active in any way since 2010, nor has anyone else. No one seems to be watching the article either. At least with The Periodic Table of the Elements, there are vigilant chemists who watch over the thing night and day. They even told me to go ahead and not be afraid to change anything, as they would quickly clean up after me if I were wrong.
The Industrial Control Systems article is not an isolated case. IBM Sequoia, holder of the June 2012 Top 500 List designation as "The Fastest Supercomputer in the World" is a mess too. I refer to the IBM Sequoia article on Wikipedia (I am rather certain that the physical object is lovingly well-tended). The article sources much of its content from an online article via the BBC. Unfortunately, the BBC article was full of errors. Some were grammatical, which is frightening. Others were content-based, which can happen with supercomputer articles. But that doesn't help me!
First of all, the Wikipedia article is ridiculously brief for an entry of such importance. Why is that? Probably because everyone would rather write about the 9 zillion manifestations of My Little Pony. Or private equity. (There is an obsession with private equity on Wikipedia, and nearly everywhere else that non-finance or C.P.A. or attorney types are found on the internet.) This is partly IBM's fault as well. IBM should spend less time worrying about "social media" and "analysis of BIG unstructured DATA sets" for retail marketing insights. Instead, they need to stop uploading pictures of supercomputers to Flickr and making awful spelling errors that people prior to me have tried to ever-so-politely point out, months earlier, to no avail e.g. "IBM Sequoia Instillation". They should also have someone help me re-write that Wikipedia article!
Does IBM Sequoia require 7.9 MW of power, or 6 MW to operate?
At present, the Wikipedia article has both figures. There are other errors too, of an arithmetic sort e.g. the number of cores, nodes whatever compared to the prior fastest computer in the world, the K Computer. Again, I need someone to ask.
Or: Perhaps there is someone more qualified than me to do this, hein? One would think... but if so, they haven't stepped up yet, and it is now July 21, 2012. The Top 500 results were announced on June 14 or 18, 2012. You'd think one of the teeming hordes of Wikipedia editors would have had a look at this. Instead, Wikimedia seems preoccupied with their ANTI-anti-pornography star chamber meetings.
High performance computing spam
I removed a hilarious bit of spam from the IBM Sequoia article. I can't stop laughing long enough to type this. Under the "Operating System" heading, the spammer said that IBM Blue Gene series supercomputers, which is what Sequoia is, run on Windows XP.
How versatile! Portable! And so very open! Someone please tell Tim O'Reilly!
Yes, that's correct, the Wikimedia Foundation voted to overturn their previous decision to implement pornography controls. Do they have any idea of the kind of pornography that makes its way into Wikipedia? Yes, I realize that their concern is regarding legitimate content, for example, articles about human reproduction, or outre social or cultural practices, and their photographic depiction.
What they don't realize is that spammers and the 4chan folks do TERRIBLE things. I will look for those photos that I found during my first month as a registered Wikipedia user. Huge, as in 5000px by 3000px, high-resolution PNG (and even BMP) format photos uploaded by a rogue Wikipedia editor, of himself. And with multiple women, at the same time, including captions for each and every one. I had no idea what to do. They needed to be removed from Wikimedia Commons ASAP before children were exposed to them, and I couldn't find anyone to tell, or anywhere to even report the problem. I still don't know what to do when I find this sort of thing, or who to tell. They had been inserted in the article for the ancient and world famous Yu-Yuan Gardens in Shanghai. That was my only first-hand experience of this sort, but there have been others, as apparent from the article Talk page for Facebook. I will provide links. If I remember.
To be continued, and parsed into appropriate prior posts when I have a little more time.
Part I: Google Search Appliance
Many of the Google help pages have relevance beyond Google products. The content quality is significant, as it is the result of experience (and probably formal analysis) of internet and network behavior from a point-of-view, at scale, that is rivaled by few others. The following are culled from several GSA (Google Search Appliance) and GSA Blue Mini pages. The intended audience is any Google Enterprise search customer. Let's begin with Google Search Appliance Configuration.
- Load balancer: A software or hardware application that distributes network traffic.
- Failover: Refers to a configuration that typically involves two instances of an application or a particular type of hardware. If the first instance fails, the second instance takes over.
"If a search appliance stops running, the load balancer stops sending requests. To monitor the status of the appliance, configure the load balancer to send periodic search requests. After each request, close the connection. Do not configure the load balancer to monitor status by sending TCP packets to port 80 of the appliance, making a connection, and sending a reset. Using TCP packets this way can cause a search appliance to become unresponsive."
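The guidance above boils down to: probe with a real HTTP search request, close the connection after each probe, and never monitor with raw TCP connect-and-reset packets. A minimal sketch of such a health check in Python (the probe path and query string here are illustrative placeholders, not an official GSA endpoint):

```python
import http.client

def gsa_health_check(host, port=80, probe_path="/search?q=test"):
    """Probe a search appliance the way the GSA docs recommend:
    issue a real HTTP search request, check the response status,
    and close the connection afterward, rather than opening raw
    TCP connections to port 80 and sending resets.
    probe_path is an illustrative placeholder, not an official URL."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    try:
        conn.request("GET", probe_path)
        return conn.getresponse().status == 200
    except OSError:
        # DNS failure, refused connection, or timeout: appliance is down
        return False
    finally:
        conn.close()  # always close; never leave half-open probe connections
```

A load balancer would run something like this on a fixed interval and stop routing to any appliance that returns False.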
- GSA Enterprise Secure Search
The GSA Admin Toolkit is a package of tools for GSA administrators. It is not supported by Google.
The Google GSA SAML Bridge for Windows enables the search appliance to fully access the user's Windows domain login credentials and removes the need for redundant logins. How does Google GSA SAML work? Google SAML Bridge for Enterprise facilitates authentication and authorization for search results, mediating between users and a Windows domain. The SAML Bridge is implemented as an ASP.NET website that resides in IIS. It enables users to gain seamless access to content that resides on file systems, web servers, or Microsoft Office SharePoint servers.
Part II: IPv6 Tools
tcptraceroute6 is a lightweight tool using TCP packets to perform an IPv6 trace route.
It is very similar from the user’s perspective to Michael Toren’s TCP trace route for IPv4.
"The more traditional traceroute sends out either UDP or ICMP ECHO packets with a TTL of one, and increments the TTL until the destination has been reached. ...The problem is that with the widespread use of firewalls, many of the packets that traceroute sends out end up being filtered, making it impossible to completely trace the path to the destination. Often, these firewalls will permit inbound TCP packets to specific ports that hosts sitting behind the firewall are listening for connections on. By sending out TCP SYN packets instead of UDP or ICMP ECHO packets, tcptraceroute is able to bypass the most common firewall filters.... tcptraceroute never completely establishes a TCP connection with the destination host..."
See the TCP traceroute overview and tcptraceroute for IPv6. Both are featured on the Remlab page for ndisc6.
To trace the path to a web server listening for connections on port 80: tcptraceroute webserver
To trace the path to a mail server listening for connections on port 25: tcptraceroute mailserver 25
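The mechanism the excerpt describes (TCP SYN probes sent with a steadily increasing time-to-live, so each router on the path reveals itself) rests on an ordinary socket option. A minimal Python sketch of the knob involved, setting and reading back the TTL without sending any packets at all:

```python
import socket

# tcptraceroute works by sending TCP SYN packets with an increasing IP
# time-to-live (TTL), so each router along the path returns an ICMP
# "time exceeded" for exactly one hop. The TTL it manipulates is an
# ordinary socket option; here we just set it and read it back.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
for hop in range(1, 4):
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TTL, hop)
    ttl = s.getsockopt(socket.IPPROTO_IP, socket.IP_TTL)
    print("probe TTL set to", ttl)
s.close()
```

A real implementation would attempt a connection at each TTL and interpret the ICMP errors that come back; that part requires raw sockets and elevated privileges, which is why tcptraceroute exists as a dedicated tool.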
This entry is motivated by my recently revived (but always extant) interest in chemistry, subsequent to editing the Periodic Table of the Elements and the biographical article for Dmitri Mendeleev on Wikipedia. I learned of the importance of the IUPAC, the International Union of Pure and Applied Chemistry, and followed the IUPAC's International Year of Chemistry web footprint, albeit a year after the fact. The IYC was IYC 2011, not IYC 2012. Yet there is so much to digest and enjoy, even after the fact! One fine example uses an always fascinating data visualization. In this instance, it is applied to that most important of chemical compounds, H2O, otherwise known as water. The Global Experiment of the International Year of Chemistry 2011 culminated in a global water experiment and visualization.
There were several other informative and fun displays, including a geographical mapping visual, although it requires the Google Earth browser plug-in in order to fully appreciate.
Other chemistry-related items
I guess it should not have surprised me that chemists are often historians as well as stamp collectors, or philatelists. My post covers all of that, but not much more. Well, a few things: the discovery of two new elements, a chemistry tattoo and a silly cartoon.
Open source publish and subscribe
Subscribe to anything, whether RSS or Atom feed, or custom syndication, with PubSubHubbub. According to the Google Code website where it is hosted as an open source project, PubSubHubbub is a
server-to-server web-hook-based pubsub (publish/subscribe) protocol as an extension to Atom and RSS. Parties (servers) speaking the PubSubHubbub protocol can get near-instant notifications (via webhook callbacks) when a topic (feed URL) they're interested in is updated.
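In practice, a subscriber registers interest by POSTing a small form-encoded request to the hub, naming the feed it wants to watch and the callback URL where notifications should arrive. A sketch of that subscription payload (the URLs below are placeholders, not real endpoints):

```python
from urllib.parse import urlencode

# A PubSubHubbub subscriber asks the hub for notifications by POSTing a
# form-encoded subscription request. The topic and callback URLs here
# are illustrative placeholders.
params = {
    "hub.mode": "subscribe",
    "hub.topic": "http://example.com/feed.atom",       # the feed to watch
    "hub.callback": "http://example.com/phh-callback", # where pings arrive
    "hub.verify": "async",                             # hub verifies intent
}
body = urlencode(params)
print(body)
```

The hub then calls the subscriber's callback URL to verify intent, and afterwards delivers near-instant webhook notifications whenever the topic feed updates.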
Publish and subscribe at Microsoft
What is PubSub on Microsoft Codeplex
? Is that a version of PubSubHubbub implemented successfully for Microsoft internal usage? It is active as of November 2011.
I think this is more likely: PubSubHubbub is a specific case of a web-hook-based publish/subscribe protocol, and Microsoft's PubSub is another specific case. The two might differ in details, or be much the same.
I am very fond of Beethoven's Ode to Joy (from the Ninth Symphony), both instrumental and choral versions. My father played it often on Deutsche Grammophon Gesellschaft cassette tapes. I played some appalling adaptations that were probably a travesty of the original, arranged for solo beginner flutists (flautists?) too.
I finally read a translation of the German chorus on YouTube a few years ago, and was stunned by the catholic, ecumenical words. (There was no mention of Jesus, which tends to put me off of a lot of traditional religious music. When Jesus is mentioned, it just reminds me how the music isn't intended for me, if you know what I mean). As a result, I have grown even fonder of Ode to Joy, and often sing along. That is probably an even worse travesty...
A few days ago, there was a new question on Musical Practice & Performance StackExchange, about stage position of choral soloists, particularly for Beethoven's Missa Solemnis (a lengthy work written for Roman Catholic Mass) which made me think of Ode to Joy last night. I don't know the answer to the Music StackExchange question. Yet I found a pleasant entry on the subject from National Public Radio online, from nearly six years ago. The following is a summary of the key points, along with my own asides, of course. At the end there is an NPR recording of 50 seconds of Ode to Joy.
* Verbatim excerpts are quoted in green font with grey shaded background, the rest is mine.
Transcript 'Missa Solemnis: A Divine Bit of Beethoven', NPR (12 February 2006)
Beethoven was nominally Catholic but did not attend church regularly.
God, however, interested Beethoven a lot. Beethoven read books about Eastern religions and revelations of the divine and of nature.
He was particularly fond of, and often quoted to friends, this phrase of Immanuel Kant:
The starry skies above and the moral law beneath.
Beethoven also evoked the divine with three aphorisms which he said were from ancient Egypt.
I was thinking that the first sounded somewhat like something Donald Rumsfeld said. These were the sayings:
I am that which is.
I am all that is, that was and that will be.
No mortal man has lifted my veil. He is solely found, himself, in all things owe their being to him alone.
I can't figure out the meaning of the third. It has lost something in transcription or translation, probably.
The most surprising thing in Beethoven's Missa Solemnis is the last movement... That culminating section of the Roman Catholic Mass is a prayer for peace. The last words are
dona nobis pacem ... (give us peace)
The God that Beethoven intimates in the Missa Solemnis is not strictly Roman Catholic nor even Christian. Rather, He is pantheistic and all-encompassing, yet not tangibly present in our physical world.
Divinity is beyond, in the stars. Humanity is here on Earth.
NPR offers a brief recording from Missa Solemnis, 'Dona Nobis Pacem' from the Agnus Dei, which I found rather nerve-wracking and overwrought. But the link is there, for those who may be interested.
The Missa Solemnis comes down to an unanswered prayer. Whether God has heard us, we don't know, but we do know that in the distance the drums of war are still beating.
Did Beethoven ever provide an answer to this unprecedented open ending? I believe he did. His answer is the Ninth Symphony. The famous choral finale of the Ninth Symphony is based on Schiller's Ode to Joy, written at a time of revolution.
Those words and Beethoven's music, call for humankind to bow to the Creator regarding the heavenly, the celestial.
For answers, turn to one another.
In the Ninth Symphony, Beethoven proclaims that as comrades, brothers and sisters, husbands and wives, we can unite to celebrate Joy, the beautiful daughter of Elysium. And that the path to peace is bestowed not from above but from within us and among us in universal brotherhood here on Earth.
That was Beethoven's reply to the unanswered prayer of the Missa Solemnis.
There may be an unavoidable moment of loud advertising preceding each recorded passage. Today it was for a Lexus.
I recommend the choral passage in this case, even though I'm no fan of opera, in general.
Many other photos and further information about this project may be found there.
Update (31 December 2012)
I read an article in The Wall Street Journal today. There seems to be a harmonic convergence of activity related to Amazon dot com. According to The Wall Street Journal, The Long Now is a project funded and sponsored by Jeff Bezos. It is not open to the public.