
Netizens-Digest Volume 1 Number 528


Netizens-Digest       Saturday, October 11 2003       Volume 01 : Number 528 

Netizens Association Discussion List Digest

In this issue:

[netz] Architectural issues involving Sitefinder & related functions
Re: [netz] Fwd: VeriSign Capitulates posts from the North American Network Operators Group
Re: [netz] Architectural issues involving Sitefinder & related functions
Re: [netz] Fwd: VeriSign Capitulates posts from the North American Network Operators Group
[netz] 10,000 foot view of DNS/Sitefinder/Verisign
[netz] Internet and epistemic communities

----------------------------------------------------------------------

Date: Tue, 7 Oct 2003 01:25:37 -0400
From: "Howard C. Berkowitz" <hcb@gettcomm.com>
Subject: [netz] Architectural issues involving Sitefinder & related functions

(Since I haven't yet received my enrollment confirmation, it seemed
appropriate to crosspost this to NANOG. While I will address
Sitefinder, there are broader architectural and operational issues.)

Let me assume, for the sake of this discussion, that Sitefinder is an
ideal tool for the Web user, helping with the problem of
not-quite-correct URLs. Given that, I'll stipulate in this
discussion that the implementation of Sitefinder, along with the .com
and .net wildcards that lead to it for unresolved domains, is a true
benefit for the Web user.

The Internet, however, is more than the World-Wide Web. It seems only
logical to be able to discuss Sitefinder in two contexts:

1. Where it becomes the default, as with the recent Verisign
wildcards

2. Where it is reached in some other manner.
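
As a rough illustration of the difference these two contexts make at the
resolver level, here is a minimal sketch in Python. It assumes the
dnspython library, and the mistyped domain name is purely hypothetical;
the point is only that under context #1 the NXDOMAIN branch can never be
reached for .com and .net names.

import dns.resolver

def classify_lookup(name: str) -> str:
    """Roughly classify what a resolver reports for a name."""
    try:
        answer = dns.resolver.resolve(name, "A")
    except dns.resolver.NXDOMAIN:
        # Traditional behavior: the registry says the name does not exist,
        # and any application can act on that signal immediately.
        return "NXDOMAIN (name does not exist)"
    except dns.resolver.NoAnswer:
        return "name exists but has no A record"
    # Context #1 (registry wildcard): the lookup "succeeds" and returns the
    # catch-all address, so the explicit error signal is gone.
    return "resolved to " + ", ".join(r.address for r in answer)

print(classify_lookup("no-such-domain-example-12345.com"))  # hypothetical typo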

My architectural concern is defining a way in which context #1 serves
the _non-Web_ services of the Internet. If DNS were purely an
information service for Web users, the architectural conflict would
go away, and only commercial and policy issues would remain.

I would hope that within the scope of the Sitefinder discussion list,
or alternatively in another forum, there is an approach to reconciling
the IP-level DNS so that it continues to serve non-Web applications.

Is there disagreement that Sitefinder provides no functionality to
SMTP trying to deliver to an unresolved domain? To a user who
mistypes the name of an FTP site and does not intend to use a Web
browser?

What about failover schemes for non-HTTP cooperative research across
the Internet, where the inability to resolve a host name (assume that
cached records have a zero lifetime) triggers selection of an
alternate server?
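
For the failover case, a minimal sketch of the pattern being described,
assuming Python's standard socket module; the primary and backup host
names and the port are hypothetical. The fallback depends entirely on
name resolution failing:

import socket

PRIMARY = "primary.research-example.com"   # hypothetical service hosts
BACKUP = "backup.research-example.org"
PORT = 4242                                # arbitrary illustrative port

def pick_server() -> str:
    """Choose a server, treating non-resolution as 'primary is gone'."""
    try:
        socket.getaddrinfo(PRIMARY, PORT)
        return PRIMARY
    except socket.gaierror:
        # Without a wildcard, an unresolvable primary reliably triggers
        # failover to the backup.
        return BACKUP

# With a .com wildcard in place, getaddrinfo() on a lapsed or deleted
# PRIMARY returns the registry's catch-all address instead of failing, so
# the except branch is never taken and traffic goes to a host that does
# not run the service at all.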

Seriously, technical people at Verisign may have thought about this
and actually have suggestions. They may be very good ones, but,
judging by the reactions to the Sitefinder deployment, it might be
well to discuss them in open technical forums before a change is made.

I'm really not trying to make it a matter of personalities, but there
have been public statements by Verisign executives that such a
process inhibits innovation. If Verisign policy is that as operator
of .com and .net, it has the right to make unilateral changes, I
think that needs to be clear to all concerned. I recognize that a
number of independent parties suggest that the ICANN contract does
not explicitly prohibit such unilateral action.

Ironically, I worked with the original founders of Network Solutions,
and was almost a principal back when it was a couple of rooms in
McLean. Gary Desler, one of the founders and a fine engineer, always
used to say "there is no technical solution to a management problem".
In the current context, I simply want to know the rules for the
playing field.

------------------------------

Date: Tue, 07 Oct 2003 12:40:58 +0200
From: Alexandru Petrescu <petrescu@nal.motlabs.com>
Subject: Re: [netz] Fwd: VeriSign Capitulates posts from the North American Network Operators Group

Howard C. Berkowitz wrote:
> This has also spread to the Internet Law list of the American Bar
> Association;

Which is at...?

Alex
GBU

------------------------------

Date: Tue, 07 Oct 2003 12:48:17 +0200
From: Alexandru Petrescu <petrescu@nal.motlabs.com>
Subject: Re: [netz] Architectural issues involving Sitefinder & related functions

Howard C. Berkowitz wrote:
> Is there disagreement that Sitefinder provides no functionality to
> SMTP trying to deliver to an unresolved domain? To a user who
> mistypes the name of an FTP site and does not intend to use a Web
> browser?

HTTP clients, FTP clients, and SMTP sendmail queues are still only a small
part of the entire range of client software that users run. Content
streaming (such as video) is another widely used type. Peer-to-peer
applications, database access and others come to mind too.

I assume that enhancements in the way DNS replies to this myriad of
clients can only work if all of those clients are modified to support the
enhancements. Modifying each and every one of them looks to me like a
Herculean task, though of course it can be done.
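
To make the scale of that task concrete, here is a sketch of the kind of
per-client workaround implied, again in Python. The wildcard address set
below is a documentation-range placeholder, not Site Finder's actual
address, and resolve_or_fail is a hypothetical helper:

import socket

# Placeholder only (192.0.2.0/24 is reserved for documentation); a real
# client would have to know, and keep tracking, the registry's actual
# wildcard address(es), which can change without notice.
WILDCARD_ADDRS = {"192.0.2.1"}

def resolve_or_fail(host: str, port: int):
    """Resolve a host, treating a wildcard-synthesized answer as NXDOMAIN."""
    infos = socket.getaddrinfo(host, port)
    addrs = {info[4][0] for info in infos}
    if addrs and addrs <= WILDCARD_ADDRS:
        raise socket.gaierror(socket.EAI_NONAME,
                              "answer synthesized by registry wildcard")
    return infos

# Every HTTP, FTP, SMTP, streaming, peer-to-peer and database client would
# need an equivalent check, which is the Herculean task referred to above.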

Alex
GBU

------------------------------

Date: Tue, 7 Oct 2003 09:25:33 -0400
From: "Howard C. Berkowitz" <hcb@gettcomm.com>
Subject: Re: [netz] Fwd: VeriSign Capitulates posts from the North American Network Operators Group

>Howard C. Berkowitz wrote:
>>This has also spread to the Internet Law list of the American Bar
>>Association;
>
>Which is at...?
>
>Alex
>GBU

Send a message to listserv@abanet.org
with "subscribe ST-ISC" in the body

------------------------------

Date: Wed, 8 Oct 2003 09:50:57 -0400
From: "Howard C. Berkowitz" <hcb@gettcomm.com>
Subject: [netz] 10,000 foot view of DNS/Sitefinder/Verisign

After attending the afternoon ICANN Security & Stability Committee
meeting, I realized that the issues involved fall into several
related but independent dimensions. Shy person that I am *Cough*, I
have opinions on all of them, but I think it's worthwhile simply to be
able to explain the Big Picture to media and other folk who aren't
immersed in our field.

In these notes, I'm trying to maintain neutrality about the issues. I
do have strong opinions about most, but I'll post those separately,
often dealing with one issue at a time. For those of you new to the
media, it's often best to put things into small, related chunks.

1. Governance issues
- --------------------

Did Verisign have the right, regardless of technical merit, to do
what it did without prior warning? I'm simply asking "did they do
anything contractually or otherwise legally forbidden", not "was it
strongly counter to the assumptions of the Internet" or "were they
mean and nasty."

The news/political interest here is whether any other group should or
could have affected this, or if we need new governance mechanisms.

Has this revealed any conflict of interest issues? To what extent
should a registry be able to act unilaterally? These points are
meant to be examined here in the context of law, regulation and
governance, as opposed to the less formal points in #2.

2. Process (slightly different than governance) issues.
- ------------------------------------------------------

Moving away from the letter of their contracts, what should they
have done (if anything) about open comment and forming consensus?
This is vaguely making me wonder if they had evidence of
WMDs....oops, wrong controversy.

Assume they had no requirement for prior discussion. What, if any,
requirements did they have for testing and validating their approach,
given that a top-level registry is in a unique connectivity position
with special privileges?

3. Internet architectural impact (slightly different than effects on
innovation and/or effect on existing software).
- --------------------------------------------------

I think it's reasonable to state that Sitefinder, and changes of
"internal" behavior, violate at least the traditional end-to-end
and robustness principles. This should be considered in the spirit
of the core vs. end state discussion in RFC 3439, and the
architectural work going into middleboxes.

A general question here: to what extent is it important that the
Internet be consistent with its relatively informal architectural
assumptions? When teaching, I rarely find even the newer technical
folk aware of the architecture work; they think "7 layers" is the
ultimate answer [1].

[1] I spent over five years of my life in OSI research, development and
promotion. We may have had the answer, but, unfortunately, we never
could articulate the question. That is a lesson here.

4. Is the Internet the Web? Are all Internet users people?
- ----------------------------------------------------------
I don't think it's unfair to say Sitefinder is web-centric. The
current responses may be useful for people who can interact with it.
Apparently, there are patches that will help with mail response and
even anti-spamming tools.

But what of other protocols, especially those intended to run without
human intervention? What about failover schemes that employ DNS
non-resolution as an indication that it's time to pick an alternate
destination?
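
For the mail case specifically, here is a much-simplified sketch of the
routing decision an MTA makes, assuming the dnspython library; the
recipient domains fed to it are hypothetical. The difference is between
an immediate bounce and a delivery attempt against a host that was never
meant to receive the message:

import dns.resolver

def delivery_decision(domain: str) -> str:
    """Grossly simplified MTA decision for a recipient domain."""
    try:
        dns.resolver.resolve(domain, "MX")
    except dns.resolver.NXDOMAIN:
        # Without a wildcard, a mistyped domain yields NXDOMAIN and the MTA
        # can bounce at once with a permanent error; the sender learns of
        # the typo immediately, with no human intervention needed.
        return "bounce now (permanent error)"
    except dns.resolver.NoAnswer:
        # Name exists but carries no MX; SMTP convention is to fall back to
        # the A record. With a wildcard, mistyped .com/.net domains end up
        # here (or in the success path), never in the NXDOMAIN branch.
        pass
    # The MTA queues the message and attempts delivery to whatever host the
    # answer points at; the sender sees, at best, a delayed failure.
    return "queue and attempt delivery"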

Is the apparent trend to move from "everything over IP" to
"everything over HTTP" a good one? _Could_ it be a good one in
well-defined subsets of the Internet?

5. Effects on innovation
- ------------------------

Innovation, and the stifling of innovation, have come up quite a bit. If one
looks at the End-to-End Assumption, the historic perspective is that
the "killer apps" appear at the edges and depend on a consistent
center (e.g., web and VoIP, the latter with a QoS-consistent center
[2]). Development in the core tends to be more evolutionary and
subject to discussion (e.g., CIDR). Other development in the core
tends to be with the implementations (e.g., faster routers and lines).

[2] Remember that the access links to an ISP usually aren't the QoS
problem. Once you get to the POP, voice and other delay-critical
services can go onto VPNs or other QoS-engineered alternatives to
the public Internet.

Verisign says Sitefinder is innovative, and let's assume that it is.
But, if so, it's an innovation in the core, which is not the
"time-proven way". When I speak of time-proven, I certainly don't
mean that there isn't innovation -- this message did NOT reach you
over a 56 Kbps line between IMPs.

Internet Explorer, for example, has a means of dealing with domain
typos, but it is contrary to the way Sitefinder does things. IE also
does it at the edge. How do we deal with potential commercial wars
between the edge and core as far as competition for innovation?

6. Stability
- ------------

Assume that Sitefinder and the associated mechanisms are ideal. In
such a case, users would expect it. Unless a large number of users
learn to spel and tipe gud, these instances will be points of heavy
traffic.

What are the availability requirements to make the service
dependable? This includes clustered servers at individual nodes, as
well as distributed nodes. There has to be sufficient bandwidth to
reach the nodes, and even if the node has adequate connectivity
bandwidth, there are subtle congestion issues. It was pointed out
that wireless implementers, used to expecting a small error message
in their bandwidth-limited edge environments, are less than thrilled
about getting a 17K HTML response.
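
To put a rough number on that wireless complaint, a back-of-the-envelope
calculation, where the 9.6 kbps rate is an illustrative assumption
(circuit-switched GSM data of that era) and a terse protocol error is
taken as a few hundred bytes:

RESPONSE_BYTES = 17 * 1024   # the ~17K HTML page mentioned above
ERROR_BYTES = 256            # a typical short protocol error message
LINK_BPS = 9600              # assumed wireless link rate, bits per second

def transfer_seconds(num_bytes: int, bps: int) -> float:
    return num_bytes * 8 / bps

print(f"17K HTML page: {transfer_seconds(RESPONSE_BYTES, LINK_BPS):.1f} s")
print(f"short error:   {transfer_seconds(ERROR_BYTES, LINK_BPS):.1f} s")
# Roughly 14.5 seconds versus about 0.2 seconds, before protocol overhead.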

Remember, if these concepts prove themselves in .com and .net, users
will expect them in all TLDs -- or we get to the generally
undesirable situation of different behavior in different domains.
Let's assume Verisign has an adequate track record of running
reliable servers -- but what would be the requirements for a new
operator of .com and .net, for people expecting the Sitefinder
functionality? In a new TLD, what has to be the support on Day 1?

A very different question is whether business models associated with
this service are sufficiently robust to be sure it stays present once
users expect it.

------------------------------

Date: Wed, 08 Oct 2003 11:53:05 -0400
From: Mark Lindeman <lindeman@bard.edu>
Subject: [netz] Internet and epistemic communities

I'm not sure how far this thread will go, but it certainly seemed time for
a new subject line.

Howard had written, in small part, that "there is a very real sense of
community -- or meritocracy -- among a group of people who live by
electronic communications." I related this to the idea of "epistemic
communities" in the political science literature, and mentioned that some
people have used "global Internet community" and "global epistemic
community" interchangeably. Howard responded in part:

>After my own google, I like the discussion at
>http://www.svet.lu.se/webcourses/webkurser/002_Politisk_kommunikation/Grundlaeggande/Extra_resurser/Sem6_resurser/epistcomm.pdf
>
>This brings up some immediate questions beyond the original point of my
>thread. Assuming the Internet engineering community forms an epistemic
>community A, do our definitions of "Netizen" meet the criteria for such a
>community B? If so, what is the relationship of A and B? Overlapping? A
>is a subset of B? Disjoint sets, if A's technocratic barriers to entry are
>emphasized?

The PDF to which Howard refers is a handy short primer from the standpoint
of political scientists in the field of international relations. It is
convenient to quote Peter Haas's definition as reproduced there:

>An epistemic community is a network of professionals with recognized
>expertise and competence in a particular domain and an authoritative claim
>to policy-relevant knowledge within that domain or issue-area. The
>professionals may be from a variety of disciplines and backgrounds but
>must have:
>1) a shared set of normative and principled beliefs, which provide a
>value-based rationale for the social action of community members;
>2) shared causal beliefs, which are derived from their analysis of
>practices leading or contributing to a central set of problems in their
>domain and which then serve as the basis for elucidating the multiple
>linkages between possible policy action and desired outcomes;
>3) shared notions of validity - that is, intersubjective, internally
>defined criteria for weighing and validating knowledge in the domain of
>their expertise;
>4) a common policy enterprise - that is, a set of common practices
>associated with a set of problems to which their professional competence
>is directed, presumably out of the conviction that human welfare will be
>enhanced as a consequence.

I believe that Haas originally developed the term as a way of describing
the role that environmental scientists had played in shaping negotiations
on water pollution in the Mediterranean. In the mid-90s I got interested
in applying the concept to global climate change, but never got around to
trying to publish anything -- I haven't even gotten around to finding out
who _did_ go ahead and publish on that topic.

To Howard's questions (with the disclaimer that every sentence that follows
could be liberally larded with "maybe"s and "IMHO"s): Based on Howard's
characterizations, I believe that the Internet engineering community (1)
can be described as an epistemic community under Haas's definition, and (2)
is an important subset of Netizens, defined as Internet users who try to
contribute to the Internet's use and growth. Many Netizens primarily make
'social' contributions to the Internet; the engineering community also and
crucially makes technical contributions, which take form through a distinct
social process. (These technical contributions require professional
expertise -- not necessarily a formal credential, but demonstrated
competence within the domain.) Moreover, many other epistemic communities
thrive on the Internet, and function as Netizens in so doing.

Note that Haas defines epistemic communities as networks of _professionals_
not to be elitist, but on the premise that their professional standing is
the source of their authority in policymaking. Many Internet communities
function as 'knowledge communities' [1] without striving for
policy-relevance. It may well be that some Internet communities develop
policy-relevant expertise as they go.

[1] "Epistemic" comes from the Greek for "knowledge"; "scientific" comes
from the Latin. Haas wanted a phrase that would describe not "the
scientific community" but a certain kind of knowledge community, and this
is what he came up with [or perhaps borrowed from someone else].

There are many, plural, global epistemic communities ("global" meaning
geographically extensive, not universal or even necessarily large in
number). Google just gave me some 8100 hits for "global Internet
community," which seems to be used sometimes as a plural concept ('creating
a global Internet community of doctors') and sometimes as a singular
concept (a delegee "has a duty to serve the residents of the relevant
country or territory, as well as the global Internet community" --
http://www.noie.gov.au/publications/speeches/twomey/ccTLD/tsld013.htm
). "Global epistemic community" seems generally to be used as a plural
concept ('the global epistemic community of climate scientists'), but in
the essay I mentioned previously, at
http://www.casayego.com/webconf/papers/2001bugs/2001bugs.htm , the author
seems to equate "the global Internet community" with "the global epistemic
community," citing another paper. The idea of a singular global epistemic
community doesn't seem very fruitful to me at first glance, but I may be
misunderstanding the argument.

It doesn't really matter what the True Definition of "epistemic community"
is, but I'm interested because of the distinctive role that Internet
engineers seem to play in protecting the terrain on which we function as
Netizens. Are we troubled by the thought that we depend on the
contributions of technocrats? (I had written, "are at the mercy of
technocrats," but that seemed both hyperbolic and rude!) Across a wide
range of issues, citizens must hope that experts will 'use their knowledge
only for good'. A recent book by two U.S. political scientists argues that
in the United States, at least, most citizens would actually prefer the
government to be run by benevolent technocrats, if only we could find
any. (I do not mean "benevolent _dictators_": in the U.S. even more than
most other countries, citizens emphatically support limiting the scope and
powers of government.) In some sense, we seem to have found some
benevolent technocrats (again, not dictators) to "run" the Internet, and/or
to intercede influentially (as in Haas's framework) with the
authorities. The Internet is intensely "democratic," but certainly in
nothing approaching the strong sense that we all can participate
meaningfully in ICANN committee meetings.

Sigh. Every time I try to fill in the obvious gaps and misdirections in
what I've written (the ones obvious to _me_, never mind anyone else), the
post gets longer. This might mean that there is room for a careful paper
that tries to develop some of these themes more systematically, but I need
to canvass the prior art. I just received the following invitation [APSA
is the American Political Science Association]:

>This is just a reminder of the Information Technology Politics (ITP)
>Section call for papers for the 2004 APSA conference. This year's
>conference theme is "Global Inequalities". The section encourages
>proposals that consider the evolving role of information technology at
>the local, national, and/or global level, as well as the theoretical and
>policy implications for interactions between these levels of governance.
> Such topics include, but are in no means limited to e-government,
>e-democracy, digital divide, intellectual property rights, activism,
>representation, and rule making. [...]

Mark Lindeman

------------------------------

End of Netizens-Digest V1 #528
******************************

