[1]Current Cites (Digital Library SunSITE)

Volume 12, no. 4, April 2001

Edited by [2]Roy Tennant

The Library, University of California, Berkeley, 94720
ISSN: 1060-2356 -
http://sunsite.berkeley.edu/CurrentCites/2001/cc01.12.4.html

Contributors: [3]Charles W. Bailey, Jr., [4]Margaret Gross, [5]Shirl
Kennedy, [6]Leo Robert Klein, [7]Eric Lease Morgan, [8]Roy Tennant

Issue Spotlight: Freeing the Research Literature

This topic isn't new, but when Science, Nature, and Scientific
American all weigh in on the same topic, you get the sense that
something big is afoot. And there is. A number of scientists and
researchers are as mad as hell and they're not going to take it
anymore. What are they not going to take? It's probably best to go to
the [9]Public Library of Science site and find out for yourself. But
in a nutshell, they no longer want to give away their intellectual
content to publishers and have publishers lock it up in perpetuity for
everyone except those who pay for access. They're calling for their
published work to be freely available six months after publication.
Read on to find out more.

Butler, Declan, editor. [10]"Future E-Access to the Primary
Literature" [11]Nature (April 27, 2001).
(http://www.nature.com/nature/debates/e-access/). - This Nature "web
debate" and the recent attention of Science and Scientific American on
this same topic (see other cites in this issue) mean that major
scientific publications are waking up to the fact that there is a
revolution in their midst. Faculty and researchers are no longer
complacent about what one researcher has termed the "Faustian bargain"
of giving up copyright in an effort to obtain tenure. Neither are they
complacent about the amount of money libraries are being charged to
buy back their intellectual effort. I have no idea where the chips may
fall, but fall they must, and discussions such as these can only serve
to shed light on the possibilities for change and the positions of the
antagonists. Be forewarned, this debate has many contributions, from
many different perspectives. You could easily spend a day or more
reading, sifting, and thinking about what the future may hold for
scholarly communication. - [12]RT

Karow, Julia. [13]"Publish Free or Perish" [14]Scientific American
(April 23, 2001)
(http://www.sciam.com/explorations/2001/042301publish/). - Karow pens
a readable and interesting overview of the controversy surrounding the
[15]Public Library of Science open letter calling for publishers to
make scientific journal articles freely available six months after
publication. Read this before diving into the debates in Science and
Nature on this issue, and you'll have a good introduction to the
players and the issue. - [16]RT

Roberts, Richard J., et al. [17]"Information Access: Building A
'GenBank' of the Published Literature" [18]Science 291(5512)
(23 March 2001): 2318-2319
(http://www.sciencemag.org/cgi/content/full/291/5512/2318a) and The
Editors [Science]. [19]"Science's Response: Is a Government Archive
the Best Option?" [20]Science 291(5512) (23 March 2001):
2318-2319 (http://www.sciencemag.org/cgi/content/full/291/5512/2318b).
- The first piece is a group of scientists calling for free and open
access to scientific literature six months after publication, and for
the centralization of this material in a common repository. This is
not just a small group of scientists calling for this; as of this
writing, more than 15,000 have signed on. The "movement" to free the
scientific literature is called the [21]Public Library of Science. To
enforce their call for change, they propose a boycott of journals that
do not comply. The boycott, scheduled to begin in September 2001, would
cover not just article contributions but also editing, reviewing, and
personal subscriptions. In the second cited
piece, the editors of Science suggest a somewhat different strategy to
achieve some of the same ends. Rather than having all scientific
publishers submit their content to a central repository, the Science
editors favor a distributed model, where publishers retain their
content but it can be searched at a central location. The editors also
predictably raise economic questions and other concerns. Meanwhile,
they plan on making the research reports and articles of Science
freely available after a year (not the six months advocated by Roberts
and his colleagues), on their own web site, not in a central
repository. It will be interesting to see what happens come September,
but this is a war of unknown duration and it has only just begun. -
[22]LRK and [23]RT

---------------------------------------------------------------------

Anderson, Kent, John Sack, Lisa Krauss, and Lori O'Keefe.
[24]"Publishing Online-Only Peer-Reviewed Biomedical Literature: Three
Years of Citation, Author Perception, and Usage Experience." [25]The
Journal of Electronic Publishing 6(3) (March 2001)
(http://www.press.umich.edu/jep/06-03/anderson.html). - Back in 1997,
an online-only section of [26]Pediatrics, the journal of the American
Academy of Pediatrics, was established and made available at no cost
on the Internet. In this research study, Anderson and his coauthors
analyze Web use statistics, citation data, and author perceptions to
gauge how well the online-only section of the journal stacks up
against the print section for the period 1997-1999. On the negative
side, the results show that the online-only section faces an uphill
battle when it comes to author perceptions (e.g., they see it as a
"second-tier" publication), online-only articles get fewer citations
compared with their print counterparts, and they are not cited any
more quickly than print articles. On the positive side, online-only
articles were included in authors' resumes, tenure committees accepted
them, they were indexed like print articles, their Web use was higher
than that of electronic copies of print articles, their Web use decayed
over time in the same way as that of print articles, and they were
significantly cheaper to publish. - [27]CB

Berkman, Eric. [28]"When Bad Things Happen to Good Ideas" [29]Darwin
(April 2001)
(http://www.darwinmag.com/read/040101/badthings_content.html). -
Coincidence? Irony? It seems like the phrase "knowledge management"
started its ascent into the realm of corporate buzz just about the
same time many companies were downsizing and/or eliminating their
libraries. This article provides some insight into how these phenomena
might be related. As the author explains, by way of cruising the
exhibit floor and commenting on products being hawked at the
KMWorld2000 trade show, "In many cases KM devolved into a purely
technical process, resulting in expensive software implementations
sitting unused by oblivious, fearful or resentful employees."
Executives watching this happen have become increasingly wary of the
whole KM concept, perceiving it as overhyped and/or "a total bust."
The article goes on to describe the evolution of knowledge management
as a discipline, and suggests that one big reason it has failed to
perform as anticipated is because IT departments have been put in
charge, resulting in a technical rather than a user-oriented focus. -
[30]SK

Berners-Lee, Tim, James Hendler, and Ora Lassila. [31]"The Semantic
Web" [32]Scientific American 284(5) (May 2001):35-43
(http://www.scientificamerican.com/2001/0501issue/0501berners-lee.html
). - Imagine the following reference question. "I met a person at ALA.
Their last name was Cook, but I don't remember their first name. I do
remember they worked for an ARL library and their son attends my alma
mater, Bethany College. What is Cook's email address?" In order to
answer this question with the given information you would need to know
the email address of all the Cooks at ARL libraries who also have a
son at Bethany College. According to Berners-Lee, the Semantic Web
would be able to answer such a question. "The Semantic Web will bring
structure to the meaningful content of Web pages, creating an
environment where software agents roaming from page to page can
readily carry out sophisticated tasks for users." It sounds like
science fiction, but through the use of ontologies -- a document or
file that formally defines the relationship between terms --
interconnections can programmatically be made between Web pages and
conclusions can be drawn. These ontologies are expressed using the
[33]Resource Description Framework (RDF). For me, the process is similar
to library work. First, we collect data and information. Second, we
classify it using our own ontologies and make the materials
available to users. Finally, we access a particular piece of this
information and find similar pieces through the use of the
classification scheme. The key is a thorough classification system and
its implementation. The Semantic Web is a proposal for this sort of
implementation on a much wider scale. It is not really cataloging the
Web. Rather, it is describing items on the Web using a uniform syntax
(RDF) and a variety of classification schemes agreed upon by discrete
populations (ontologies). This article is a good read; it provides an
interesting spin on the Web for librarians and librarianship. -
[34]ELM
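
As an illustration only (not part of the article): the "Cook" question
above could, in principle, be encoded as RDF statements and answered by
software. The following minimal Python sketch uses the rdflib library
and entirely invented URIs, properties, and data to show the general
idea of describing items in a uniform syntax and then asking structured
questions of those descriptions.

    # Minimal sketch; rdflib is assumed installed, and all URIs/data are invented.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import FOAF, RDF

    EX = Namespace("http://example.org/terms/")        # hypothetical "ontology" namespace
    cook = URIRef("http://example.org/people/cook")    # invented identifier for "Cook"

    g = Graph()
    g.add((cook, RDF.type, FOAF.Person))
    g.add((cook, FOAF.name, Literal("Cook")))
    g.add((cook, FOAF.mbox, URIRef("mailto:cook@example.org")))
    g.add((cook, EX.worksFor, URIRef("http://example.org/orgs/an-arl-library")))
    g.add((cook, EX.sonAttends, URIRef("http://example.org/schools/bethany-college")))

    # Ask the graph: who works at that ARL library and has a son at Bethany
    # College, and what is that person's mailbox?
    query = """
        SELECT ?mbox WHERE {
            ?person ex:worksFor <http://example.org/orgs/an-arl-library> ;
                    ex:sonAttends <http://example.org/schools/bethany-college> ;
                    foaf:mbox ?mbox .
        }
    """
    for row in g.query(query, initNs={"ex": EX, "foaf": FOAF}):
        print(row.mbox)   # -> mailto:cook@example.org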

Broughton, Kelly. [35]"Our Experiment in Online, Real-Time Reference"
[36]Computers in Libraries 21(4) (April 2001)
(http://www.infotoday.com/cilmag/apr01/broughton.htm). - A report from
the front lines, this article describes the system used and what it's
like to be expected to respond right now, without the benefit of
face-to-face signals. At [37]Bowling Green State University, where the
author is a reference coordinator, they chose [38]HumanClick to begin
their experiment with online chat reference. A major problem was
system incompatibility for users on Macs; a major benefit was a
recently added HumanClick feature that allows the reference staff to
"can" brief messages and draw upon prepared responses (only when
appropriate, of course, but it must be tempting to abuse this
feature). The author also liked the ability to send chat users the
appropriate web pages so they can be viewed as they would be in a
reference session at the library. The fact that this was all free was very
attractive, but after HumanClick announced fees, they shopped around
and bought the [39]Virtual Reference Desk package, and will come
online with it any time now. A good case study for library
organizations kicking this idea around. - JR

Chapman, Stephen. "Content Follows Form: Preservation via Systems
Design" Microform & Imaging Review 30(1) (2001). - One day recently I
was listening to my local public radio station, and heard an
"interview" (love-fest is actually more like what it was), with
Nicholson Baker -- a library gadfly who, among other things, protested
the destruction of card catalogs as if they were a vast treasure trove
of unrecoverable information. Now he has moved on, and is presently
attacking the practice of replacing decaying newsprint with
preservation microfilm. His new book Double Fold: Libraries and the
Assault on Paper apparently reads like a whodunit, complete with
theories of conspiracy and evil intent. I say "apparently" because I
haven't yet brought myself to buy it and thereby send royalties in
his direction. But I digress. The reason I bore you with this
(although stay tuned, Baker's book may be reviewed in a future issue
of Current Cites) is that Chapman's article landed on my desk the next
day and seemed to be a near-perfect antidote to Baker's polemic. In
his usual thoughtful and learned style, Chapman investigates territory
that few have seen, let alone explored. He discusses the differences
between the artifact and the intellectual content the artifact holds,
and the impact on preservation decisions. He asserts that decisions on
what constitutes object integrity should be based on functional
characteristics as opposed to physical attributes. So much so that
"it must be acceptable for an 'authentic' copy to have an entirely
different look and feel from the source item." Going even further,
Chapman makes a reasoned statement that must surely drive Nicholson
Baker up the wall, "If the goal of preservation is persistent utility,
then functionality rather than aesthetics should drive system design."
- [40]RT

Fishman, Stephen. [41]The Public Domain: How to Find Copyright-Free
Writings, Music, Art & More. Berkeley, CA: [42]Nolo, 2001. ISBN
0-87337-433-9. - If you have tried to obtain the rights to digitize a
currently copyrighted work, you can easily understand why so many
digitization projects focus on public domain works instead. Forget
about the knotty technical problems involved in creating digital
libraries; the really tough problems involve intellectual property
rights issues. So, it should be easy to identify public domain
materials to avoid these problems, right? Well, maybe not. How about a
photograph of a drawing? The photograph may be in the public domain,
but the drawing may not be. What happens if a work is in the public
domain in the U.S., but not in another country? Was the copyright of a
foreign work that had been in the public domain in the U.S. prior to
1996 restored by the GATT treaty? What you need to sort out these
issues is a book, written by a knowledgeable attorney, that provides
detailed background information about the public domain and discusses
specific problems associated with different types of materials (e.g.,
artworks, architectural documents, choreographic works, databases,
films, maps, sheet music, sound recordings, television programs,
photographs, software, and written works). Stephen Fishman has written
such a book, and, like other Nolo publications, you don't need to have
a law degree to understand it. - [43]CB

Guevin, Carole. [44]"Visual Architecture: The Rule Of Three."
[45]Digital Web Magazine (April 10, 2001)
(http://www.digital-web.com/features/feature_2001-4.shtml). - Been
burned by numbers lately? Are all the "rules" of Ten or Seven or Three
starting to add up to numeric overload? If so, don't let this prevent
you from having a look at [46]Visual Architecture: The Rule of Three
by Montreal-based designer Carole Guevin, which appeared recently in
[47]Digital Web. This short yet effectively illustrated article
focuses on how meaning is conveyed through visual representation and
through the arrangement of objects in print or on a web page. The
author notes that as users rely more on scanning rather than on
thoroughly reading a page to ascertain its value, the visual cues
provided by designers become proportionally more important. - [48]LRK

Katz, Richard N. [49]"Archimedes' Lever and Collaboration: An
Interview with Ira Fuchs" [50]EDUCAUSEreview 36(2) (March/April 2001):
16-20 (http://www.educause.edu/ir/library/pdf/erm0120.pdf). - Most
people have a pretty good idea about why they're in higher education
but for those plagued by doubts or for those who just need something
convenient to point the in-laws to, help is on the way in the form of
this interview. The interview gives Fuchs, vice president for Research
IT at the Mellon Foundation, an opportunity to discuss his views on
the current and future role of information technology in higher
education. Fuchs argues that the ability to openly collaborate and to
share information is one of the chief strengths of not-for-profit
institutions and that these institutions can use this strength as a
lever like Archimedes of yore to "move the earth". - [51]LRK

Marsan, Carolyn Duffy. [52]"Faster 'Net Growth Rate Raises Fears About
Routers" [53]NetworkWorldFusion (April 2, 2001)
(http://www.nwfusion.com/news/2001/0402routing.html). - Geek pundits
periodically fret about the demise of the Internet; every so often, we
read somewhere that the whole works is going to implode, a victim of
its own staggering growth rate. This article directs your attention to
"an obscure statistic that indicates the 'Net is growing -- in size
and complexity -- at a faster rate than today's routers can handle."
That statistic is the number of entries in the Internet backbone's
routing table; routing table size and traffic are key indicators of
overall Internet health. Over the past six months, "the size of the
routing table and traffic in it exploded," and the necessity for
frequent updates by network managers has created infrastructure
instability. Much of this activity upsurge can be attributed to
"multihoming on corporate networks" -- where a single Internet server
may be connected to two or more ISPs "for improved reliability and
redundancy." And this means...? Large companies may need to up their
spending for more powerful network gear. Routing information may be
much slower to propagate across the Internet. And ultimately, the
Internet Engineering Task Force may have to hammer out a new routing
framework. - [54]SK

Thelwall, M. [55]"The Responsiveness of Search Engine Indexes"
[56]Cybermetrics 5(1), paper 1 (2001)
(http://www.cindoc.csic.es/cybermetrics/articles/v5i1p1.html)
([57]HTML) and
(http://www.cindoc.csic.es/cybermetrics/articles/v5i1p1.pdf)
([58]PDF). - Cybermetrics (ISSN 1137-5019) is subtitled International
Journal of Scientometrics, Informetrics, and Bibliometrics. This
web-only journal is "devoted to the study of the quantitative analysis
of scholarly and scientific communications." As such, commonplace
topics such as the strengths and weaknesses of search engines are
given scholarly treatment and are subject to review before
publication. Given that search engines are a significant tool in
mining the web for information, it is important to understand how
search engines select the URLs for inclusion in their respective
databases. There are three primary methods: 1. yield of URLs from
crawling the web; 2. extraction of links from authoritative web pages
(i.e., whom do they link to); and 3. the submission of URLs by website
owners. Most search engines employ one or several of the above
techniques. However, another important method is the examination of
the quality, reliability and quantity of sites that link to a given
site. This article details an experiment undertaken to determine
whether the quantity of links to a site will affect the likelihood of
its inclusion in search engine databases. The methodology employed to
obtain data is described. The search engines selected for the
comparison are AltaVista, HotBot (which uses the Inktomi spider), and
Yahoo (which switched from the Inktomi spider to Google). Google follows links to
sites that it spiders, and is thus fairly responsive to the existence
of new sites. However, the algorithms used by most search engines to
add and/or delete sites are proprietary secrets. The author concludes
that because of varying spider algorithms, no one search engine is all
inclusive. In order to retrieve the most comprehensive resource yield,
several search engines must be consulted. Furthermore, due to the lack
of knowledge about proprietary indexing criteria, it is a good idea to
manually submit new site URLs to multiple search engines. - [59]MG

United States General Accounting Office. [60]Electronic Dissemination
of Government Publications (GAO-01-428) March, 2001
(http://www.gao.gov/new.items/d01428.pdf). - This GAO report
represents the latest government efforts to deal with a basic problem:
the fragmentation of the federal government publication system, which
formerly functioned as a comprehensive method for getting government
information to the public but which, since the rise of digitization,
has been beset by a loss of control over how publications are disseminated.
The advantages of online access to public documents are obvious, but
serious questions remain about archiving and the accessibility of
print versions for the unwired. Unfortunately, the GAO report is less
about electronic dissemination than it is about bureaucratic
reorganization; specifically, the proposal to transfer responsibility
for the Depository Library Program from the Government Printing Office
to the Library of Congress. This isn't just negligible administrivia,
though, because reading this report, and particularly its appendices,
gives a good picture of the status of the depository system and the
current state of the debate. And now that I've whetted your appetite for more government
information policy, check out the U.S. National Commission on
Libraries and Information Science report, [61]"A Comprehensive
Assessment of Public Information Dissemination,"
(http://www.nclis.gov/govt/assess/assess.html) which creates a much
bigger context for the many factors involved. - JR
_________________________________________________________________

Current Cites 12(4) (April 2001) ISSN: 1060-2356
Copyright © 2001 by the Regents of the University of California All
rights reserved.

Copying is permitted for noncommercial use by computerized bulletin
board/conference systems, individual scholars, and libraries.
Libraries are authorized to add the journal to their collections at no
cost. This message must appear on copied material. All commercial use
requires permission from the editor. All product names are trademarks
or registered trade marks of their respective holders. Mention of a
product in this publication does not necessarily imply endorsement of
the product. To subscribe to the Current Cites distribution list, send
the message "sub cites [your name]" to
[62]listserv@library.berkeley.edu, replacing "[your name]" with your
name. To unsubscribe, send the message "unsub cites" to the same
address.

References

1. http://sunsite.berkeley.edu/cgi-bin/imagemap/cc
2. http://escholarship.cdlib.org/rtennant/
3. http://info.lib.uh.edu/cwb/bailey.htm
4. http://www.cam.org/~mgross/mgross.htm
5. http://web.tampabay.rr.com/hooboy/
6. http://patachon.com/
7. http://www.lib.ncsu.edu/staff/morgan/
8. http://escholarship.cdlib.org/rtennant/
9. http://www.publiclibraryofscience.org/
10. http://www.nature.com/nature/debates/e-access/
11. http://www.nature.com/
12. http://escholarship.cdlib.org/rtennant/
13. http://www.sciam.com/explorations/2001/042301publish/
14. http://www.sciam.com/
15. http://www.publiclibraryofscience.org/
16. http://escholarship.cdlib.org/rtennant/
17. http://www.sciencemag.org/cgi/content/full/291/5512/2318a
18. http://www.sciencemag.org/
19. http://www.sciencemag.org/cgi/content/full/291/5512/2318b
20. http://www.sciencemag.org/
21. http://www.publiclibraryofscience.org/
22. http://patachon.com/
23. http://escholarship.cdlib.org/rtennant/
24. http://www.press.umich.edu/jep/06-03/anderson.html
25. http://www.press.umich.edu/jep/
26. http://www.pediatrics.org/
27. http://info.lib.uh.edu/cwb/bailey.htm
28. http://www.darwinmag.com/read/040101/badthings_content.html
29. http://www.darwinmag.com/
30. http://web.tampabay.rr.com/hooboy/
31. http://www.scientificamerican.com/2001/0501issue/0501berners-lee.html
32. http://www.scientificamerican.com/
33. http://www.w3.org/RDF/
34. http://www.lib.ncsu.edu/staff/morgan/
35. http://www.infotoday.com/cilmag/apr01/broughton.htm
36. http://www.infotoday.com/cilmag/ciltop.htm
37. http://www.bgsu.edu/colleges/library/
38. http://www.humanclick.com/
39. http://www.virtualreference.net/virtual/
40. http://escholarship.cdlib.org/rtennant/
41. http://www.nolo.com/product/publ/summary_publ.html?t=02590008203202000SubcategoryID*82
42. http://www.nolo.com/
43. http://info.lib.uh.edu/cwb/bailey.htm
44. http://www.digital-web.com/features/feature_2001-4.shtml
45. http://www.digital-web.com/
46. http://www.digital-web.com/features/feature_2001-4.shtml
47. http://www.digital-web.com/
48. http://patachon.com/
49. http://www.educause.edu/ir/library/pdf/erm0120.pdf
50. http://www.educause.edu/pub/er/erm.html
51. http://patachon.com/
52. http://www.nwfusion.com/news/2001/0402routing.html
53. http://www.nwfusion.com/
54. http://web.tampabay.rr.com/hooboy/
55. http://www.cindoc.csic.es/cybermetrics/articles/v5i1p1.html
56. http://www.cindoc.csic.es/cybermetrics/
57. http://www.cindoc.csic.es/cybermetrics/articles/v5i1p1.html
58. http://www.cindoc.csic.es/cybermetrics/articles/v5i1p1.pdf
59. http://www.cam.org/~mgross/mgross.htm
60. http://www.gao.gov/new.items/d01428.pdf
61. http://www.nclis.gov/govt/assess/assess.html
62. mailto:listserv@library.berkeley.edu
