Volume 12, Issue 27 Atari Online News, Etc. July 2, 2010


Published and Copyright (c) 1999 - 2010
All Rights Reserved

Atari Online News, Etc.
A-ONE Online Magazine
Dana P. Jacobson, Publisher/Managing Editor
Joseph Mirando, Managing Editor
Rob Mahlert, Associate Editor


Atari Online News, Etc. Staff

Dana P. Jacobson -- Editor
Joe Mirando -- "People Are Talking"
Michael Burkley -- "Unabashed Atariophile"
Albert Dayes -- "CC: Classic Chips"
Rob Mahlert -- Web site
Thomas J. Andrews -- "Keeper of the Flame"


With Contributions by:

Fred Horvat



To subscribe to A-ONE, change e-mail addresses, or unsubscribe,
log on to our website at: www.atarinews.org
and click on "Subscriptions".
OR subscribe to A-ONE by sending a message to: dpj@atarinews.org
and your address will be added to the distribution list.
To unsubscribe from A-ONE, send the following: Unsubscribe A-ONE
Please make sure that you send it from the same address that you used
to subscribe.

To download A-ONE, set your browser bookmarks to one of the
following sites:

http://people.delphiforums.com/dpj/a-one.htm
Now available:
http://www.atarinews.org


Visit the Atari Advantage Forum on Delphi!
http://forums.delphiforums.com/atari/



=~=~=~=



A-ONE #1227 07/02/10

~ IBM Endorses Firefox! ~ People Are Talking! ~ RIAA Is Outraged!
~ Google Bowing to China ~ New XP 0-Day Attack! ~ HEO Act of 2008!
~ Big Seagate 3TB Drive! ~ Which ISP Is Fastest? ~ Windows 8 Rumors!
~ Chrome Passes Safari! ~ Dell's Linux Backtrack ~ Sex.com Is on Sale!

-* The Great Atari Rollercoaster *-
-* Obama's Twitter Hacker Is Convicted *-
-* "Internet Kill Switch" Bill Moves Forward! *-



=~=~=~=



->From the Editor's Keyboard "Saying it like it is!"
""""""""""""""""""""""""""



I know that last week I mentioned I had some ideas floating around in my
head that I wanted to talk about this week. But, it's been another one of
those long and tiring weeks; and I don't want to try and rush some comments
out just for the sake of discussing them without much thought. So, we'll
table those ideas for another week. Hey, it's a long holiday weekend, and
most of you probably have plans anyway this weekend. For me, I'm hoping to
be able to enjoy some holiday activities, but those will probably be limited
to a little grilling and a beer or three; my work schedule is unaffected by
the holiday! Oh well...

Until next time...



=~=~=~=



PEOPLE ARE TALKING
compiled by Joe Mirando
joe@atarinews.org



[Editor's note: Due to hardware issues, there will not be a column for this
week's issue. Stay tuned for next week's issue!]



=~=~=~=



->In This Week's Gaming Section - Don't Expect 3DS Until 2011!
""""""""""""""""""""""""""""" Could 3D Video Gaming Wreck Your Eyes?
The Great Atari Rollercoaster!




=~=~=~=



->A-ONE's Game Console Industry News - The Latest Gaming News!
""""""""""""""""""""""""""""""""""



Nintendo: Don't Expect 3DS Until 2011


Putting a knife through the heart of gamers expecting Nintendo's handheld
3D refresh sometime this year, Nintendo president Reggie Fils-Aime now
says the high-end gizmo won't see release until next year.

Fils-Aime must have been hoping no one would notice. He made the
announcement on Friday's Late Night with Jimmy Fallon, after playing a bit
of Donkey Kong Country Returns (for Wii, out later this year).

Alright, technically speaking, Nintendo said the 3DS would ship by the end
of March 2011, so the announcement's not that surprising. Blame the grapevine
for hyping the other direction: In April this year, CVG claimed "industry
sources" had confirmed the system would launch "in advance of Christmas
this year," pegging the month window to October, and leading some to
speculate about an October 10, 2010 (10-10-10!) date.

The 3DS carries Nintendo's hopes of dominating the upcoming handheld
wars with players like Sony and Apple. It features a 3.53-inch
widescreen along its interior top half against a 3.02-inch touchscreen
along its interior bottom half, both motion and gyroscopic sensors,
802.11n wireless, an analog thumb-nub, three cameras (one inside, two
facing out for 3D picture-taking), a much faster processor for higher
quality graphics, and the headliner: True stereoscopic 3D without the
need for special eyewear.

Why would Nintendo miss the holiday window? I'd guess it's the screen
technology and nascent manufacturing processes. Instead of debuting
globally and risking months of consumer-exasperating supply constraints,
the company's probably planning a Japanese holiday debut, with the rest
of us taking our turn next year.



Could 3D Video Gaming Wreck Your Eyes?


A cautionary editorial at Audioholics expands on industry warnings that 3D
gaming could well be hazardous for kids under seven.

Wait...what industry warnings?

In the wake of Nintendo's 3DS announcement at E3, company president Reggie
Fils-Aime said the company would advise children under seven not to view
games in 3D.

"We will recommend that very young children not look at 3D images," he
said. "That's because, [in] young children, the muscles for the eyes are
not fully formed."

"This is the same messaging that the industry is putting out with 3D
movies, so it is a standard protocol."

Speaking of, Samsung just released its own bullet list of warnings
to accompany its new line of 3D TVs, advising that "parents should
monitor and ask their children about...symptoms as children and teenagers
may be more likely to experience these symptoms than adults."

"Viewing in 3D mode may also cause motion sickness, perceptual after
effects, disorientation, eye strain, and decreased postural stability,"
continues Samsung. "It is recommended that users take frequent breaks to
lessen the likelihood of these effects. If you have any of the above
symptoms, immediately discontinue use of this device and do not resume
until the symptoms have subsided."

Audioholics ratchets up the rhetoric, saying those warnings "come after
years of industry spin and cover ups," and that "the truth is that
prolonged viewing of 3D video may be even more harmful than the consumer
electronics industry wants you to know."

The problem, they claim, involves stereopsis and strabismus. The former is
how we see the world in 3D: two eyes out front that take in slightly
different images to provide depth perception. The latter is what happens
when both eyes don't focus properly on the same point in space, throwing
off binocular vision and goofing up depth perception, or leading to
amblyopia, aka "lazy eye," a condition in which the brain favors one eye
over another, leading to poor vision in the "weak" eye.

Next: "It's never too late to learn bad habits that could create visual
problems."

According to Optometrists Network, "a network of interconnected patient
education and optometric web sites which educate the public about visual
health," two Harvard Nobel laureates in the early 1960s identified a
"critical period" - up to age seven - during which brains are still
"learning" stereopsis. While recent neuroplasticity research suggests it's
"never to late" to improve strabismus, Audioholics surmises conversely
that it's "never too late to /learn/ bad habits that could create visual
problems."

Resting their exhortation about the perils of 3D on reports of "nausea,
disorientation or postural instability, and visual symptoms" culled from
data collected by the Stanford Research Institute some 15 years ago for Sega's
VR headset, Audioholics advises "protecting yourself and your family by
using that new [3D] HDTV for standard 2D viewing a majority of your time."

So is Audioholics right, or just being alarmist?

It's hard to say without taking an equally indefensible position. We're
still in the nascent stages of mass 3D adoption. We're not (yet) viewing
images using iterations of stereoscopic technology for hours, days, and
weeks on end. We thus lack the necessary up-to-date research to say much
definitively at this point. Audioholics simply builds on what others
have already been saying, in turn built on dated 1990s research.

What we should do is demand more research, independently verified,
and presented unvarnished to the public. I want to know as much as
anyone else whether the eyestrain I felt watching a movie like Avatar in
3D could have long-term negative ramifications for my eyesight. I want to
know if my kids (when I have them) should steer clear of the technology
until they're older. I want to know what's scientific and what's
hearsay, and that's why we need independent researchers to step up and
tell us what they're seeing (no pun intended) as soon as possible.



=~=~=~=



->A-ONE Gaming Online - Online Users Growl & Purr!
"""""""""""""""""""



The Great Atari Rollercoaster


You wouldn't be gaming today without Nolan Bushnell's landmark vision.

Though recent history has contained a rollercoaster of highs and lows for
the Atari name - and that's all that remains of it, to be sure - there can
be no doubts about the epic impact of the company founded by Nolan Bushnell
and Ted Dabney on not just videogames, but entertainment and pop culture.
The one-time industry juggernaut, responsible for such highs as the
creation of Pong and the Atari 2600 (and great lows like the release of
E.T. for the 2600, a game that greased the slope for the crash of 1983),
celebrates its 38th anniversary today. IGN Retro looked back over the life
and career of Bushnell, but for the anniversary of the giant, it's fitting
to chart the trajectory of Atari over the last three decades. How did the
name once synonymous with videogames become only a front for a completely
unrelated publisher?

Atari was actually not the first name of Bushnell and Dabney's company.
Originally, Bushnell selected Syzygy, which is an astronomical term meaning
"alignment" in reference to celestial bodies. (It's also a Gnostic term,
but considering Bushnell's Mormon faith, it's highly unlikely he was
thinking of the pairing of aeons.) However, after discovering that the
name had already been taken in California, and the simple fact that Syzygy
is a nightmare to pronounce, Bushnell considered several terms from the
Japanese game Go. He chose "Atari," which translates to "target." It was a
fitting moniker for the newly formed company which was about to target
America with a new entertainment medium.

The story of Atari's first game, Pong, is legend within the industry. After
having seen Ralph Baer's Magnavox Odyssey and its tennis game, Bushnell and
his first engineer set about creating a similar game that instead of being
played at home like the Odyssey, would be a coin-operated arcade game.
(Magnavox later sued and settled with Bushnell over the similarities
between Pong and the Odyssey's tennis game.) The first Pong machine was
installed in a Sunnyvale bar named Andy Capp's. The game broke down the
first night it was in the bar. When the machine was opened up in the
morning to see what caused the crash, it overflowed with quarters. The game
was such a smash that customers overloaded the machine's coin collector and
in the process of trying to feed more into the machine, broke it.

Atari was now officially in the arcade game business. Future hits included
Battlezone and Breakout, the latter of which was prototyped by Steve Jobs.
(Yes, that Steve Jobs, now titan of Apple.) Though Atari's arcade business
was tremendous, Bushnell began thinking of the home market in 1975. By the
following year, Atari had created a functional home console design that
used cartridges to change games, but the cost of developing the Atari 2600
was too much for the still-small Atari to bear. Bushnell sold Atari to
Warner Communications to have the capital necessary to complete the 2600,
which was estimated to have a $100 million budget by the time it was ready
for market.

The Atari 2600 debuted in 1977 at $199, which is more than $700 when
adjusted for today's inflation. Though interest was high, it did not take
off like a rocket. Over time, though, once videogames proved to be more than
a passing cultural fad, the Atari 2600 picked up steam and was soon one
of the hottest selling consumer items in America, responsible for a third
of Warner's income. But the decision to sell Atari to maintain both its
vision and survival came back around to wound Bushnell. There are
conflicting accounts over what exactly transpired behind closed doors, but
Bushnell saw the future of the Atari 2600 differently from Warner and thus
departed in 1978.

Atari continued to rack up fantastic sales. In 1980, over two million Atari
2600 consoles were sold. The climb continued and by 1982, almost eight
million 2600s were in living rooms. The machine - and by extension Atari -
was a monster hit.

In addition to its successful 2600, Atari was also producing home computers,
developing another console called the 5200, and exploring a number of side
projects, such as the Atari Cosmos, a tabletop system that employed
holographic backdrops behind LEDs. (Collector alert: This is the Holy Grail.
Only two functional units are known to exist. And neither is for sale.)

However, Atari made a few slip-ups such as the release of a disappointing
home Pac-Man port and the aforementioned E.T. debacle that cost the company
tens of millions. But competition in the industry, a deluge of questionable
quality games, and a resulting price war soon culminated in the videogame
market crash of 1983. Atari itself barely survived the industry implosion
(which fully played out in 1984), saved only by Warner selling the home
division of Atari to Jack Tramiel, the founder of Commodore; the newly
spun-off company was renamed Atari Corporation.

At this critical juncture in Atari's lifecycle, it was given an incredible
opportunity that it turned down. In 1983, Nintendo actually approached
Atari, offering it the distribution rights of the Famicom/NES in America.
However, the deal fell apart over the rights to Donkey Kong, leaving
Nintendo itself to produce and distribute the NES in America in 1985. As if
the cratering of the videogame industry wasn't hard enough on Atari, the
industry's subsequent phoenix-like rebirth under Nintendo's dominance was a
death knell. But it did not play out right away.

Despite refusing the NES, under Tramiel Atari actually managed to turn a
profit, thanks to decent sales for bargain 2600 consoles, the 7800 console,
and the success of two Atari home computers: the Atari ST and the Atari XE.
(The Atari ST was a bigger hit in Europe than in America.) Other Atari
systems produced in this period included the Falcon personal computer and
the Portfolio, the first palmtop PC. The Tramiels -
both Jack and his son Sam - were colorful figures in the interactive
industry. As evangelists, they helped Atari gain enormous amounts of
attention, such as during the lead-up to the Jaguar (1993) and the handheld
Lynx (1989).

Of the two systems - which were the last pieces of hardware ever released
under the Atari name - the Lynx was the one with a decent chance at
survival. The Lynx was the world's first color LCD handheld. It was
considerably more powerful than Nintendo's Game Boy. However, due to the
inability to produce enough units to have a nationwide launch in 1989, the
Lynx failed to really challenge the Game Boy and the white-hot Tetris
pack-in game. When the machine did go wide in 1990, it was then unable to
sell enough units (even after dropping to only $99 in late 1990) to
maintain third-party support.

The Jaguar, on the other hand, was a complete disaster. It was underpowered
compared to the industry's new darling, the PlayStation, and was
exceedingly difficult to program for. On top of this, the Jaguar just
looked unattractive. Sam Tramiel's interview with Next Generation magazine
about the soon-to-fail Jaguar remains a favorite; in it, he declared the
Saturn was "a pooch" and that Atari would be pursuing Sony for alleged
"dumping" if it tried to sell the PlayStation for less than $500. (There
was no suit when the PlayStation debuted in 1995 for $299.) The Jaguar never
sold more than 500,000 units in its relatively short lifecycle and by 1996,
Atari was out of the hardware business.

The Tramiels abandoned Atari to JTS, a hard disk manufacturer, which did
nothing but maintain Atari holdings. Two years later, JTS sold the Atari
brand and catalog to Hasbro Interactive for only $5 million, a mere fraction
of the company's one-time worth. Hasbro mined some of Atari's history, such
as Pong, during a retro-revival phase around the turn of the millennium.
However, Hasbro Interactive was a short-lived success story, eventually
folding in early 2001 when Infogrames bought the division for $100 million.

In 2001, Infogrames chose to make an aggressive play to expand its
worldwide footprint in videogames, and so it used the new Atari name, which
still possessed universal cachet, to cater to a new audience. Whether or not
the decision helped is debatable, as Infogrames' high-profile titles like
Enter the Matrix, the Dragonball Z series, and Driver 3 had built-in appeal
on their own regardless of the publisher's name.

However, in the last decade, Infogrames/Atari's fortunes were uneven and
the company ran into financial hardship on numerous occasions. By late 2007,
Atari's American operations completely abandoned videogame development in
favor of concentrating on publishing. Close followers of videogames, though,
saw potential for Atari's survival rise when Phil Harrison, then president
of Sony Computer Entertainment and a real face for the PlayStation,
abruptly resigned from Sony in February 2008 and quickly joined Atari as
its general director. Harrison was unable to right the ship, though, and
after less than two years of service, resigned from Atari in April of 2010.

Harrison's departure was unexpected, but perhaps not as much as the newest
member of the Atari Board of Directors: Nolan Bushnell. Bushnell joined on
the same day that Harrison resigned. Within the last few months, Atari has
made moves to get back into the game, including Star Trek Online and the
upcoming Test Drive Unlimited 2.

The name Atari is no longer synonymous with videogames - that title is now
probably shared between Nintendo and PlayStation. But the industry would
look nothing like it does now without the brainchild of Bushnell and
Dabney. As you chart its historical course, you can imagine how things
might be different if alternate decisions were made at crucial points. What
if Bushnell had stayed? What if Atari had exercised serious quality control
over the 2600? What if the Lynx had successfully challenged the Game Boy?
But there are no "do-overs," not even in videogames. And so Atari - the
company, not the brand - remains a fascinating story of decline and a
cautionary tale for all.



=~=~=~=



A-ONE's Headline News
The Latest in Computer Technology News
Compiled by: Dana P. Jacobson



Senate 'Internet Kill Switch' Bill Moves Forward


A Senate committee on Thursday approved a cyber-security bill that has
prompted concern about a presidential "Internet kill switch."

The Homeland Security and Governmental Affairs Committee unanimously
approved the Protecting Cyberspace as a National Asset Act of 2010 (S.
3480). It now moves to the Senate floor for a full vote.

The bill is an over-arching cyber-security measure, which would, among
other things, create an office of cyberspace policy within the White House,
which would be led by a Senate-confirmed director. It would also create a
new center within the Homeland Security Department, which would implement
cyber-security policies.

A provision that got the most attention, however, was one that gave the
president the power to "authorize emergency measures to protect the
nation's most critical infrastructure if a cyber vulnerability is being
exploited or is about to be exploited."

Though the language is somewhat vague, this section was interpreted by
many as giving the president an "Internet kill switch" that would
effectively allow him to "turn off" the Web in an emergency.

"While the bill makes it clear that it does not authorize electronic
surveillance beyond that authorized in current law, we are concerned that
the emergency actions that could be compelled could include shutting down
or limiting Internet communications that might be carried over covered
critical infrastructure systems," several privacy groups, including the
American Civil Liberties Union (ACLU), wrote in a Wednesday letter to the
committee.

The bill should be amended to describe exactly what actions the government
can take, the groups said.

Sen. Joe Lieberman of Connecticut, the bill's sponsor, dismissed the
"Internet kill switch" assertion as "misinformation" during a Sunday
appearance on CNN, and the committee on Wednesday published a "myth vs.
reality" fact sheet on the bill.

Current law already provides the president with broad authority to take
over communications networks, the committee said, pointing to Section 706
of the Communications Act. That portion gives the president authority to
"cause the closing of any facility or station for wire communication" and
"authorize the use of control of any such facility or station" by the
federal government. This can be done if a state of war or threat of war
exists; it does not require advance notification to Congress, and it can
continue for up to six months after the threat expires.

This bill, the committee said, "would bring presidential authority to
respond to a major cyber attack into the 21st century by providing a
precise, targeted, and focused way for the president to defend our most
sensitive infrastructure."

Specifically, the bill would give the president 30 days to respond to a
threat, require that he notify Congress beforehand, and demand that he
use the "least disruptive means feasible" to do so. The committee denied
that it lets him "take over" the Web, and said it does not provide any
new surveillance authorities. Owners of private networks would be able
to propose alternative responses to a given threat.

Lieberman's bill "authorizes only the identification of particular systems
or assets - not whole companies, and certainly not the entire Internet,"
the committee said. "Only specific systems or assets whose disruption would
cause a national or regional catastrophe would be subject to the bill's
mandatory security requirements."

A catastrophe would include mass casualties, severe economic impact, mass
prolonged evacuations, or severe degradation of national security
capabilities.

The committee's fact sheet also denied that the bill gives the president
the authority to conduct e-surveillance and monitor private networks or
regulate the Internet.

"Catastrophic cyber attack is no longer a fantasy or a fiction," Lieberman
said in a Thursday statement. "It is a clear and present danger. This
legislation would fundamentally reshape the way the federal government
defends America's cyberspace. It takes a comprehensive, risk-based, and
collaborative approach to addressing critical vulnerabilities in our own
defenses. We believe our bill would go a long way toward improving the
security of our government and private critical infrastructure, and
therefore the security of the American people."

Rep. Jane Harman, a California Democrat, has introduced a House version
of the bill, H.R. 5548, but it has not yet passed committee.

In May 2009, President Obama designated cyber-security as a national
security priority. Seven months later, he appointed Howard Schmidt, a former
eBay and Microsoft executive, as cyber-security coordinator.



RIAA Outraged by YouTube-Viacom Decision


The Recording Industry Association of America (RIAA) on Monday voiced its
opposition to the recent decision in the YouTube-Viacom copyright
infringement case.

"We believe that the district court's dangerously expansive reading of the
liability immunity provisions of the [Digital Millennium Copyright Act]
DMCA upsets the careful balance struck within the law and is bad public
policy," Cary Sherman, RIAA president, wrote in a blog post. "It will
actually discourage service providers from taking steps to minimize the
illegal exchange of copyrighted works on their sites."

Last week, a New York District Court ruled that the posting of
Viacom-owned content by YouTube users on the Google-owned video site did
not constitute copyright infringement because YouTube removed the offending
content as quickly as possible after being alerted to its existence.

The ruling came three years after Viacom filed its $1 billion infringement
case against YouTube.

"The present case shows that the DMCA notification regime works
effectively," the court said, pointing to the fact that YouTube removed
10,000 videos at Viacom's request in one day. "General knowledge that
infringement is 'ubiquitous' does not impose a duty on the service provider
to monitor or search its service for infringements."

Under the DMCA, if a copyright holder finds an infringing piece of content
on a site like YouTube, they can issue a takedown notice. YouTube will pull
the video while it investigates. If the content infringes, it remains down.
If it does not, YouTube will put it back up.

The RIAA argued, however, that sites like YouTube are not doing enough.

"As the White House recently noted in its strategic plan to combat
intellectual property theft, it is essential for service providers and
intermediaries generally to work collaboratively with content owners to
seek practical and efficient solutions to address infringement," Sherman
wrote. "We need businesses to be more proactive in addressing infringement,
not less."

Viacom has said it will appeal.

"We expect the Court of Appeals will better understand the balance Congress
struck when it enacted the DMCA," Sherman said.



New Rules Bring Online Piracy Fight to US Campuses


Starting this month, colleges and universities that don't do enough to
combat the illegal swapping of "Avatar" or Lady Gaga over their computer
networks put themselves at risk of losing federal funding.

A provision of the Higher Education Opportunity Act of 2008 is making
schools a reluctant ally in the entertainment industry's campaign to
stamp out unauthorized distribution of copyrighted music, movies and TV
shows.

Colleges and universities must put in place plans "to effectively combat
the unauthorized distribution of copyrighted material by users of the
institution's network" without hampering legitimate educational and
research use, according to regulations that went into effect Thursday.

That means goodbye to peer-to-peer file-sharing on a few campuses - with
exceptions for gamers or open-source software junkies - gentle warnings
on others, and extensive education programs everywhere else.

Despite initial angst about invading students' privacy and doing the
entertainment industry's dirty work, college and university officials
are largely satisfied with regulations that call for steps many of them
put in place years ago.

But whether the investment of time and money will make a dent in digital
piracy is uncertain.

"If the university is going to prohibit underage drinking, I think it
ought to prohibit anything on the Internet that's illegal, too," said
Alicia Richardson, an Illinois State University junior who applauds her
school's restrictive policies on file-sharing. "I'm not going to mess
with it. I know the consequences."

Among other things, schools must educate their campus communities on the
issue and offer legal alternatives to downloading "to the extent
practicable."

Colleges and universities that don't comply risk losing their
eligibility for federal student aid.

Many colleges worried they would be asked to monitor or block content.
But the provision gives schools a great deal of flexibility, as
long as they use at least one "technology-based deterrent."

Their options include taking steps to limit how much bandwidth can be
consumed by peer-to-peer networking, monitoring traffic, using a
commercial product to reduce or block illegal file sharing or
"vigorously" responding to copyright infringement notices from copyright
holders.

Almost all campuses already manage bandwidth or vigorously process
infringement, or "takedown," notices, said Steven Worona, director of
policy and networking programs for Educause, a higher education tech
advocacy group.

While the recording industry has backed off its strategy of suing illegal
file-sharers, it still sends infringement notices to colleges - a shot
across the bow that urges users to delete and disable computer access to
unauthorized music to avoid legal action.

"The problem campuses have is that commercial network providers are not
doing anything to limit the amount of infringement on their networks or
educate their customers about copyright law," Worona said. "Every fall,
a new cadre of students arrives on campuses who have been engaging in
infringing activity since the third grade."

Since October 2008, the Recording Industry Association of America said
it has sent 1.8 million infringement notices to commercial internet
service providers - and 269,609 to colleges and universities.

RIAA, which represents the major music labels, stressed that the numbers
don't necessarily reflect piracy trends, but rather the group's ability
to detect it.

College officials argue notices are a flawed measure of illegal activity
because it's up to copyright holders whether to send them and because false
positives are possible.

RIAA president Cary Sherman said the group can't say whether campus
programs are putting a dent in piracy. But he said the threat of a
gradually tougher response to repeat violations is working, pointing to
the University of California, Los Angeles, as one example.

"We think we're beginning to get to a scale now where it actually can
make a difference," he said.

UCLA has developed a system that notifies users by e-mail when the
school receives a copyright infringement notice, setting into motion a
process that includes a "quarantine" on the computer's Internet access
and the student's attendance at an educational workshop. Repeat
offenders typically face one-semester suspensions.

Since the workshops started, repeat offenders have virtually disappeared,
said Kenn Heller, assistant dean of students. Earlier this year, UCLA also
struck a partnership with Clicker Media Inc. to make both
university-produced videos and network TV shows, music videos and movies
available through its undergraduate student Internet portal.

The Motion Picture Association of America, which also pressed for the
legislation, is encouraged by what campuses are doing but it's too early
to tell whether it will curb piracy, spokeswoman Elizabeth Kaltman said.

Few campuses have gone as far as Illinois State, which raised eyebrows
by seeking and accepting entertainment industry money to underwrite a
now-abandoned research project on digital piracy.

The university also blocked all peer-to-peer activity in residence halls
and on wireless access points, said Mark Walbert, Illinois State's chief
technology officer. Students who use the technology for legal means - like
downloading open-source software such as Linux or World of Warcraft game
updates - can get exceptions.

For students seeking legal download options, the school developed
BirdTrax, a Web page with links to free movie and music streaming
websites such as Hulu and Pandora.

In 2007, the University of Michigan took a different approach, launching
a campus initiative called "BAYU," which stands for "Be Aware You're
Uploading." At little cost, the school developed a software program that
automatically notifies users of university networks when they are
uploading, or sharing files from their computer with users elsewhere.

The university does not look at what is being shared, and notices go out
regardless of whether the activity is legal or illegal, said Jack
Bernard, a university lawyer who devised the program, which Michigan
offers free to other schools.

As a result, the number of copyright infringement notices the university
receives has slowed to a trickle, he said.

"We think scare tactics and most technological means don't realize the
ends we want because technological means never seem to keep up with
people's ability to thwart them," Bernard said.

New technologies have made it more difficult to assess how much
enforcement has affected piracy, said Joe Fleischer, chief marketing
officer for tracking firm BigChampagne Media Measurement.

File-hosting services such as RapidShare store infringing content on
distant servers, meaning uploaders' identities are difficult to track.
Websites that share links to those files are searchable through Google.

"It's a much more complicated battle than it was five years ago because
so many new modes of infringement are emerging," Fleischer said.



Google Scrambles To Save Internet License in China


China is threatening to revoke Google's business license over the
company's decision to redirect Chinese traffic to computers in Hong Kong
that are not governed by the communist government's censorship practices.

The latest skirmish between Beijing and the Internet search leader
threatens to cripple the company in one of the Web's biggest markets.

Google agreed Tuesday to dismantle the virtual bridge to its Hong Kong site
that was created in March, but it was unclear whether that will be
enough to stay in business in China. The license is required for the
company to continue providing its mapping and music services in China.

Google hopes to keep its license by turning its Chinese website into a
so-called "landing page" anchored by a link that users must click on to
send visitors to the Hong Kong search service. The company has no plans
to revert back to its previous practice of omitting search results that
the Chinese government considers subversive or pornographic.

"This new approach is consistent with our commitment not to self-censor
and, we believe, with local law," David Drummond, Google's top lawyer,
wrote in a blog post.

A foreign ministry spokesman, Qin Gang, said he had not seen Google's
announcement and could not comment on it. However, he added, "I would
like to stress that the Chinese government encourages foreign
enterprises to operate in China according to law."

The impasse could drag on for months, analysts predicted, as both Google
and the Chinese government jostle in a heavyweight wrestling match
unfolding on an international stage.

Google Inc. announced in January that it would no longer comply with
Chinese censorship after being hit by a hacking attack traced to China.
The high-profile challenge irritated Chinese leaders, even though they
want foreign companies to help develop the country's technology industry.

Google met a Wednesday deadline to apply to renew its Internet license
in China. It's not clear how long the Chinese government will take to
review the application, but BGC Financial analyst Colin Gillis expects
the company "to twist in the wind for a while."

Google's uncertain fate in China could become a distraction for
management, but it's one that is probably worth the trouble, said
Gartner Inc. analyst Whit Andrews.

That's because China already has about 400 million people online, making
it the world's largest Internet market, and that figure is expected to
steadily grow for decades to come.

"Google knows its shareholders think it's important to be in China, and
a lot of its future value is riding on that," Andrews said. And China's
government knows it has to flex its muscle because "if it looks like
Google is running the show, it could affect their power."

China has not produced a big windfall for Google yet, partly because
it's one of the few markets where the company's search engine is not the
most popular. (The homegrown Baidu.com holds a 60 percent share compared
with about 30 percent for Google.)

Analysts estimate Google gets $250 million to $600 million in annual
revenue from China, or about 1 percent to 2 percent of its total revenue.

Even if Chinese regulators approve Google's new navigation tool, the
added click to reach Hong Kong could still drive away some users.

If that were to happen, "then advertisers will panic and cut spending,"
said Edward Yu, president of Analysys International, an Internet
research firm in Beijing.

Google could still remain in China even if the government pulls the plug
on its website in that country. The company has indicated it would like
to retain its engineering staff in China to take advantage of the
country's technology talent and to maintain a sales force that also
sells ads to Chinese businesses trying to reach customers outside the
country.

If Google.cn is shut down, mainland Chinese users could still reach
Google's services by manually typing in the address of the Hong Kong site.
But China's government could also use its own technology tools, sometimes
called a "Great Firewall," to prevent its citizens from connecting to
Google's sites outside the country.

The Mountain View, Calif.-based company launched its China-based site in
2006 after Chinese government filters blocked many users from reaching
the company's U.S. site.



Frenchman Convicted for Hacking Obama's Twitter


A court in central France has convicted a young Frenchman accused of
infiltrating Twitter and peeping at the account of President Barack Obama,
and given him a five-month suspended prison sentence.

The lawyer for Francois Cousteix, whose online name was Hacker Croll, said
his client was happy with Thursday evening's decision by the
Clermont-Ferrand court. He risked up to two years in prison and a 30,000
euro fine for breaking into a data system.

"The verdict is satisfying, given all the media pressure that built up,"
attorney Jean-Francois Canis said Friday by telephone.

Cousteix, 24, infiltrated Twitter, and the accounts of Obama and singers
Britney Spears and Lily Allen, among other celebrities, but maintained
that his motives were good - to warn Internet users about data security.

Prosecutors said Cousteix did not have access to sensitive information
about the American president.

Cousteix managed to break into the accounts by searching information
that is most commonly used for passwords, such as birth dates or pet
names, on social networking sites. He lives with his parents and has no
college degree, and has not had any special computer training.

After his arrest, Cousteix told France-3 television, "It's a message I
wanted to get out to Internet users, to show them that no system is
invulnerable."

Twitter said last July that it was the victim of a security breach.

The French prosecutor said Cousteix infiltrated the accounts of several
Twitter administrative employees. He was able to access information such
as contracts with partners and resumes from job applicants.

Hacker Croll e-mailed some of the documents to TechCrunch, a widely read
technology blog, and it subsequently published some of them, including
financial projections. Some of the material was more embarrassing than
damaging, like floor plans for new office space and a pitch for a
Twitter TV show.

It is not the first time this self-described well-intentioned hacker has
had issues with the law.

He was convicted last January and given an eight-month suspended prison
sentence for diverting money from a gambling web site in 2007, his lawyer
said. He used the money - between 3,000 and 4,000 euros - to invest
in computer equipment.

Following Thursday's conviction, Cousteix plans to return to his job at
Rentabiliweb, a webmarketing and micropayment web site, his lawyer said.
Cousteix got the job two months ago, partly due to the fame of his
online deeds.



Microsoft: 10,000 PCs Hit With New XP 0day Attack


Nearly a month after a Google engineer released details of a new Windows
XP flaw, criminals have dramatically ramped up online attacks that
leverage the bug.

Microsoft reported Wednesday that it has now logged more than 10,000
attacks. "At first, we only saw legitimate researchers testing innocuous
proof-of-concepts. Then, early on June 15th, the first real public
exploits emerged," Microsoft said in a blog posting. "Those initial
exploits were targeted and fairly limited. In the past week, however,
attacks have picked up."

The attacks, which are being launched from malicious Web pages, are
concentrated in the U.S., Russia, Portugal, Germany and Brazil,
Microsoft said.

PCs based in Russia and Portugal, in particular, are seeing a very high
concentration of these attacks, Microsoft said.

According to security vendor Symantec, these attacks peaked late last
week. "Symantec has seen increased activity around this vulnerability.
The increased activity started around June 21 and peaked around June 26
and 27," a company spokesman said via instant message Wednesday. Attacks
have leveled out since then, he added.

Criminals are using the attack code to download different malicious
programs, including viruses, Trojans and software called Obitel, which
simply downloads more malware, Microsoft said.

The flaw that's exploited in all of these attacks lies in the Windows
Help and Support Center software that comes with Windows XP. It was
disclosed on June 10 by Google researcher Tavis Ormandy. This Help
Center software also ships with Windows Server 2003, but that operating
system is apparently not vulnerable to the attack, Microsoft said.

Ormandy was criticized by some in the security community for not giving
Microsoft more time to patch the flaw, which he disclosed to the
software vendor on June 5. He released details of the bug five days
later, apparently after failing to convince Microsoft to fix the issue
within 60 days.

In a security advisory released June 10, Microsoft outlines several ways
to turn off the Windows Help Center Protocol (HCP).
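
As a small illustration only - not the advisory's own instructions - a
script along these lines could check whether the hcp:// protocol handler is
still registered on a Windows machine. The assumption that the handler
lives under HKEY_CLASSES_ROOT\HCP is ours, so treat this as a sketch, not
a verified test of the mitigation.

    # Illustrative sketch only: check whether the hcp:// protocol handler
    # is still registered. Assumes the handler lives under
    # HKEY_CLASSES_ROOT\HCP (our assumption, not text from the advisory).
    # Requires Windows and Python 3.
    import winreg

    def hcp_handler_registered() -> bool:
        try:
            key = winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, "HCP")
        except OSError:
            return False
        winreg.CloseKey(key)
        return True

    if __name__ == "__main__":
        if hcp_handler_registered():
            print("hcp:// handler registered; the workaround is not applied.")
        else:
            print("hcp:// handler not found; the workaround appears in place.")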

Microsoft's next set of security updates is due July 13.



Windows 8 Rumored Features: Your PC, Your Way


Microsoft has some big plans for the future of the PC if alleged Windows
8-related documents recently leaked online are correct. Nearly 18 documents
- supposedly confidential internal Microsoft plans - have surfaced,
including one called "Modern Form Factors" that shows that Microsoft may be
looking at three broad PC categories for the future: Lap PC, Workhorse PC,
and Family Hub PC.

Here's a quick look at each category from the document, and some of the
interesting things Microsoft may be thinking about. But keep in mind
while you're reading that these are just broad goals (if they are even
the real thing), and the document is more about Microsoft's general
vision than an actual product roadmap. Nevertheless, take a look and see
what you think.

The workhorse PC is essentially the PC you have today, but Microsoft
wants to add a facial recognition and proximity sensor feature called
"My PC Knows Me," according to the leaked documents.

The basic idea is that you walk into the room where your PC is, and the
proximity sensor detects your movements and wakes the PC.

As you sit down at the desk, your PC is already on so it can scan your
face to log you in.

The computer can also switch between different user accounts with the
facial recognition feature.

Once you've finished using the computer and leave the room, the
proximity sensor detects that no one is around, logs you off, and puts
the computer to sleep.
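
Purely to make that flow concrete, here is a hypothetical Python sketch of
the sensing loop described above. The sensor and recognizer objects are
invented placeholders, not any real Microsoft API.

    # Hypothetical sketch of the "My PC Knows Me" flow described above.
    # The sensor and recognizer interfaces are invented for illustration.

    class MyPCKnowsMe:
        def __init__(self, proximity_sensor, face_recognizer):
            self.proximity = proximity_sensor  # reports whether someone is near
            self.faces = face_recognizer       # maps a camera frame to a user
            self.current_user = None
            self.awake = False

        def tick(self):
            """One pass of the sensing loop."""
            if self.proximity.person_nearby():
                if not self.awake:
                    self.awake = True          # someone walked in: wake the PC
                user = self.faces.identify()   # scan the face at the desk
                if user and user != self.current_user:
                    self.current_user = user   # log in, or switch accounts
            elif self.awake:
                self.current_user = None       # nobody around: log off...
                self.awake = False             # ...and put the PC to sleep

    # Tiny stand-ins so the sketch actually runs:
    class FakeSensor:
        def person_nearby(self):
            return True

    class FakeRecognizer:
        def identify(self):
            return "dana"

    pc = MyPCKnowsMe(FakeSensor(), FakeRecognizer())
    pc.tick()
    print(pc.awake, pc.current_user)  # True dana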

The Lap PC is basically a tablet, according to the leaked document, and
pretty much describes what the iPad can do today. The Lap PC would
include an accelerometer used for gaming and to adjust screen
orientation between portrait and landscape modes. The Windows tablet
would also have ambient light detection, a touch-based interface,
virtual keyboard, and location-aware capability.

Tying into the Workhorse PC's "My PC Knows Me" feature, the Lap PC would
also have something called attention detection software, a feature the
iPad doesn't have. The scenario Microsoft describes has you using the PC
to play a video game when someone knocks on your bedroom door. You look
away to see who it is, and your computer uses the built-in Webcam to
detect that your eyes are no longer directed at the screen. So the PC
automatically pauses the game you were playing until your eyes come back
to the computer's display.

It's an interesting idea, although I'll admit the idea that my computer
would constantly be scanning my face has a bit of that creepy Sci-Fi
feel to it. Nevertheless, I'd be interested to see if this could work in
something like a tablet.

Microsoft is still hoping to place a Windows computer in the family
living room to act as the media hub for the entire house. Using the
Family Hub PC, you would be able to take a photo slideshow from your
laptop and transfer it to a larger screen display in your home office.
You could also select a saved or online video from the laptop and
transfer that to the TV downstairs.

All of this functionality would run through a device hub that could
direct content from the laptop to the TV in the living room or to the
computer monitor upstairs. The device hub would be able to send data not
only between your PCs and laptops, but your televisions, computer
monitors, and other peripheral displays and devices.

This concept is nothing new; you can achieve much of this functionality
already using a variety of devices and software like Windows Home Server,
Windows Media Center, Xbox 360, Apple TV, Internet-capable HDTVs, and the
forthcoming Google TV. The problem with the integrated home computing
concept is that it has never been easy for users to set up, a problem that
Microsoft would have to solve.

The purported leaked Windows 8 documents offer an interesting view into
what Microsoft is rumored to be thinking about for the future. How much
of this we'll actually see in real products is anybody's guess.

It's not entirely clear where these documents came from, but the forum
win7vista has apparently had them available for download since June 19.



Big Seagate 3TB Drive Ups Storage Ante


It feels like we've been stuck at 2TB forever. Not anymore: Seagate
announced it's shipping the industry's first 3TB hard drive, the
FreeAgent GoFlex Desk External Drive.

This news is significant because Seagate has figured out a workaround to
the long-standing constraint that has kept hard drive capacity maxed out
at 2TB. (The first 2TB hard drive, from Western Digital, debuted a
year-and-a-half ago.) Moreover, not only is the FreeAgent GoFlex Desk the
first to break past that limitation, it does so at a reasonable cost per
gigabyte: The drive, with a USB 2.0 connector, will sell for $250, which
works out to $0.08 per gigabyte. By comparison, Seagate sells its 2TB
GoFlex Desk External Drive (also with a USB 2.0 base) for $190, or $0.09
per gigabyte. As with other drives in the GoFlex line, you can swap out
the USB 2.0 base for optional USB 3.0 or FireWire 800 modules, which will
provide better performance.

That's a lot of storage for a single drive. While the company doesn't
specify its file parameters, it does say its 3TB drive can store up to 120
high-definition movies, 1,500 video games, thousands of photos, or
"countless" hours of digital music. Already, I'm thinking about how many
18-megapixel RAW images I can store on a single drive.

So what's kept capacity back all this time? According to Seagate engineers,
the 2TB limitation was neither an issue with the file system (Windows'
NTFS) nor with the Windows operating system itself. Rather, the issue lay
with the master boot record (MBR) partition table, contained in the first
sector of a hard disk drive. The partition table used with Windows XP and
earlier Microsoft operating systems was limited to just 2.2TB - which, a
decade-plus ago, seemed an unthinkably high number. The table uses 32-bit
numbers to represent the starting sector and number of sectors of a
partition, so it maxes out at 2.2TB (using 512-byte sector sizes).

Windows Vista and Windows 7 introduced a new partition table scheme, dubbed
GPT (for GUID Partition Table). The GPT blasted past the previous
limitations by supporting up to 8 zettabytes (2^64 sectors of 512 bytes
each). For perspective, consider that 1,024 terabytes = one petabyte; 1,024
petabytes = one exabyte; and 1,024 exabytes = one zettabyte. Windows Vista
and Windows 7 maintain backward
compatibility by also reading and writing MBR partitions.

Seagate's work-around for the 3TB drive is to make the MBR report a 4K
sector size to the operating system, in order to accommodate a larger drive
inside. As such, the drive can then work with Windows XP - still a major
factor in the marketplace - as well as with Windows Vista and Windows 7, and
with Mac OS and Linux (neither of which ever had to deal with this partition
table constraint to begin with).
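
For readers who want to check the arithmetic, here is a small Python sketch
(ours, not Seagate's) that reproduces the addressing limits discussed above,
assuming 32-bit sector numbers in MBR partition entries and 64-bit sector
numbers under GPT.

    # Partition-table addressing limits discussed above. Assumptions:
    # MBR entries hold 32-bit sector numbers, GPT holds 64-bit sector
    # numbers; capacities are shown in decimal units.

    def max_capacity(addressable_sectors, sector_size):
        """Largest capacity, in bytes, a scheme can address."""
        return addressable_sectors * sector_size

    TB = 1000 ** 4  # decimal terabyte, as drive makers count
    ZB = 1000 ** 7  # decimal zettabyte

    mbr_512 = max_capacity(2 ** 32, 512)   # classic MBR limit
    mbr_4k = max_capacity(2 ** 32, 4096)   # MBR reporting 4K sectors
    gpt_512 = max_capacity(2 ** 64, 512)   # GPT limit

    print(f"MBR, 512-byte sectors: {mbr_512 / TB:.1f} TB")  # ~2.2 TB
    print(f"MBR, 4K sectors:       {mbr_4k / TB:.1f} TB")   # ~17.6 TB
    print(f"GPT, 512-byte sectors: {gpt_512 / ZB:.1f} ZB")  # ~9.4 ZB (8 ZiB)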

With this SmartAlign Technology, as Seagate calls it, the 3TB drive
achieves its capacity boost without increasing areal density. The 3TB drive
has five platters, each with 600GB. That's one platter more than current
2TB drives have. Part of the increased capacity also comes from the 4K
sector size, which requires fewer sector "headers" on the drive itself. This
in turn allows more space to be allocated to data.

As enticing as 3TB sounds, though, this won't be the end of the line for
increasing 3.5-inch hard drive storage this year. Storage analyst Tom
Coughlin, of Coughlin Associates, notes "I expect we will see up to 750GB
to 800GB per platter on 3.5-inch drives before the end of this year. That
would give us 3TB or more with a four-platter drive, or approaching 4TB
with a five-platter drive." Expect more advances in areal density for
2.5-inch drives, too; there, Coughlin expects us to see a two-platter, 1TB
drive that will fit in a standard z-height notebook computer later this
year.



Dell Backtracks on Linux Being Safer than Windows


Recently Dell did something amazing. The Austin, TX computer giant admitted
on one of their Web pages that "Ubuntu [Linux] is safer than Microsoft
Windows." But now Dell has backed off to the far more generic "Ubuntu is
secure." Boo!

The explanation for both statements has also changed a bit. When the
statement was stronger, Dell's explanation read, "The vast majority of
viruses and spyware written by hackers are not designed to target and
attack Linux." Now, it's been watered down to read, "According to
industry reports, Ubuntu is unaffected by the vast majority of viruses
and spyware."

Ah, no. Anyone who pays any attention to operating system security knows
that Windows is insecure both by design and by poor execution. Linux,
while far from perfect, is far more secure.

You see, Windows was designed as a single-user, non-networked operating
system. That design is still at the heart of Windows, which is why
security must always be an add-on to Windows. Linux, in contrast, was
built from the ground up as a multi-user, networked system. Linux, like
Unix, which came before it, was constructed to work in a world with
hostile users.

Of course, Windows fans like to re-frame the security argument, claiming the
'real' reason Windows is attacked more often is that it's more popular than
Linux. To which I say, "So what?" Even if that were the only reason, in
practice that would make Windows far less secure than
Linux. You don't have to take my word for it. Just start glancing at the
titles in our ComputerWorld security stories, and count the ones about
Windows, and then count the ones pertaining to Linux. Enough said.

Besides, at the Internet server level, Linux is already as popular as
Windows. Google, Yahoo, Facebook, all the top Internet sites, except the
ones owned by Microsoft, run Linux. If a hacker really wanted to score
big, would he crack some guy running Windows 7, or Google?

So why did Dell back down from its claims for Ubuntu Linux? I'm not
getting any answers from Dell, but I think it's pretty easy to guess:
Microsoft took note of people talking about Dell saying nice things about
Linux, and decided to "have a word" with Dell. Microsoft has been pushing
the computer vendors around for decades - which is why Windows is so
popular, not because Windows is better than the alternatives.

But, and this is what I find really interesting, Dell didn't pull the
comments about Linux being more secure. They just softened them. I think
this speaks volumes. It means that Dell remains committed to Ubuntu Linux
on its laptops and netbooks. It also means that Microsoft can't get away
with being the bully it once was to computer manufacturers.

Sure, I wish Dell had stuck by its original wording, but even so, I think
this is another sign that we're seeing the beginning of the end for
Microsoft's domination of computing. I'm not the only one.

Reuters reports that Amit Midha, Dell's president for Greater China and
South Asia, said, "There are going to be unique innovations coming up in
the marketplace in two, three years, with a new form of computing, we want
to be on that forefront ... So with Chrome or Android or anything like
that we want to be one of the leaders." Interesting, don't you think?



Google's Chrome Passes Safari in US Browser Share


Google Chrome is now the third-most-popular browser in the U.S., behind
Microsoft's Internet Explorer and Mozilla's Firefox, but ahead of Apple's
Safari for the first time, according to a study by Web analytics company
StatCounter.

Chrome overtook Safari during the week ending June 27, and now has a share
of around 8.97 percent of the U.S. browser market, just ahead of Safari at
8.88 percent. Internet Explorer and Firefox still dominate with shares of
52 percent and 28.5 percent respectively, all versions combined,
StatCounter said.

Outside the U.S., Chrome passed Safari some time ago, and now has a 9.4
percent share, compared to Safari's 4 percent. That may be down to the
two applications' support for different languages: Safari is available
in 16 languages, while Chrome covers more than three times as many.

StatCounter said it based its statistics on an analysis of 3.6 billion
page views, 874 million of them in the U.S., captured from its network
of counters embedded in the pages of 3 million Web sites.



Internet Explorer 8 Growing Three Times Faster than Chrome


After months of consistent declines in overall market share, Internet
Explorer had an overall gain in May, but only in the United States. The
latest browser market share trends show that Internet Explorer continues
to reverse its losses and make gains in market share - this time globally.
More specifically, Internet Explorer 8 is leading all competitors and
gained more than three times as much as Google Chrome.

By browser version, Internet Explorer occupies three of the top four market
share positions - with Internet Explorer 8 leading the way. Internet
Explorer 8 market share grew 0.66 percent over last month to 25.84 percent
of the market. Internet Explorer 6, unfortunately, is still the number two
browser with 17.17 percent, while Internet Explorer 7 is in fourth behind
Firefox 3.6 with 11.79 percent of the market.

Combining all versions of Internet Explorer together, the overall market
share for the Microsoft Web browser grew from 59.75 percent to 60.32
percent. The gains came primarily at the expense of Firefox, which
dropped .51 percent. Chrome had a slight gain of .2 percent, but still
drags in with only an eighth of the market share of Internet Explorer.

Ryan Gavin, senior director of Internet Explorer business and marketing
for Microsoft, elaborates on the Net Applications browser statistics in
an Exploring IE blog post. "In June, Net Applications shows overall
Internet Explorer share growing by 0.57% worldwide. Internet Explorer 8
share continues to be the fastest growing browser with a 0.66% increase
in share, more than 3 times the growth of Google Chrome, while Firefox
share declined."

Other than bragging rights, is any of this even relevant? As companies
explore the myriad of browser options available, there is something to
be said for going with the flow and choosing the browser with nearly
two-thirds of the market.

Despite claims by Web development purists and Microsoft bashers that
Internet Explorer doesn't follow accepted Web standards and conventions,
when a browser dominates the marketplace the way Internet Explorer
does, it more or less dictates the standards.

To Microsoft's credit, though, it does try to work with the established
Web standards. Development of IE9 is underway, and Microsoft is working to
embrace HTML5 and font-rendering standards, as well as striving to
improve the performance of Internet Explorer on accepted Web browser tests
like Acid3.

Internet Explorer, Firefox, Chrome, and other Web browsers are all
equally capable of surfing the vast majority of the Web. However, as
developers create custom apps and add-ons to extend the functionality of
a given browser, or provide additional interactivity and expand the
Web-surfing experience, they are more likely to invest that time and
effort developing for the platform that has two-thirds of the market.

The reality is that malware attackers may also target Internet Explorer
for the same reason - a larger pool of potential victims. However, many
Web-based attacks tend not to be browser-specific, and recent testing
has demonstrated that Internet Explorer 8 actually beats all competitors
in guarding against Web-based socially engineered malware attacks.

Given the vast variety of browsers out there today, I doubt we'll ever
see Internet Explorer return to its virtual monopoly glory days of
90-plus percent market share. But, the reversal of fortune over the past
couple of months demonstrates that Internet Explorer is also not going
to just slowly die away.



IBM Endorses Firefox As In-house Web Browser


Technology giant IBM wants its workers around the world to use free,
open-source Mozilla Firefox as their window into the Internet.

"Any employee who is not now using Firefox will be strongly encouraged
to use it as their default browser," IBM executive Bob Sutor said
Thursday in a blog post at his sutor.com website.

"While other browsers have come and gone, Firefox is now the gold
standard for what an open, secure, and standards-compliant browser
should be."

Making Firefox the default browser means that workers' computers will
automatically use that software to access the Internet unless told to do
otherwise.

All new computers for IBM employees will have Firefox installed and the
global company "will continue to strongly encourage our vendors who have
browser-based software to fully support Firefox," according to Sutor.

New York State-based IBM, known by the nickname "Big Blue," has a
corporate history dating back a century and now reportedly has nearly
400,000 workers.

"Today we already have thousands of employees using it on Linux, Mac,
and Windows laptops and desktops, but we?re going to be adding thousands
more users to the rolls," Sutor said.

Sutor is the vice president of open source and Linux at IBM, which
launched an Open Source Initiative in 1998. Open-source software is
essentially treated as public property, with improvements made by anyone
shared with all.

Firefox is the second most popular Web browser in an increasingly
competitive market dominated by Internet Explorer software by Microsoft.

Google Chrome has been steadily gaining market share, last week
replacing Apple Safari as the third most popular Web browser in the
United States.

"We'll continue to see this or that browser be faster or introduce new
features, but then another will come along and be better still,
including Firefox," Sutor said.

"I think it was Firefox and its growth that reinvigorated the browser
market as well as the web. That is, Firefox forced competitors to
respond."



U.S. Government Slowly Adopting Web-Based Computing


The United States, seeking to modernize technology and reduce costs, is
embracing "cloud computing," but privacy and security issues need to be
ironed out during a decade-long transition, U.S. officials said on Thursday.

In a fledgling and fast-growing industry, companies like Google Inc,
Microsoft Corp and Salesforce.com Inc, the world's biggest maker of
Web-based software, are racing to provide cloud computing services -
where data is stored on remote servers - to corporations and the U.S.
government, which spends about $80 billion each year on technology.

"The cloud can allow teleworkers to easily and securely access their
data and work from wherever they happen to be," Mike Bradshaw, who heads
Google's federal cloud computing program, said at a congressional hearing.

"The cloud saves taxpayers money," said Bradshaw.

Proponents say cloud computing allows employees to collaborate more
easily, reduces the time it takes to install patches on thousands of
individual desktops, and provides greater flexibility for U.S. employees
working remotely, which in turn helps reduce energy consumption due to
less travel.

Concerns about whether the U.S. government should further adopt cloud
computing largely center on security of U.S. data and networks from
malicious attacks, snooping enemy governments and theft of information
by criminal organizations.

At a hearing to examine the benefits and risks of cloud computing,
lawmakers and government officials said that while estimates of cost
savings range widely, issues over security, privacy, and data management
remain unaddressed.

"As we move to the cloud, we must be vigilant in our efforts to ensure
the security of government information, protect the privacy of our
citizens, and safeguard our national security interests," U.S. Chief
Information Officer Vivek Kundra told lawmakers.

Last year the U.S. government opened an office to coordinate efforts in
developing cloud computing standards, security and procurement across
the various agencies.

While the transition is expected to take a decade, several U.S. agencies,
including the Departments of Energy and Defense and the Securities and
Exchange Commission, have already adopted cloud computing.

Each agency decides whether to adopt cloud computing, but fragmentation
among agencies on standards and procurement procedures has resulted in
high costs and lost time, Kundra said.

As a result, the National Institute of Standards and Technology is
developing standards and the General Services Administration is looking
into a streamlined procurement process, he said.

The U.S. move to cloud computing also comes amid concerns over global
cyber attacks, illustrated by a dispute between giant U.S. Internet
company Google and China over an attack Google said came from within
China.



Which ISP Is Fastest?


Internet service providers get a pretty bad rap. Everyone seems to be
convinced that his Internet provider is the worst, and that everyone
else in America has a dreamily fast connection to the Web.

The results of PC Magazine’s new tests of ISPs across the nation are in,
and they're hardly what you’d expect. When it comes to Internet performance,
in the aggregate, none of us are exactly living it up. The fastest
throughput in America clocks in, on average, at a measly 1.22Mbps.

Broadband throughput was measured by SurfSpeed (an application you can run
for yourself with a free download).

Nationwide, the top spot went to Verizon’s FiOS, a fiber-optic connection
available only sparsely around the country. But Verizon - widely thought to
be vastly faster than the rest of the industry’s ISPs - is hardly leaps and
bounds above its competition. Cox and Optimum Online managed to creep up
close behind Verizon, with 1.14Mbps and 1.12Mbps average throughputs,
respectively.

One thing is certain: In the grand scheme of things, you’ll get the
fastest results with a fiber-optic-based ISP, followed by a cable-based
provider, then a DSL line. The numbers stack up pretty cleanly for the
three major types of delivery systems, with the average DSL provider
roughly half the speed of Verizon’s FiOS.

On a regional basis, things look pretty similar across the country. The
only outliers are in the South, where Verizon’s FiOS seems unusually
slow (placing behind Cox and Comcast’s cable-delivered Internet) and the
Midwest, where FiOS isn’t widely available and Road Runner (cable)
narrowly beats out Qwest (DSL) service.

Oddly enough, most users nationally seem to be reasonably satisfied with
their Internet service, with even the cellar-dwellers earning a 6.6 out of
10 satisfaction rating, or higher. Verizon manages to top the charts
overall with a 9 out of 10 rating, and that’s great for the company as long
as its users don’t start looking internationally to compare how things
look. Stand that 1.22Mbps up to South Korea’s 14.6Mbps average broadband
speed and that rating suddenly looks awfully generous.
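
To put those averages in human terms, the short Python sketch below converts
each quoted throughput into a rough download time for a single large file.
The 700 MB file size is an arbitrary assumption chosen for illustration;
only the speed figures come from the article.

# Average real-world throughputs quoted above, in megabits per second.
speeds_mbps = {
    "Verizon FiOS (U.S. best avg.)": 1.22,
    "Cox": 1.14,
    "Optimum Online": 1.12,
    "South Korea (national avg.)": 14.6,
}

# Hypothetical download: a 700 megabyte file, expressed in megabits.
file_megabits = 700 * 8

for name, mbps in speeds_mbps.items():
    minutes = file_megabits / mbps / 60
    print(f"{name:30s} ~{minutes:5.1f} minutes")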



Sex.com Domain Name on Sale


The world's "most valuable" Internet domain name, sex.com, went up for
grabs on Thursday, having fetched 12 million dollars in 2006, said a
German firm handling the sale.

"It happens very rarely that an Internet address of this calibre goes on
sale," Cologne-based Sedo said. "(The) sale of sex.com offers the new
owner a unique opportunity to became market leader."

Sedo, which said it is the world's biggest trading platform for domain
names, is selling sex.com on behalf of US firm Escom after creditors
filed for insolvency protection, a joint statement said.

Other domain names have also changed hands for huge sums in the past,
with vodka.com selling for three million dollars, kredit.de for 892,500
euros (1.1 million dollars) and poker.org for one million dollars, Sedo
said.

"Owners of domains like this have a clear competitive advantage. Visitors
land automatically on the websites of the owners just by entering what
they are looking for. The listing in search machines is also improved,"
it added.



Dr. Demento Leaving Radio for the Internet


Listen closely: that's the sound of demented music dying that you're
hearing on your radio.

After nearly 40 years of broadcasting catchy little tunes celebrating
everything from dogs getting run over by lawnmowers to cockroaches
devouring entire cities, Dr. Demento is discontinuing his syndicated
radio show.

By summer's end, the good doctor's hyper-enthusiastic voice will be
heard only on the Internet as it introduces oddball classics such as
"There's a Fungus Among Us," "Fish Heads" and "Dead Puppies."

For decades Demento has been a Sunday-night fixture on radio stations
across the country, keeping alive the music of political satirists like
Tom Lehrer ("The Vatican Rag"), while making a star of "Weird Al"
Yankovic, whose first hit, "My Bologna," debuted on the doctor's show.

"He kept my whole career alive by playing Freberg records constantly,"
says Stan Freberg, the Grammy-winning song satirist who, at 83,
continues to write and perform comedy music and make public appearances.

Recently, however, the number of radio stations carrying Demento's show
declined to fewer than a dozen. He had planned to stop syndicating it this
month until he learned a college station in Amarillo, Texas, had committed
to airing it through the summer.

Over the decades, Demento, who was inducted into the Radio Hall of Fame
last year, has kept his playlists contemporary. But it was changing
radio formats that did in his syndicated show, said Demento, 69, who in
a parallel life is Barret Hansen, music writer and ethnomusicologist.
His college master's thesis was on the evolution of rhythm and blues.

"With the increasingly narrow casting, as they call it, of radio where
stations will pick one relatively restricted format and stick with it 24
hours a day, especially in the music area, my show just got perhaps a
little too odd of a duck to fit in," he said.

The program has always been built on Demento's personal music
collection, which numbers in the hundreds of thousands and includes
every recording format from antique wax cylinders to modern-day digital
downloads. He says he's long since lost count of how many recordings he
keeps in the Southern California home he shares with his wife, Sue, but
puts the number somewhere north of 300,000.

When he started putting them on the radio in 1970, it wasn't all that
unusual for a pop station to play a record by blues-rocker Eric Clapton,
followed immediately by one from crooner Frank Sinatra. With that
dichotomy, broadcasting a variety show that would include songs like
Sheb Wooley's "Purple People Eater" and "Weird Al's" Grammy-winning
Michael Jackson takeoff "Eat It" didn't sound so out of place.

But those days of radio appear over, says broadcast veteran and
University of Southern California's writer-in-residence Norman Corwin.

"Radio has been relegated to programs like Rush Limbaugh and other talk
shows and (on the music side) niche formats," said Corwin, who has
worked in and followed the broadcast industry for more than 70 years.
"They do have a huge following, and a huge influence," he says of such
shows. "But the variety programs are gone. That's a shame."

They've gone to the Internet, says Demento, who has been doing a
separate Web show there for several years. On the Web, he says, he can
play an even wider selection of music, including tunes too raunchy or
outrageous for FCC-regulated terrestrial radio.

"I prefer to think of it as just transitioning to a new medium rather
than it coming to an end," he says of the show, which will mark its 40th
anniversary in October.

"It's kind of like when we changed from cassettes to CDs," he adds in
that distinctive Dr. Demento voice.



=~=~=~=




Atari Online News, Etc. is a weekly publication covering the entire
Atari community. Reprint permission is granted, unless otherwise noted
at the beginning of any article, to Atari user groups and not for
profit publications only under the following terms: articles must
remain unedited and include the issue number and author at the top of
each article reprinted. Other reprints granted upon approval of
request. Send requests to: dpj@atarinews.org

No issue of Atari Online News, Etc. may be included on any commercial
media, nor uploaded or transmitted to any commercial online service or
internet site, in whole or in part, by any agent or means, without
the expressed consent or permission from the Publisher or Editor of
Atari Online News, Etc.

Opinions presented herein are those of the individual authors and do
not necessarily reflect those of the staff, or of the publishers. All
material herein is believed to be accurate at the time of publishing.
