Thursday, February 24, 2011

Large Hadron Collider powers up to unravel mysteries of nature

Physicists at CERN rob hydrogen atoms of their protons, compact them into bunches and inject them into the series of four accelerator rings. Each ring in the sequence amplifies the energy of the protons and guides the subatomic particles with powerful magnets. Finally, two proton beams are flung in opposite directions at nearly the speed of light and smashed together at one of four detectors. Then the sparks fly. Credit: Justin Eure/MEDILL
Outside the small village of Meyrin, Switzerland, horses graze quietly in fields lined by the Jura mountains. You'd never know it by the idyllic landscape, but 300 feet below the Swiss-French border, the Large Hadron Collider is searching for the secrets of the universe. A 17-mile circular tunnel houses the world’s largest atom smasher, which is once again firing high-energy proton beams.
On Monday, the CERN Control Center turned on the LHC beams to begin the next two-year run of the particle collider. CERN directors decided to extend the run through the end of 2012, instead of shutting down in 2011 for repairs as previously planned, and spirits are running high among scientists working in the field of new physics.
Researchers from across the world engineer detectors and seek to solve the mysteries of matter in an international collaboration that reaches from Chicago to Mumbai.
“Recently there was a convention in Chamonix,” said Georgios Choudalakis, a Greek physicist on the ATLAS experiment at the LHC. “The heads of the experiments and the director of the laboratory decided that we will take data for 2 years. And the decisive criterion for this was the sensitivity to the Higgs, so we’re optimistic.”
The search is on for the Higgs! The international team of scientists at CERN recalls the bumpy history of the Large Hadron Collider, from disastrous delays to recent results that are exceeding expectations. The physicists anticipate breakthroughs in the next two years that will change our fundamental understanding of the universe. Video credit: Chelsea Whyte and Justin Eure/MEDILL.
This comes on the heels of the news that 2011 will be the end of the run for the Tevatron, the second most powerful particle collider in the world, located at the Fermi National Accelerator Laboratory in Batavia, Ill. After setbacks and shutdowns, the LHC had collisions in 2010 that went even better than expected. “We made it clear even to ourselves that the page has turned,” said Choudalakis. “The energy frontier is not at the Tevatron anymore. We are cutting more ice here.”
Now, the hunt for the Higgs is on at CERN, the Conseil Européen pour la Recherche Nucléaire.
The elusive Higgs particle is, according to the theory, a fundamental building block of matter and the reason everything has mass.
“Nobody can explain where mass comes from, but we know it’s there,” said Pauline Gagnon, a French physicist at the ATLAS experiment. This conundrum is the most important question physicists have to answer, she said.
“If you think of a one-pound bag of salt and you add up the weights of each grain of salt, they will logically equal one pound,” said Gagnon. But, when physicists break down atoms in this way and try to determine the weight of the pieces inside, the calculations of the weight of atomic building blocks such as quarks and electrons don’t add up, she said. Here’s where the Higgs comes in.

Google turns up the enterprise collab heat on Microsoft

Google will intensify its attack on Microsoft's enterprise collaboration business with the release on Thursday of the Cloud Connect plug-in for Microsoft Office and with the launch of a trial program for the collaboration components of Google Apps.
In limited release since November, Cloud Connect is now available to all Microsoft Office 2003, 2007 and 2010 users who also have an individual Google account or a Google Apps account.
Based on technology from DocVerse, a company that Google acquired about a year ago, the Cloud Connect plug-in lets Office users share and collaboratively edit their documents by storing them on Google's cloud infrastructure but without leaving the Office interface.
Once stored on Google servers, Office documents receive a unique URL and pop into their author's Google Docs file list. Cloud Connect tracks changes and edits made to documents and lets users revert to previous versions.
Because users work at all times with the Microsoft software, documents don't need to be converted to Google Docs, avoiding formatting problems.
Docs, a hosted office productivity suite freely available with an individual Google account, competes with the Office desktop suite and with its online companion Office Web Apps, which does allow for cloud-based collaboration.
Docs is also part of Google Apps, a free, hosted collaboration and communication suite that competes with Microsoft Exchange through its Gmail component. Apps is free, except its Business edition, which costs $50 per user per year.
Apps also competes against Microsoft's BPOS (Business Productivity Online Suite), a hosted suite that includes online versions of Exchange 2007, SharePoint 2007 and Office Communications Online but not Office Web Apps.
Microsoft plans to ship this year a BPOS upgrade called Office 365 with online versions of Exchange 2010, SharePoint 2010 and Office Communications Server 2010 (renamed Lync). It also comes with Office Web Apps, and in some configurations includes the full-featured Office 2010 offered on a hosted, subscription basis.
Clearly, in Google's ideal world there would be no need for Cloud Connect because everyone would use Apps instead of Office, Exchange and SharePoint.
To that end, Google is also launching on Thursday a 90-day trial program called Appsperience for organizations to test drive the collaboration components in Apps, such as Docs and the Sites website builder.
The program doesn't include Gmail, but organizations that sign up for Apps at the end of the trial period do get the full suite, including Gmail.
The program costs $7,000 for organizations with 500 users or fewer, and $15,000 for those with more than 500 users. Google and its reseller partners will actively help participating organizations get set up and trained on using the Apps collaboration components.
The program includes access to a new usage analytics dashboard that Google is also rolling out to all Apps for Business and Apps for Education customers.
The dashboard provides granular stats on Apps usage patterns within an organization, so that administrators in the trial program can eventually decide how many Apps licenses they truly need.
At first, organizations were drawn to Apps mostly by Gmail, but in the past two years, Google has seen a spike in interest and usage of Docs and Sites, said Jeremy Milo, Google Apps product marketing manager.
"That's what sparked the idea for this program," Milo said.
The general availability of Cloud Connect is expected to boost this trend even further, he said.
Industry analyst Rebecca Wettemann from Nucleus Research sees the one-two punch of announcements as a serious escalation of the collaboration wars between Google and Microsoft.
"This highlights the diminishing benefit of Office upgrades and it's likely to make companies reconsider their Office strategy. If everyone needs collaboration, and only power users need all the capabilities of Office, why buy both Google and Office for everyone?," she said via e-mail.
"We see the enterprise search space in three tiers: Web, desktop, and enterprise application. Google already owns the Web, this gives them more opportunity on the desktop," Wettemann added.

Hacker claims credit for knocking church's site offline

Twitter post suggests 'The Jester' may have been responsible for knocking controversial church offline
A Twitter message from Monday suggests that a self-proclaimed "hacktivist" using the handle The Jester may have been responsible for knocking the controversial Westboro Baptist Church offline.
In the message, the hacker claimed to have temporarily taken down the public website of the church "for celebrating the death of U.S. troops."
The message, however, did not say whether The Jester (@th3j35t3r on Twitter) was also responsible for the unavailability today of several other websites affiliated with the WBC.
Members of the church, based in Topeka, Kan., are known for their strident anti-gay views and for protests at funerals of slain military personnel and others.
Last week, someone purporting to be from the hacking collective known as Anonymous posted a letter on an Anonymous site, warning WBC members of attacks against the church's public websites if they did not stop their protests.
The letter lamented the "inimitable bigotry and intolerant fanaticism" of the protesters and warned of online attacks that the church would not be able to withstand or recover from.
That letter was later dismissed as a hoax by Anonymous, which has been involved in several high-profile attacks recently, including one against the security firm HBGary.
Shirley Phelps-Roper of the Westboro Church today said such attacks are not unusual for the church.
"Every time we look up somebody is doing something to us," she said claiming that in the past the church's website has been attacked by hackers not just from the U.S. but also from other countries such as the Netherlands and New Zealand. "All such attacks do is to cause somebody to look at us," she said.
She attributed the latest attack on the military but offered no explanation for her claim.
This morning, all of the church's sites were unavailable. It is not immediately clear how long the sites have been down and what role, if any, The Jester or Anonymous may have played. There has been no response yet from Anonymous members.
The Jester previously claimed responsibility for launching distributed denial-of-service attacks against WikiLeaks last year, in response to what he claimed was WikiLeaks' role in endangering the lives of U.S. troops.

Update: Firefox update will patch CSRF bug, Mozilla says

Delayed Firefox 3.6.14, 3.5.17 to ship March 1, fix cross-site request forgery bug that can be exploited via Flash
Mozilla said late Wednesday that it will ship security updates to Firefox 3.5 and Firefox 3.6 next week that will include a patch for a bug that can be exploited using a malicious Adobe Flash file.
(Editor's note: An earlier version of this story, published before Mozilla responded to a request for comment, said company meeting notes suggested that the Firefox security updates would not include the patch.)
Firefox 3.5.17 and Firefox 3.6.14 will now appear Tuesday, March 1, Mozilla disclosed in meeting notes published today.
Originally slated for release Feb. 14, the security updates were held while Mozilla developers investigated a bug that affected some, though not all, users of the betas. According to Mozilla, the bug caused some copies of the updates to repeatedly crash. Mozilla then backed out a recent fix to retest the betas.
Around the same time, another problem -- a separate cross-site request forgery (CSRF) vulnerability -- surfaced that Mozilla needed to patch. "Adobe is pressing for a release due to a public CSRF issue," Mozilla said last week.
The vulnerability is in Firefox, but Adobe is involved because the flaw can be exploited using a malformed Flash file.
According to patch information posted Feb. 8 by the open-source Ruby on Rails Web development framework, and a follow-up message two days later on a security mailing list, the CSRF bug can be exploited by "Certain combinations of browser plug-ins and HTTP redirects."
An attacker could exploit the vulnerability to bypass the built-in CSRF protections of Ruby on Rails -- and those of Django, another Web development platform, which also patched its products earlier this month -- and successfully attack a Web application built with those tools.
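For context, the protection being bypassed is the standard synchronizer-token check: the framework stores a secret token in the user's session, embeds it in each form, and rejects any state-changing request whose submitted token doesn't match. The sketch below illustrates that general idea in TypeScript; it is a simplified illustration with hypothetical names, not Rails' or Django's actual code.

```typescript
// Generic sketch of a synchronizer-token CSRF check (hypothetical names,
// not the actual Rails or Django implementation).
interface IncomingRequest {
  method: string;                 // "GET", "POST", ...
  sessionCsrfToken: string;       // token stored server-side for this user
  submittedCsrfToken?: string;    // token echoed back in the form or header
}

function csrfCheckPasses(req: IncomingRequest): boolean {
  // Safe, read-only methods are normally exempt from the check.
  if (req.method === "GET" || req.method === "HEAD") {
    return true;
  }
  // State-changing requests must echo the session's token exactly.
  return req.submittedCsrfToken === req.sessionCsrfToken;
}

// A forged cross-site request normally fails because the attacker can't read
// the victim's token; the plug-in/redirect bug gave attackers a way around this.
console.log(csrfCheckPasses({ method: "POST", sessionCsrfToken: "abc123" })); // false
```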
The security mailing list message posted Feb. 10 spelled out several affected browsers, including Firefox -- including an earlier beta of Firefox 4 -- as well as Google's Chrome and Apple's Safari on both Windows and Mac OS.
That same message also said that a Google security researcher had first reported the CSRF vulnerability.
Last week, an Adobe spokeswoman said she knew nothing about a potential zero-day that would impact its software and/or Firefox.
Mozilla will patch the CSRF flaw in both Firefox 3.5.17 and Firefox 3.6.14 when they ship next week, a spokeswoman for that company confirmed late Wednesday.
The timing of the update may help Firefox survive the Pwn2Own, the hacking contest that kicks off March 9 at the CanSecWest security conference in Vancouver, British Columbia.
Firefox will be one of four browsers -- the others are Chrome, Safari and Microsoft's Internet Explorer -- that will be targeted by attackers hoping to walk off with $15,000 or $20,000 in cash. Pwn2Own's rules state that the targeted browsers will be "the latest release candidate at the time of the contest," meaning that researchers will have to tackle Firefox 3.6.14.
Last year, Mozilla confirmed a critical vulnerability in Firefox less than a week before 2010's Pwn2Own, but said it wouldn't fix the flaw until after the contest. Pwn2Own organizers then ruled that hackers would not be allowed to use the vulnerability to exploit Firefox.

Researchers create computer that fits on a pen tip

Micro-computer's first use will be for medical monitoring
Researchers at the University of Michigan today announced they have created the first prototype for a millimeter-scale computing system that can hold up to a week's worth of data when implanted in something as small as a human eye.
The computer, called the Phoenix chip, is just over one cubic millimeter in size and was designed to monitor eye pressure in glaucoma patients.
"This is the first true millimeter-scale complete computing system," Dennis Sylvester, a professor at the school and one of the researchers on the project, said in a statement.
Within the computer is an ultra low-power microprocessor, a pressure sensor, memory, a thin-film battery, a solar cell and a wireless radio with an antenna that can transmit data to an external reader device held near the eye.

The Phoenix chip computer just covers the 'N' on a penny

The chip uses a power gating architecture with an extreme sleep mode, waking the computer briefly every 15 minutes to take readings. By remaining in sleep mode most of the time, the chip sips power, consuming an average of just 5.3 nanowatts.
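To see how waking briefly every 15 minutes can translate into an average draw of a few nanowatts, consider a back-of-the-envelope duty-cycle calculation. The sleep and active power figures below are hypothetical, chosen only to illustrate the arithmetic; they are not specifications of the Michigan design.

```typescript
// Duty-cycle arithmetic with hypothetical numbers (not the chip's real specs).
const periodSeconds = 15 * 60;   // wakes once every 15 minutes
const activeSeconds = 0.1;       // assume ~100 ms awake to take a reading
const activePowerNw = 30000;     // assume 30 microwatts while awake
const sleepPowerNw = 2;          // assume 2 nanowatts while asleep

const averagePowerNw =
  (activePowerNw * activeSeconds + sleepPowerNw * (periodSeconds - activeSeconds)) /
  periodSeconds;

console.log(`average draw ~ ${averagePowerNw.toFixed(1)} nW`); // ~ 5.3 nW
```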
The Phoenix chip's photovoltaic system requires 10 hours of indoor light or 1.5 hours of sunlight to fully charge the battery.
The chip's micro radio automatically tunes into whatever wireless frequency is available in order to download data to a reader. The data can then be used as part of an electronic medical record for treatment.
According to researchers, the micro computers and their wireless networks could one day also be used to track pollution, monitor structural integrity, perform surveillance, or make virtually any object smart and trackable. "Our work is unique in the sense that we're thinking about complete systems in which all the components are low-power and fit on the chip," Sylvester said. "We can collect data, store it and transmit it. The applications for systems of this size are endless."
The researchers presented their papers on the new microcomputers and the networks at the International Solid-State Circuits Conference (ISSCC) in San Francisco. The work is being led by three faculty members in the University of Michigan Department of Electrical Engineering and Computer Science.
University professor David Blaauw said that once the chips reach the nanoscale level, hundreds of the computers could be fitted onto a single silicon wafer to perform multiple monitoring tasks.
The researchers pointed to Bell's Law, which states that there's a new class of smaller, cheaper computers about every decade. With each new class, the volume shrinks by two orders of magnitude and the number of systems per person increases. The law has held from 1960s' mainframes through the '80s' personal computers, the '90s' notebooks and the new millennium's smart phones, they said.
"When you get smaller than hand-held devices, you turn to these monitoring devices," Blaauw said in a statement. "The next big challenge is to achieve millimeter-scale systems, which have a host of new applications for monitoring our bodies, our environment and our buildings."

Apple boosts MacBook Pro speeds in 'ho-hum' refresh

Moves to dual- and quad-core 'Sandy Bridge' processors, debuts Intel's Thunderbolt I/O technology
As anticipated, Apple today refreshed its MacBook Pro notebook line, turning to Intel's new Sandy Bridge chip architecture and adding a new connectivity technology dubbed Thunderbolt that transfers data at speeds up to 10Gbps.
Prices for most of the new models have not changed. The entry-level 13-in. MacBook Pro runs $1,199, while the most expensive 15-in. laptop still costs $2,199. Apple did raise the price of the top-of-the-line 17-in. model by $200, however, to $2,499.
"A ho-hum product announcement," said Ezra Gottheil, an analyst with Technology Business Research, giving his assessment of the refresh.
All MacBook Pros now run one of Intel's Core i5 or Core i7 dual- or quad-core processors, and include Intel's integrated graphics chipset. The 13-in. models come standard with dual-core i5 processors, while the 15- and 17-in. MacBook Pros sport a quad-core i7. The larger notebooks also boast AMD's Radeon HD 6490M or 6750M discrete graphics, which Apple claimed were three times faster than the Nvidia graphics in the older models.
According to Apple, all MacBook Pros are "up to twice as fast as their predecessors."
The additional power comes at a price, however. All models now feature a battery that Apple said provides 7 hours of power between charges, down from the 10 hours it boasted for last year's 13-in. MacBook Pros and from the 8-to-9 hours estimated for 2010's larger 15-in. and 17-in. models.
Gottheil, however, said that the changes in battery life estimates resulted from a new, more rigorous testing procedure that Apple is now using, and dismissed the idea that the new processors and graphics were behind the battery declines.
Apple's 13-in. MacBook Pros are powered by a 2.3GHz or 2.7GHz Core i5, come with 4GB of memory standard, and feature a 320GB or 500GB hard drive. Both rely on the integrated Intel HD Graphics 3000 chipset. The smaller MacBook Pros are priced at $1,199 and $1,499.
The two 15-in. models -- down from three -- run a 2GHz or 2.2GHz quad-core i7 processor, come with 4GB of memory and a 500GB or 750GB hard drive, and include AMD's graphics processor with either 256MB or 1GB of graphics RAM as well as the Intel integrated chipset. Apple eliminated the middle-of-the-road 15-in. MacBook Pro, which was priced at $1,999, but kept the low- and high-end models at $1,799 and $2,199.
Apple's new 17-in. MacBook Pro comes standard with a 2.2GHz quad-core i7, 4GB of memory, a 750GB hard drive and the top-end AMD graphics card with 1GB of RAM. It was the only model of the five to get a price increase.
All MacBook Pros also now sport a new I/O (input/output) technology developed by Intel, which formerly called it Light Peak but has renamed it Thunderbolt.
Thunderbolt offers direct bi-directional connections to high-speed peripherals such as data drives, and using optional adapters, to other technologies, including FireWire, USB, Gigabit Ethernet and Apple's DisplayPort.
Apple's notebooks were the first to launch with the Thunderbolt technology.
One rumored change -- a move to solid-state drives (SSD) to mimic the MacBook Air -- didn't come to pass, something that irked Gottheil.
"I'm deeply disappointed," he said. "I thought that at the least, [the new MacBook Pros] would use a clever hybrid drive that put the OS and apps on a solid-state drive."
Customers can swap out the traditional platter-based hard drive for an SSD, but the prices run from $100 for a 128GB SSD to $1,250 for a 512GB model. (The prices vary by the original configuration of the MacBook, with lower swap-out prices for models that come standard with larger hard drives.)
"Yes" Gottheil said of the SSD prices.
Apple last revamped the MacBook Pro in April 2010, and today's refresh looks to Gottheil like a return to the line's roots.
"They're returning to the original definition of the line, where the [MacBook] Pros are for pros," Gottheil said. "And they may be trying to get back some of the margins they lost during the recession," he added. "They have come back some, but they're still not at the point they were before the recession."
Even with the increased processor, graphics and connectivity horsepower in the new MacBook Pros, Gottheil still thought that the aged MacBook and the four-month-old MacBook Air, both available for $999, are the "sweet spot" for most consumers.
He was especially high on the MacBook Air, which eschews a hard drive for an SSD. "An SSD addresses real people's performance issues, getting stuff off the disk," he maintained.
Last month, analysts disagreed over whether a design gaffe by Intel would delay the MacBook Pro relaunch. On Jan. 31, Intel acknowledged that a supporting chipset for the next-generation Sandy Bridge processors contained a flaw in the Serial-ATA (SATA) controller. The bug could cause poor hard drive performance or even make the drive invisible to the system, Intel confirmed.
Apple may have been able to sidestep the problem, and utilize flawed versions of the Intel chipset, since its notebooks could tap into the two unaffected ports to connect to systems' hard and optical drives.
The new MacBook Pros are available immediately at Apple's retail stores, some authorized resellers and via the company's online store. At the latter, the new models currently indicate a 1-to-2 day shipping delay.

100G Ethernet stays pricey as speed needs soar

Virtualization, video and massive amounts of data are all driving enterprises and service providers toward 100-Gigabit Ethernet, but the cost of the fledgling technology remains prohibitively high and few products have been installed, industry observers said at the Ethernet Technology Summit.
The 100GE standard was ratified by the IEEE last year, but the technology is just beginning to creep into use. Analyst Michael Howard of Infonetics Research estimates that only a few hundred ports of 100GE have been delivered and most of those are being used by service providers in tests or trials.
However, with the growing amounts of data going in and out of servers in big data centers, some large enterprises also are running into a need for something faster than 10-Gigabit or 40-Gigabit Ethernet, Howard said at the Ethernet Technology Summit this week in Santa Clara, California. Virtualization allows servers to run at higher utilization rates, and blade server chassis pack more computing power into a rack. Connecting these systems to top-of-rack switches, and linking those to other switches at the end of a row, is starting to require multiple 10-gigabit links in some cases, he said.
The Mayo Clinic is already testing 100GE as a possible replacement for multiple 10-Gigabit Ethernet links in its data center, which are needed because its virtualized servers can drive so much traffic onto the clinic's LAN. One reason is that Mayo doctors frequently consult with other physicians around the world and need to share medical imaging files such as CAT scans and MRIs, said Gary Delp, a systems analyst at Mayo.
Aggregated 10-Gigabit ports are still an inefficient way to create fast links within the data center, and 100GE should be more efficient, Delp said. He expects vendors to come out with aggregation technology that pushes traffic more efficiently, and whether users adopt that or 100GE will be a matter of economics, he said.
Some other large enterprises are in similar situations, according to Howard. Using four to eight aggregated links also typically takes up more space and power and generates more heat than a single high-speed connection does, Howard said. One solution administrators are beginning to use is 40-Gigabit Ethernet, which is somewhat less expensive and more readily available today, but the traffic curve points to a need for 100GE, he said.
Cost is one of the main barriers to adoption of 100-Gigabit Ethernet and is likely to remain so for the next few years, Howard said. Though per-port prices can vary based on specific deals, the cost of a 100GE port is still effectively six figures. Juniper Networks typically charges about ten times the cost of a 10-Gigabit Ethernet port, meaning 100GE can cost about $150,000 per port, a company executive said on Tuesday. Brocade Communications announced a two-port module with the new technology for $194,995 last year. It can be ordered now and is expected to start shipping in the first half of this year, the company said Wednesday.
The 100GE equipment is pricey because the technology is so new, according to Howard. Early components are expensive as well as large, power-hungry and hot, and there are still several generations to go in downsizing these parts for more economical systems. It's a fundamental problem of networking today, as the cost of equipment falls by about 15% per year but traffic increases by 45% or more, Howard said.
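The mismatch is easy to see by compounding the two rates Howard cited: equipment costs falling about 15% a year while traffic grows 45% or more. A rough sketch:

```typescript
// Compound Howard's two rates: equipment cost falls ~15%/year, traffic grows ~45%/year.
let relativeCostPerUnit = 1.0;   // cost of a unit of capacity, year 0 = 1.0
let relativeTraffic = 1.0;       // traffic to carry, year 0 = 1.0

for (let year = 1; year <= 5; year++) {
  relativeCostPerUnit *= 0.85;
  relativeTraffic *= 1.45;
  const relativeSpend = relativeCostPerUnit * relativeTraffic; // spend if capacity tracks traffic
  console.log(
    `year ${year}: unit cost x${relativeCostPerUnit.toFixed(2)}, ` +
    `traffic x${relativeTraffic.toFixed(2)}, spend x${relativeSpend.toFixed(2)}`
  );
}
// After 5 years: unit cost ~0.44x, traffic ~6.4x, so total spend still roughly triples.
```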
This isn't the fault of the Ethernet equipment industry or the Institute of Electrical and Electronics Engineers, which developed the 100GE standard, Howard said. At these speeds, the problem comes down to physics. "It's just that we're up against limits," he said. An IEEE study group is now looking at writing specifications for some of the components that go into 100GE equipment, with an eye toward creating a vendor ecosystem around interoperable products.
Analyst Nathan Brookwood of Insight 64 doesn't think price will deter enterprises that have to use 100GE in their data centers.
"The people who can't solve the problem any other way will cough up the bucks to solve it this way," Brookwood said. Initially, those will probably be operators of massive computing clouds, such as Facebook and Google, he said. A Facebook network engineer said a year ago that the company already needed 100GE. The new technology may also go into high-performance computing environments, according to Brookwood.
Despite these projections, 100GE remains far from being a mainstream technology. Even 10-Gigabit Ethernet is just starting to be used for server interfaces, albeit at a growing rate. The global market for 10-Gigabit Ethernet server interfaces was nearly $100 million in the fourth quarter, up 57% from a year earlier, research firm Dell'Oro Group reported this week. Some types of products for this technology are just beginning to reach the market. At the Ethernet event, Broadcom showed off its first server adapter based on the 10GBase-T specification, which uses standard Ethernet cabling and can reach 100 meters. It is expected to ship in commercial volumes in the second quarter.
Because of pricing and other issues, most observers don't expect 100GE to be widely used until 2013 among service providers and at least 2015 in data centers.
"100 Gigabits is a lot of bandwidth, and most environments haven't even figured out how to consume 10 gigabits," said Greg Scherer, vice president of server and storage strategy in Broadcom's High Speed Controller Business Unit.

Microsoft: Update glitch hit 10% of WP7 users

Microsoft shed a little bit of light on the problems it had sending out the first update for its Windows Phone 7 software, but it has still suspended updates to Samsung phones while it works out the issue.
Ironically, the update was designed to improve the software update process rather than add features that people have been waiting for, like cut and paste.
In a blog post late on Wednesday, Microsoft said that 90% of people who received an update notification installed the software successfully.
Of the roughly 10% who had problems, nearly half failed because of a bad Internet connection or insufficient storage space on their computers, Microsoft's Michael Stroh wrote in the blog post.
A "technical issue" is affecting the update process for a "small number" of Samsung phones, he said. As a result, Microsoft has briefly suspended updates to Samsung phones until it can correct the problem, he said.
It's unclear if the issues will have an impact on the timing of Microsoft's next update, which CEO Steve Ballmer recently said would come out in early March. That update will add copy and paste, a function in high demand, and other performance enhancing changes, the company has said.
The HTC Arrive, the first CDMA Windows Phone 7 device, which Sprint announced Thursday, will include that update, said Mark Elliott, a Sprint spokesman. The Arrive is set to go on sale March 20.
Like many other smartphone software developers, Microsoft requires users to connect their phones to a computer and download the update through its Zune software on a PC or the Windows Phone 7 Connector on a Mac.
Also like most other phone software makers, Microsoft is rolling the update out over time. It started on Tuesday and said it could be days or even weeks before some people get it.

Iran claims two new supercomputers

After Stuxnet attacks damaged its computer systems, Iran tries for an IT comeback via supercomputing
Iran's government is claiming that it has developed two new supercomputers powerful enough to earn rankings on the Top500 list of the world's most powerful systems.
The supercomputing announcement, made Wednesday, is being treated as a big deal in Iran and involves top Iranian government officials, including President Mahmoud Ahmadinejad.
Had this announcement been made by almost any other country, it would have received little attention. The larger of the two systems is far, far behind the current top-ranked system in China.
But a U.S. embargo means Iran has to buy many of its components on the black market. The country is also a target of cyberattacks, as the Stuxnet worm illustrated.
Ahmadinejad discussed the project via a videoconference with officials at the two universities where the systems were installed, according to government media reports.
Iran's supercomputing claims could not be independently verified. It may well be a fake and an elaborate attempt to demonstrate IT prowess after the Stuxnet worm hit its nuclear control systems.
It could also be an effort by the regime to offer some distraction from the region's spreading turmoil threatening authoritarian governments.
A photo spread in one of Iran's news outlets shows what purports to be one of the two systems at Amirkabir University of Technology in Tehran.
In it are a series of racks on what may be a raised floor, not unlike a typical data center. Other photos show people at terminals and in conference rooms.
Reports include a claim that the largest system is capable of 89 teraflops. One teraflop equals 1 trillion floating point operations per second. The fastest computers in the world now exceed 1 petaflop, or a thousand trillion floating point operations per second.
Although an embargo prevents U.S. companies from selling microprocessors and other components to Iran, U.S.-made computer technology seems to be readily available in that country.
In 2007, for instance, Iranian officials inadvertently disclosed the source of the AMD chips in a Linux-based high-performance computing system: the name of a distributor in the United Arab Emirates was visible on the boxes in one of the photos they published.
AMD, at the time, said it had never authorized the sale. The photos were quickly removed from the Iranian site after the details were published.
But Iran's latest supercomputer announcement appears to include no details about the components used to build the systems. Iranian officials have not yet responded to requests for details about them.
Mehdi Noorbaksh, an associate professor of international affairs at Harrisburg University of Science and Technology in Pennsylvania, urged skepticism around Iran's supercomputing announcement and said it may well be fake.
"The Iranian government is notorious for fabricating this kind of information -- believe me," said Noorbaksh. "When the government announces something like this, it is very difficult to confirm it."
Noorbaksh said Iran's announcement may be a reaction to the malware attack on its nuclear systems, to demonstrate to the world that "we are in control."
Many of Iran's computers were hit by the Stuxnet worm, which damaged the country's uranium enrichment capabilities.
If Iran wants its systems considered for listing on the Top500 list, it will have to run the high-performance Linpack Benchmark and submit the results to the list. It has thus far not submitted anything to the list, according to a list official.

Update: HP reports mixed quarter on weak PC sales

Hewlett-Packard reported a jump in profit for the first quarter of its 2011 fiscal year but sales were dragged down by weakness in its PC and services divisions.

The poor performance from those groups prompted HP to lower its revenue forecast for the year, causing its share price to slump 12% after the results were announced.

"If you use Q1 as a marker, it's clear we do a lot of things well at HP. It's also clear that we have isolated areas that we need to improve," CEO Leo Apotheker told reporters on a conference call.

HP's profit for the quarter ended Jan. 31 was $1.36 per share before one-time items, up 27% from a year earlier and ahead of the $1.29 per share that financial analysts had expected, according to a consensus poll from Thomson Reuters.

Revenue was $32.30 billion, up 4% from a year earlier but below the $32.96 billion that analysts had been looking for.

Revenue from HP's services division declined 2% from last year to $8.6 billion. It did well with bigger, long-term deals, but short-term contracts for IT outsourcing and application services fared poorly, Apotheker said.

Revenue from its Personal Systems Group declined by 1%, to $10.4 billion. Sales of business PCs were up 11%, HP said, but the growth was offset by weak sales to consumers, especially of netbook computers. HP has also had channel problems in China, though Apotheker said the company is "getting past the issues."

HP's enterprise hardware business fared better. Revenue from servers, network and storage gear climbed 22% to $5.6 billion. And revenue from printers and related products increased by 7% to $6.6 billion, HP said.

"We reported good results across most of the business but we have opportunities to improve growth in a few areas," Apotheker said.

HP's shares ended the regular trading day at $48.23, down 1%. Following the earnings announcement they slumped more than 12% in the after-hours markets, to $42.40.

Apotheker took the reins at HP last November and is in the midst of a minor makeover to improve HP's profitability and competitiveness. He's due to outline his plans for the company at an event in San Francisco next month.

The former SAP executive is expected to expand HP's efforts in enterprise software, as well as other profitable areas such as networking and storage. On Tuesday he said he'll describe HP's view of technology trends and explain how they mesh with HP's products and services.

HP is also expanding its focus on mobile gadgets. Earlier this month it announced a tablet PC and two smartphones based on the webOS software it acquired when it bought Palm last year.

HP is also shuffling its board, replacing four directors who were involved in the decision to oust CEO Mark Hurd last year. The new members include Patricia Russo, the former CEO of networking vendor Alcatel-Lucent, and Meg Whitman, who used to run eBay.

What's still missing in the HTML5 spec

Although the HTML5 spec won't be finalized until July 2014, the W3C (World Wide Web Consortium) has scheduled its "last call" for feature-completeness for this May. So what's missing?

According to the specification's editor, refinement of HTML5's multimedia capability is the only outstanding issue to be resolved before the W3C can move to the final stage of the HTML5 effort: finalizing the technical specs, getting final comments, and creating test suites to validate interoperability across browsers and other technologies. That last stage will take about three years, said Ian Jacobs, head of W3C communications.

The multimedia holes in the HTML5 spec

The primary aspect of multimedia capability to be resolved this spring is multitrack support for audio and video, though the W3C isn't committing to having this capability in the final HTML5 spec. Multitrack support would, for example, enable a choice of spoken languages to accompany a video, allow the presentation of a video within another video, and permit applications like chat rooms to play simultaneous audio from multiple people.

Also possible to be added to the HTML5 spec after the "last call" are extensions to the canvas 2D technology and the ability to mark up photo credits, said HTML5 specification editor Ian Hickson. He expects such additions to first come up in the WHATWG (Web Hypertext Application Technology Working Group) HTML5 proposals, then migrate to the W3C effort through the W3C HTML working group chairmen. (The two organizations are collaborating on the HTML5 spec.)

One technology not slated for HTML5 is a standard video codec. Developers of the specification have been unable to find a satisfactory open source codec to use, so they are leaving each browser maker to choose its own codec and instead providing standard APIs for them to use in HTML5. "That's pretty much the deal," Hickson said. "HTML5 doesn't care what the codecs are," Jacobs added.

The lack of a video codec does not have to be resolved for HTML5 to be completed, said Forrester Research analyst Jeffrey Hammond. "It's a pain for developers, but [they] can work around it for the time being" by encoding in multiple formats, he said.
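In practice that workaround means publishing the same clip in several encodings and listing them all, so each browser plays the first format it can decode. Here is a minimal DOM sketch in TypeScript; the file names and formats are illustrative, not a recommendation from the spec's authors.

```typescript
// Offer the same clip in several encodings; the browser picks the first it supports.
const video = document.createElement("video");
video.controls = true;

const sources = [
  { src: "clip.webm", type: "video/webm" }, // hypothetical files
  { src: "clip.mp4", type: "video/mp4" },
  { src: "clip.ogv", type: "video/ogg" },
];

for (const { src, type } of sources) {
  const source = document.createElement("source");
  source.src = src;
  source.type = type;
  video.appendChild(source);
}

document.body.appendChild(video);
```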

WebSocket, developer tools not yet HTML5-ready

HTML5 has been paired with complementary specifications, such as CSS (Cascading Style Sheets) 3 and WebSocket, a protocol for two-way Web communications, in an effort to promote an "open Web." HTML5 itself is in pretty good shape in terms of the language, but WebSocket capabilities are needed for applications like stock trading and real-time data feeds, Hammond said. "You can do that fine with Flash or Java," but not with HTML5, he said. There's also a security issue: WebSocket implementations have been pulled from current browsers because of concerns over potential screen hijacking, he said.
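For reference, the client side of WebSocket is already simple; the sketch below shows roughly what a browser-based real-time feed subscription looks like once implementations are re-enabled. The endpoint and message format are hypothetical.

```typescript
// Minimal browser WebSocket client sketch (hypothetical endpoint and payload).
const socket = new WebSocket("wss://quotes.example.com/feed");

socket.onopen = () => {
  socket.send(JSON.stringify({ subscribe: "ACME" })); // ask for a price stream
};

socket.onmessage = (event: MessageEvent) => {
  console.log("quote update:", event.data); // pushed by the server, no polling needed
};

socket.onerror = () => {
  console.warn("WebSocket unavailable; fall back to polling or Flash/Java");
};
```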

Developer tools support is also lacking for HTML5, Hammond said. Although there are some HTML-savvy tools available today, there's nothing on the order of a Microsoft Visual Studio or Adobe Dreamweaver, Hammond said.

HTML5's newest target date could slip again

The July 2014 deadline for the final HTML5 standard is not set in stone, Hickson noted. He pointed out that in 2007, the W3C projected that a final specification would arrive in 2010, which did not happen.

That lack of certainty may explain why Jacobs is championing use of HTML5 as it is right now. "We're telling people to use it already," he said. "The goal is to get feedback to improve interoperability," he added. That advice contradicts what Philippe Le Hegaret, the W3C interaction domain leader, said last fall, when he cautioned against deploying HTML5 in websites at the time because of the incompleteness of the specification.

Army of fake social media friends to promote propaganda

It's recently been revealed that the U.S. government contacted HBGary Federal about developing software that could create multiple fake social media profiles to manipulate and sway public opinion on controversial issues by promoting propaganda. The software could also be used for surveillance, to find real people whose points of view the powers-that-be didn't like; the government could then potentially have its "fake" people run smear campaigns against those "real" people. As disturbing as this is, it's not really new for U.S. intelligence or private intelligence firms to do this kind of dirty work behind closed doors.

The EFF previously warned that Big Brother wants to be your friend for social media surveillance. While the FBI Intelligence Information Report Handbook (PDF) mentioned using "covert accounts" to access protected information, other government agencies endorsed using security exploits to do the same.

It's not a big surprise that the U.S. military also wants to use social media to its benefit. Last year, Public Intelligence published the U.S. Air Force social media guide, which gave 10 tips for social media, such as, "The enemy is engaged in this battlespace and you must engage there as well." Number three was "DON'T LIE. Credibility is critical, without it, no one cares what you have to say...it's also punishable by the UCMJ to give a false statement." The Air Force used a chart to show how social media influences public opinion.


The 6th Contracting Squadron at MacDill Air Force Base sought the development of Persona Management Software which could be used for creating and managing fake profiles on social media sites to distort the truth and make it appear as if there was a generally accepted agreement on controversial issues. "Personas must be able to appear to originate in nearly any part of the world and can interact through conventional online services and social media platforms." What happened to don't lie and the Uniform Code of Military Justice?

Everything revealed after Anonymous leaked emails from private security firm HBGary Federal is disturbing on many levels. However, the Daily Kos said with the Persona Management Software it would take very few people to create "an army of sockpuppets" which could distort the truth while appearing to be "an entire Brooks Brothers riot online."

So again I ask, what happened to number three . . . the rule about not lying that was also "punishable by the UCMJ to give a false statement"?

Kirby Plessas, president and CEO of Plessas Experts Network Inc., pointed out some of the unethical and potentially illegal activities suggested in Aaron Barr's leaked emails, such as "chumming and baiting," which sounded like "entrapment of some sort." There would be no warrant for the data collected on individuals, and for how long could it then be stored? "THIS is the entire reason Intelligence Oversight was created - to avoid this sort of thing from ever happening again."

According to Redacted News, the leaked emails showed how names can be cross-referenced across social media sites to collect information on people, which can then be used to gain access to those social circles. The emails also talked about how Facebook could be used to spread government messages:

Even the most restrictive and security conscious of persons can be exploited. Through the targeting and information reconnaissance phase, a person's hometown and high school will be revealed. An adversary can create a classmates.com account at the same high school and year and find out people you went to high school with that do not have Facebook accounts, then create the account and send a friend request.

Under the mutual friend decision, which is where most people can be exploited, an adversary can look at a targets friend list if it is exposed and find a targets most socially promiscuous friends, the ones that have over 300-500 friends, friend them to develop mutual friends before sending a friend request to the target. To that end friend's accounts can be compromised and used to post malicious material to a targets wall. When choosing to participate in social media an individual is only as protected as his/her weakest friend.

Lots of people have multiple online aliases, or separate Facebook or Twitter accounts for business and private life. What most bothers me is the lying and the seemingly unethical means to an end. Although the government says it doesn't approve of censorship, when its secrets come to light it seems to be OK with recommending underhanded tactics.

Secretary Clinton delivered a speech called "Internet Rights and Wrongs: Choices and Challenges in a Networked World." To help promote and support Internet freedom, the State Department intends to award $25 million in grants. While that is great news, the EFF reported, "For every strong statement about preserving liberty, freedom of expression, and privacy on the global Internet, there exists a countervailing example of the United States attempting to undermine those same values."

Secretary Clinton later told "This Week" anchor Christiane Amanpour that most Americans "are in favor of human rights, freedom, democracy. We know that ultimately the most progress that can be made on behalf of human beings anywhere is when those individuals are empowered, when they have governments that are responsive." Clinton added, "At the same time, we recognize that this process can be hijacked. It can be hijacked by both outside and inside elements within any country."

So while the U.S. government can talk a good talk, what it does and what it says often don't seem to jibe. Gasp, I know, it's not a big shocker, but sometimes I find it utterly frustrating. The president wanted an Internet kill switch, and the FBI keeps pushing for backdoors on all things Net. What happened to a code of ethics? Does it disappear behind closed doors, with dirty deeds done in the dark and used against the American people who are supposed to be free to express themselves?

Motorola Xoom: The complete FAQ


It's been months in the making, but it's finally here: the Motorola Xoom, Google's inaugural Android Honeycomb tablet.

Launching today, the Xoom will be the first in a tidal wave of Android tablets set to hit the market this year. It's essentially Google's flagship device, kickstarting the category and setting the gold standard for all the high-end Honeycomb products on the way. Android engineers actually used the Xoom to test and develop the Honeycomb OS, and it's considered a "pure Google" product -- meaning you get what Google created, with no manufacturer-added skins or interfaces getting in the way.

As I've covered the Xoom and Honeycomb these past several weeks, I've heard plenty of questions -- questions on everything from the technical nitty-gritty to the truth about those irksome Verizon data plans. It's time for some simple, straightforward answers. Here's a Q&A-style guide to everything you need to know.

What is the Motorola Xoom?

Come on -- that's just too easy. The Motorola Xoom is an Android-based tablet that runs on Google's new Honeycomb platform.

How's the Motorola Xoom different from previous Android tablets, like the Galaxy Tab?

Aside from differences in form, Honeycomb -- the operating system debuting on the Xoom -- is the first version of Android that Google actually built with tablets in mind. The software was written from the ground up in order to optimize the Android experience for larger-screen devices.

With Honeycomb, the devices' hardware buttons are gone and replaced with on-screen buttons that change position based on your orientation. There's a whole new notification system that puts desktop-style alerts at the bottom of your screen. Android's multitasking system is revamped to make it easier to switch between running applications. And speaking of apps, Honeycomb allows developers to split their programs into multiple panes -- like in Honeycomb's Gmail app, where you can actively scroll through your inbox while simultaneously viewing (and scrolling through) individual messages.

Put simply, it's a very different interface -- and, if you ask me, a drastically improved one. Check out my Android Honeycomb hands-on impressions for a deeper dive into the Xoom experience.

How 'bout hardware -- what kind of specs does this bad boy have?

The Xoom has a 10.1-inch, 1280-by-800 display. It runs on a dual-core Nvidia Tegra 2 1GHz processor with 1GB of RAM, 32GB of internal storage, and SD card support for additional storage space. As of now, the Xoom supports 3G and Wi-Fi connections; Verizon promises, however, that you'll be able to get a free hardware upgrade that'll enable 4G access by this spring.

The Xoom comes with a 5MP rear-facing camera and a 2MP front-facing camera for video chat (which Honeycomb natively supports via Google's Google Talk system). The tablet records 720p HD video and supports 1080p playback. It has HDMI and USB 2.0 ports. And according to Motorola, its Android battery life can give you up to 10 hours of video playback on a single charge.

So how can I buy this Xoom, and how much is it?

The Xoom is on sale in the U.S. as of today, February 24. It's available for $800 at Verizon Wireless or Best Buy stores. Verizon is also offering the Xoom for $600, if you sign a two-year contract for data services; those will run you a minimum of $20 a month.

Do I have to get a data plan?

Verizon is requiring you to sign up for one when you buy a Xoom -- obnoxious, I know -- but you can cancel it immediately after. So basically, you'll have to pay $20 for one month, plus a $35 activation fee (sigh). But there's no obligation to continue paying for data beyond that point; once your first month is up, you can use your Xoom via Wi-Fi only and never pay Verizon another nickel.

UPDATE: Verizon Wireless has changed its mind on this policy as of Thursday morning (thank goodness). I've just confirmed with a Verizon spokesperson that if you purchase the Xoom for $800, without a contract, you will not be required to pay the $35 activation fee or sign up for a data plan. As of now, the Verizon Wireless website is not reflecting this change; however, an online customer service representative tells me you can contact customer care to have the data plan removed after making your purchase.

UPDATE #2: Best Buy has now revised its policy and is allowing Xoom purchases without data plans as well.

So what if I want to activate data for a month here and there as I need it? Will I get charged an activation fee each time?

I've been wondering that very thing myself (how coincidental!). I checked with Verizon Wireless yesterday. The annoying answer is that yes, they'll charge you 35 bucks every time you activate data service. So you can switch the data plan on as you need it, but you'll have to pay that fee each time.

Isn't there supposed to be a Wi-Fi-only Xoom, too?

Yes -- but it isn't here yet. Motorola Mobility CEO Sanjay Jha has said a Wi-Fi-only edition of the Xoom will sell for about $600; he hasn't said, though, when it'll launch. So if you want the Xoom now, you'll have to spring for the full 3G/4G version, even if you're planning to use it only over Wi-Fi; otherwise, you can wait, but it's anyone's guess how long the wait might be.

Can I just use my phone's data plan to get my Xoom online?

Maybe. Standard Android tethering -- where you use a program like PdaNet to connect your phone to another device via USB or Bluetooth -- won't do the trick, as those kinds of connections require a Windows- or Mac-based client to work.

If you're able to create a wireless hotspot from your phone, though -- either by subscribing to your carrier's hotspot option or by going the underground route and rooting your phone -- then yes, you can use that connection to get the Xoom online. It would be no different than connecting the Xoom to any other wireless hotspot.

What's with all this Flash talk? Will the Xoom support Flash or not?

The Xoom will support Flash, but not immediately; Adobe is still putting the finishing touches on its Flash Player 10.3 software, which the Xoom needs in order to play Flash-based content. Adobe promises, though, that the software will be sent to the Xoom as an over-the-air update "within a few weeks" of today's launch.

Should I get the Xoom or wait for one of the other upcoming tablets?

That, my friend, is a tricky question; I take a stab at answering it here.

Okay, Mr. Smartypants, tell me this: Can you eat the Motorola Xoom?

Technically, yes, but it isn't advisable.

Can you drop it from a 40-story building?

Sure, but you probably wouldn't want to.

Can you legally marry it in the state of California?

Right -- I think we're done here.