Wednesday, March 2, 2011

Cross-ocean clouds gain despite millisecond delays

Japan's AIG Edison says cloud computing benefits outweigh latency issues that come with using Salesforce.com's U.S. data center
Just over a year ago, Tohru Futami, CIO and managing director at AIG Edison Life Insurance Co. in Japan, knew that his company needed to upgrade its core applications -- the systems were seven years old and often didn't let the back office and the sales staff share information in a timely manner. Furthermore, some of the company's processes were still paper-based.
Futami said the company's main options were to rewrite all of the applications or to move to the cloud and run hosted software. Spreadsheet calculations convinced the firm to try the latter option.
The calculations estimated that an in-house rewrite of AIG Edison's applications would take about 30 months, while the company could move to Salesforce.com Inc.'s cloud platform in just 10 months. The research also indicated that the cost of the cloud technology would be only about one-third of the cost of any other option.
Futami said the key consideration for AIG Edison officials from the start was to complete the project as quickly as possible. "To improve customer services, a system improvement was a must," he said.
The decision of whether to move to the cloud via Salesforce.com's hosted CRM offering was complicated early on because the hosted software resides in a data center on the West Coast of the United States, 5,000 miles from AIG Edison's Tokyo headquarters. The distance raised concerns about network latency, and officials also wondered about the legal and regulatory issues involved in such a setup.
Nevertheless, AIG Edison did decide to turn to the cloud, and work on running the company's new core applications on Salesforce.com computers in San Francisco began last January. Today, the system is available for use by several million AIG Edison customers, millions of prospects, some 3,000 employees and 15,000 insurance brokers and resellers. The hosted applications handle complex tasks such as generating insurance quotes and running simulations to assess coverage needs.
Futami said that early on in the process, a key concern was whether a cloud-based system could provide the same level of performance as AIG Edison's conventional system.
The company undertook the project with the help of Appirio Inc., a San Francisco-based firm that helps businesses set up cloud platforms. Appirio helped architect and tune the system to provide "almost the same level of response time" as the conventional system, said Futami.
Network latency, particularly for complex services delivered around the world, can be an issue if users feel response times are too slow. The laws of physics will always prevail, but latency concerns don't appear to be curtailing adoption of software-as-a-service (SaaS) offerings.
For instance, FleetMatics, a private Dublin-based company that provides hosted GPS tracking services, has been able to provide service to a rapidly growing U.S. customer base even though its system was hosted exclusively in a data center in Ireland until December. The company recently raised $68 million in funding.
FleetMatics customers can watch vehicles move around on large flat screens as GPS data is continuously updated. FleetMatics CTO Peter Mitchell said customers hadn't said that they perceived the response time from Ireland as a negative. Nonetheless, when the company opened a data center in Denver in December, immediately "there was a perception that the system was now lightning fast," he added.
Mitchell said he believes SaaS providers in Europe have no problem providing services to customers in the U.S. FleetMatics opened its Denver data center as part of an effort to develop a global disaster recovery model, as well as to expand services. The company has begun testing latency times to India from Dublin and the U.S., he added.
Last fall, Salesforce.com announced plans to open a data center in Japan in the second half of this year. Japan is Salesforce's fastest growing market outside of the U.S., according to company spokesman Joseph Schmidt, who said that when the Japan data center opens, "our customers will benefit from the speed and peace of mind that come with having their data close to home."
AIG Edison has found that latency from the U.S.-Japan link varies according to connection speed and amount of data. On average, it takes 132 milliseconds to send and receive 32KB, according to Appirio. In contrast, it takes about 52 milliseconds for a similar amount of information to travel via a Japan-based host site.
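Those figures are in line with what simple physics predicts. As a rough illustration (the distance and fiber propagation figures below are generic assumptions, not measurements of AIG Edison's actual route), the speed-of-light floor for a round trip over roughly 5,000 miles of fiber can be estimated as follows:

```python
# Back-of-the-envelope estimate of the speed-of-light floor for a
# Tokyo <-> U.S. West Coast round trip. The distance and the fiber
# propagation factor are illustrative assumptions, not measured values.

DISTANCE_MILES = 5_000      # one-way distance cited in the article
KM_PER_MILE = 1.609
C_VACUUM_KM_S = 299_792     # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67         # light travels at roughly 2/3 of c in optical fiber

one_way_km = DISTANCE_MILES * KM_PER_MILE
round_trip_km = 2 * one_way_km
propagation_ms = round_trip_km / (C_VACUUM_KM_S * FIBER_FACTOR) * 1000

print(f"Round trip distance:      {round_trip_km:,.0f} km")
print(f"Propagation delay alone: ~{propagation_ms:.0f} ms")
# Roughly 80 ms before any routing, queuing or server time -- so a measured
# 132 ms for a 32KB exchange is in the range physics would lead you to expect.
```

The gap between that propagation floor and the measured 132 milliseconds is what routing hops, serialization and server processing add.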
AIG Edison's entire client environment, which includes virtual desktops for all of its salespeople, experiences a maximum lag of 300 to 400 milliseconds, or about one-third of a second, according to Appirio.
That latency rate comes after optimization and tuning. Among other steps, AIG Edison transmitted data in batch loads to the Salesforce data center, where the majority of the data was located. And instead of running four queries sequentially, the system was optimized to run the four queries at the same time.
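That parallel-query change is the kind of client-side optimization any latency-bound application can make. The sketch below is purely illustrative; the query names and the run_query() helper are hypothetical stand-ins, not AIG Edison's or Salesforce.com's actual API. It shows how issuing four independent queries concurrently costs roughly one network round trip instead of four:

```python
# Illustrative sketch only: run_query() and the query names stand in for
# whatever remote calls the application makes over the high-latency link.
import time
from concurrent.futures import ThreadPoolExecutor

def run_query(name: str) -> str:
    time.sleep(0.13)            # stand-in for ~130 ms of network round trip
    return f"{name}: done"

queries = ["policy_lookup", "customer_history", "quote_rates", "coverage_rules"]

# Sequential version: the four round trips add up to roughly 4 x 130 ms.
results_sequential = [run_query(q) for q in queries]

# Concurrent version: the four round trips overlap, so the total wait
# is roughly one round trip instead of four.
with ThreadPoolExecutor(max_workers=4) as pool:
    results_concurrent = list(pool.map(run_query, queries))

print(results_concurrent)
```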
"If the application is written in such a way that it is minimizing the number of round trips to the database," latency "can become a minor issue," said Andy Poulter, CTO at Runaware Inc., which provides a number of SaaS- and cloud-based offerings, including online software testing. It has data centers in Sweden and Miami, and it serves a worldwide customer base.
Some AIG Edison data, along with customer history checks, still has to be pulled off a mainframe in Japan.
For AIG Edison, the decision to work with a cloud provider in another country raised security concerns that also had to be addressed.
"Just having the data residing outside of Japan was hard for some people to get over," said Jason Park, general manager of Appirio's Japan operations.
But executives' concerns were mitigated by explanations of the physical and logical security Salesforce had in place. They found that Salesforce "was probably better both from a reliability and uptime perspective and provided more robust security as well," Park said.
AIG Edison was acquired last month by Prudential Financial Inc., and Futami decided to leave his position. But decisions like the one he made to move to the cloud are becoming increasingly common: U.S. revenue from public cloud computing is expected to increase 24% this year alone to $17.6 billion, according to IDC.
Phil Garland, a partner in the PricewaterhouseCoopers advisory practice, said that whether latency is a problem or not depends on the user's expectations, level of tolerance, and what works for the business.
"It really depends about what performance levels are acceptable to you, said Garland. "There is no question that the farther one is away from a data center, there will be an impact on performance," he said.
"But there are ways that many providers work around that by balancing between actual performance and perceived performance," said Garland. "A clever client design can alleviate many of those issues that are presented by high-latency environments -- at least up to a point."
Garland said there's no rule of thumb on user acceptance of latency. It depends on the function of the data and how critical it is. But the topic has grown in importance as companies move toward data center consolidation at the same time as global customer bases are expanding. "It's a very common discussion point right now," he said.

Facebook dispute over $20 ends in murder

18-year-old Kayla Henriques has murdered 22-year-old Kamisha Richards over a $20 dispute that developed on Facebook.
A dispute that started over $20 and turned into a war of words on Facebook has ended with murder. 18-year-old Kayla Henriques stabbed 22-year-old Kamisha Richards near East New York’s Cypress Hills housing complex in Brooklyn, according to police cited by NY Daily News.
Richards suffered one stab wound to the chest and succumbed to her injuries at Brookdale Hospital shortly after. Police found a kitchen knife at the crime scene and followed a blood trail to a neighboring building, where Henriques was arrested for the death of her brother's girlfriend. She was charged with second-degree murder and criminal possession of a weapon. Police sources, who were skeptical of the claim, said Henriques maintained the stabbing was in self-defense after Richards came after her.
Authorities say the two women engaged in an argument on Facebook a day earlier. Postings on the social network suggest that Richards lent Henriques $20 for diapers and milk. Henriques instead spent the cash on other items and did not immediately return the money. Excerpts of their heated exchange are below:
Richards: “U DAM RIGHT I’m MAD. I have no f—- kids and I refuse 2take care of any1 eles so yea I will b needing that $20…this is the last time u will con me into giving u money.”
Henriques: “Dnt try to expose me mama but I’m not tha type to thug it ova facebook see u wen u get frm wrk.”
Richards: “Kayla now u getin outa hand … I hope u having fun entertaining the world … Trust, IMA HAVE THE LAST LAUGH!!!”
Henriques: “We will see.”
Family and friends said Richards had taken care of Henriques since her mother died 15 years ago. Last year, Richards threw a baby shower for Henriques and was helping plan a first birthday party in May for her son Alex. When reporters asked if she felt bad about the slaying, Henriques replied: “No, ’cause it was a mistake. I was protecting my kids.”
Richards was awaiting her Law School Admission Test results and hoped to begin law school in the fall. She was a graduate of John Jay College of Criminal Justice and also worked at a nursing home as well as for JPMorgan Chase.

Bing surpasses Yahoo! as second most popular search engine worldwide

Bing is certainly popular worldwide. StatCounter is reporting that Microsoft’s creation has passed Yahoo! as the second most-used search engine globally.

Let’s take a look at the hard numbers. In February, Bing nabbed 4.37% of the global share, just ahead of Yahoo! at 3.93%. They both still pale in comparison to Google’s whopping 89.94% share of the pie, but it is interesting to see Bing growing. All of those TV product placements must be helping.
Domestic results, however, are a different story. Yahoo! still managed to hold on to the silver medal with 9.74%, while Bing trailed ever-so-closely behind at 9.03%. Shockingly, Google actually has a smaller share of the search market in the United States, at just 79.63%. Okay, so that’s still nothing to complain about, but Bing could pull ahead of Yahoo! in March.
Which search engine is your favorite?

Turning bacteria into butanol biofuel factories


The enzyme pathway by which glucose is turned into n-butanol is set against the silhouette of an E. coli bacterium. The pathway, taken from Clostridium bacteria and inserted into E. coli, consists of five enzymes that convert acetyl-CoA, a product of glucose metabolism, into n-butanol (C4H9OH).
While ethanol is today's major biofuel, researchers aim to produce fuels more like gasoline. Butanol is the primary candidate, now produced primarily by Clostridium bacteria. UC Berkeley chemist Michelle Chang has transplanted the enzyme pathway from Clostridium into E. coli and gotten the bacteria to churn out 10 times more n-butanol than competing microbes, close to the level needed for industrial scale production.
University of California, Berkeley, chemists have engineered bacteria to churn out a gasoline-like biofuel at about 10 times the rate of competing microbes, a breakthrough that could soon provide an affordable and “green” transportation fuel.
The advance is reported in this week’s issue of the journal Nature Chemical Biology by Michelle C. Y. Chang, assistant professor of chemistry at UC Berkeley, graduate student Brooks B. Bond-Watts and recent UC Berkeley graduate Robert J. Bellerose.
Various species of the Clostridium bacteria naturally produce a chemical called n-butanol (normal butanol) that has been proposed as a substitute for diesel oil and gasoline. While most researchers, including a few biofuel companies, have genetically altered Clostridium to boost its ability to produce n-butanol, others have plucked enzymes from the bacteria and inserted them into other microbes, such as yeast, to turn them into n-butanol factories. Yeast and E. coli, one of the main bacteria in the human gut, are considered to be easier to grow on an industrial scale.
While these techniques have produced promising genetically altered E. coli bacteria and yeast, n-butanol production has been limited to little more than half a gram per liter, far below the amounts needed for affordable production.
Chang and her colleagues stuck the same enzyme pathway into E. coli, but replaced two of the five enzymes with look-alikes from other organisms that avoided one of the problems other researchers have had: n-butanol being converted back into its chemical precursors by the same enzymes that produce it.
The new genetically altered E. coli produced nearly five grams of n-butanol per liter, about the same as the native Clostridium and one-third the production of the best genetically altered Clostridium, but about 10 times better than current industrial microbe systems.

“We are in a host that is easier to work with, and we have a chance to make it even better,” Chang said. “We are reaching yields where, if we could make two to three times more, we could probably start to think about designing an industrial process around it.”
“We were excited to break through the multi-gram barrier, which was challenging,” she added.

Graduate student Brooks Bond-Watts and post-doctoral fellow Jeff Hanson examine cultured E. coli used to produce the biofuel n-butanol. (Photo by Michael Barnes)
Among the reasons for engineering microbes to make fuels is to avoid the toxic byproducts of conventional fossil fuel refining, and, ultimately, to replace fossil fuels with more environmentally friendly biofuels produced from plants. If microbes can be engineered to turn nearly every carbon atom they eat into recoverable fuel, they could help the world achieve a more carbon-neutral transportation fuel that would reduce the pollution now contributing to global climate change. Chang is a member of UC Berkeley’s year-old Center for Green Chemistry.
The basic steps evolved by Clostridium to make butanol involve five enzymes that convert a common molecule, acetyl-CoA, into n-butanol. Other researchers who have engineered yeast or E. coli to produce n-butanol have taken the entire enzyme pathway and transplanted it into these microbes. However, n-butanol is not produced rapidly in these systems because the native enzymes can work in reverse to convert butanol back into its starting precursors.
Chang avoided this problem by searching for organisms that have similar enzymes, but that work so slowly in reverse that little n-butanol is lost through a backward reaction.
“Depending on the specific way an enzyme catalyzes a reaction, you can force it in the forward direction by reducing the speed at which the back reaction occurs,” she said. “If the back reaction is slow enough, then the transformation becomes effectively irreversible, allowing us to accumulate more of the final product.”
Chang found two new enzyme versions in published sequences of microbial genomes, and based on her understanding of the enzyme pathway, substituted the new versions at critical points that would not interfere with the hundreds of other chemical reactions going on in a living E. coli cell. In all, she installed genes from three separate organisms – Clostridium acetobutylicum, Treponema denticola and Ralstonia eutrophus — into the E. coli.
Chang is optimistic that by improving enzyme activity at a few other bottlenecks in the n-butanol synthesis pathway, and by optimizing the host microbe for production of n-butanol, she can boost production two to three times, enough to justify considering scaling up to an industrial process. She also is at work adapting the new synthetic pathway to work in yeast, a workhorse for industrial production of many chemicals and pharmaceuticals.
Provided by University of California - Berkeley

Missing sunspots: Solar mystery solved


This visible-light photograph, taken in 2008 by NASA's Solar and Heliospheric Observatory (SOHO) spacecraft, shows the sun's face free of sunspots. The sun experienced 780 spotless days during the unusually long solar minimum that just ended. New computer simulations imply that the sun's long quiet spell resulted from changing flows of hot plasma within it. Credit: NASA/SOHO
The Sun has been in the news a lot lately because it's beginning to send out more flares and solar storms. Its recent turmoil is particularly newsworthy because the Sun was very quiet for an unusually long time. Astronomers had a tough time explaining the extended solar minimum. New computer simulations imply that the Sun's long quiet spell resulted from changing flows of hot plasma within it.
"The Sun contains huge rivers of plasma similar to Earth's ocean currents," says Andres Munoz-Jaramillo, a visiting research fellow at the Harvard-Smithsonian Center for Astrophysics (CfA). "Those plasma rivers affect solar activity in ways we're just beginning to understand."
The Sun is made of plasma, a fourth state of matter in which negative electrons and positive ions flow freely. Flowing plasma creates magnetic fields, which lie at the core of solar activity such as flares, eruptions and sunspots.
Astronomers have known for decades that the Sun's activity rises and falls in a cycle that lasts 11 years on average. At its most active, called solar maximum, dark sunspots dot the Sun's surface and frequent eruptions send billions of tons of hot plasma into space. If the plasma hits Earth, it can disrupt communications and electrical grids and short out satellites.
During solar minimum, the Sun calms down and both sunspots and eruptions are rare. The effects on Earth, while less dramatic, are still significant. For example, Earth's outer atmosphere shrinks closer to the surface, meaning there is less drag on orbiting space junk. Also, the solar wind that blows through the solar system (and its associated magnetic field) weakens, allowing more cosmic rays to reach us from interstellar space.
The most recent solar minimum had an unusually large number of spotless days: 780 days during 2008-2010. In a typical solar minimum, the Sun goes spot-free for about 300 days, making the last minimum the longest since 1913.
"The last solar minimum had two key characteristics: a long period of no sunspots and a weak polar magnetic field," explains Munoz-Jaramillo. (A polar magnetic field is the magnetic field at the Sun's north and south poles.) "We have to explain both factors if we want to understand the solar minimum."
To study the problem, Munoz-Jaramillo used computer simulations to model the Sun's behavior over 210 activity cycles spanning some 2,000 years. He specifically looked at the role of the plasma rivers that circulate from the Sun's equator to higher latitudes. These currents flow much like Earth's ocean currents: rising at the equator, streaming toward the poles, then sinking and flowing back to the equator. At a typical speed of 40 miles per hour, it takes about 11 years to make one loop.
A simulation of solar magnetic fields.
Munoz-Jaramillo and his colleagues discovered that the Sun's plasma rivers speed up and slow down like a malfunctioning conveyor belt. They find that a faster flow during the first half of the solar cycle, followed by a slower flow in the second half of the cycle, can lead to an extended solar minimum. The cause of the speed-up and slowdown likely involves a complicated feedback between the plasma flow and solar magnetic fields.
"It's like a production line - a slowdown puts 'distance' between the end of the last solar cycle and the start of the new one," says Munoz-Jaramillo.
The ultimate goal of studies like this is to predict upcoming solar maxima and minima - both their strength and timing. The team focused on simulating solar minima, and say that they can't forecast the next solar minimum (which is expected to occur in 2019) just yet.
"We can't predict how the flow of these plasma rivers will change," explains lead author Dibyendu Nandy (Indian Institute of Science Education and Research, Kolkata). "Instead, once we see how the flow is changing, we can predict the consequences."
Provided by Harvard-Smithsonian Center for Astrophysics

Has the Earth's sixth mass extinction already arrived?

Tigers are one of Earth's most critically endangered species. Extinction of the majority of such species would indicate the sixth mass extinction is in our near future.
With the steep decline in populations of many animal species, from frogs and fish to tigers, some scientists have warned that Earth is on the brink of a mass extinction like those that occurred only five times before during the past 540 million years. Each of these 'Big Five' saw three-quarters or more of all animal species go extinct.
In a study to be published in the March 3 issue of the journal Nature, University of California, Berkeley, paleobiologists assess where mammals and other species stand today in terms of possible extinction, compared with the past 540 million years, and they find cause for hope as well as alarm.
"If you look only at the critically endangered mammals – those where the risk of extinction is at least 50 percent within three of their generations – and assume that their time will run out, and they will be extinct in 1,000 years, that puts us clearly outside any range of normal, and tells us that we are moving into the mass extinction realm," said principal author Anthony D. Barnosky, UC Berkeley professor of integrative biology, a curator in the Museum of Paleontology and a research paleontologist in the Museum of Vertebrate Zoology.
"If currently threatened species – those officially classed as critically endangered, endangered and vulnerable – actually went extinct, and that rate of extinction continued, the sixth mass extinction could arrive within as little as 3 to 22 centuries," he said.
Nevertheless, Barnosky added, it's not too late to save these critically endangered mammals and other such species and stop short of the tipping point. That would require dealing with a perfect storm of threats, including habitat fragmentation, invasive species, disease and global warming.
"So far, only 1 to 2 percent of all species have gone extinct in the groups we can look at clearly, so by those numbers, it looks like we are not far down the road to extinction. We still have a lot of Earth's biota to save," Barnosky said. "It's very important to devote resources and legislation toward species conservation if we don't want to be the species whose activity caused a mass extinction."
Coauthor Charles Marshall, UC Berkeley professor of integrative biology and director of the campus's Museum of Paleontology, emphasized that the small number of recorded extinctions to date does not mean we are not in a crisis.
"Just because the magnitude is low compared to the biggest mass extinctions we've seen in a half a billion years doesn't mean to say that they aren't significant," he said. "Even though the magnitude is fairly low, present rates are higher than during most past mass extinctions."
"The modern global mass extinction is a largely unaddressed hazard of climate change and human activities," said H. Richard Lane, program director in the National Science Foundation's Division of Earth Sciences, which funded the research. "Its continued progression, as this paper shows, could result in unforeseen – and irreversible – negative consequences to the environment and to humanity."
The study originated in a graduate seminar Barnosky organized in 2009 to bring biologists and paleontologists together in an attempt to compare the extinction rate seen in the fossil record with today's extinction record. These are "like comparing apples and oranges," Barnosky said. For one thing, the fossil record goes back 3.5 billion years, while the historical record goes back only a few thousand years. In addition, the fossil record has many holes, making it impossible to count every species that evolved and subsequently disappeared, which probably amounts to 99 percent of all species that have ever existed. A different set of data problems complicates counting modern extinctions.
Dating of the fossil record also is not very precise, Marshall said.
"If we find a mass extinction, we have great difficulty determining whether it was a bad weekend or it occurred over a decade or 10,000 years," he said. "But without the fossil record, we really have no scale to measure the significance of the impact we are having."
To get around this limitation, Marshall said, "This paper, instead of calculating a single death rate, estimates the range of plausible rates for the mass extinctions from the fossil record and then compares these rates to where we are now."
Barnosky's team chose mammals as a starting point because they are well studied today and are well represented in the fossil record going back some 65 million years. Biologists estimate that within the past 500 years, at least 80 mammal species have gone extinct out of a starting total of 5,570 species.
The team's estimate for the average extinction rate for mammals is less than two extinctions every million years, far lower than the current extinction rate for mammals.
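A rough back-of-the-envelope comparison using the figures quoted above gives a sense of the gap, though it is only an illustration; the paper itself works with ranges of plausible rates, and reading the background figure as extinctions per million species-years is our assumption, not the study's stated unit:

```python
# Back-of-the-envelope comparison using the numbers quoted in this article.
# Assumption (ours, not the paper's): the "fewer than two extinctions every
# million years" background figure is read as ~2 extinctions per million
# species-years (E/MSY), a standard normalization for extinction rates.

species = 5_570          # approximate number of mammal species 500 years ago
years = 500              # observation window
observed_extinctions = 80
background_e_msy = 2.0   # assumed background rate, extinctions per million species-years

# Extinctions the background rate would predict over the same window:
expected = background_e_msy * species * years / 1_000_000

print(f"Expected at background rate: ~{expected:.1f} extinctions")
print(f"Observed:                     {observed_extinctions} extinctions")
print(f"Observed / expected:         ~{observed_extinctions / expected:.0f}x")
# Roughly an order of magnitude above background, even before counting
# currently threatened species that have not yet gone extinct.
```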
"It looks like modern extinction rates resemble mass extinction rates, even after setting a high bar for defining 'mass extinction,'" Barnosky said.
After looking at the list of threatened species maintained by the International Union for Conservation of Nature (IUCN), the team concluded that if all mammals now listed as "critically endangered," "endangered" and "threatened" go extinct, whether that takes several hundred years or 1,000 years, Earth will be in a true mass extinction.
"Obviously there are caveats," Barnosky said. "What we know is based on observations from just a very few twigs plucked from the enormous number of branches that make up the tree of life."
He urges similar studies of groups other than mammals in order to confirm the findings, as well as action to combat the loss of animal and plant species.
"Our findings highlight how essential it is to save critically endangered, endangered and vulnerable species," Barnosky added. "With them, Earth's biodiversity remains in pretty good shape compared to the long-term biodiversity baseline. If most of them die, even if their disappearance is stretched out over the next 1,000 years, the sixth mass extinction will have arrived."
Provided by University of California - Berkeley

Apple failed to move hardware bar with iPad 2, say experts


But it retains edge in tablet sales experience, iOS and App Store
By the time that Apple CEO Steve Jobs wrapped up today's launch of a revamped iPad, analysts were already calling it "incremental" and pointing out that the new tablet delivers "no surprises."
The bottom line? Contrary to Jobs' assertion that iPad 2 will stymie what he called "copy cats," Apple hasn't staked out an insurmountable hardware position.
"Apple didn't really move the bar all that much," said Jack Gold, an analyst at J. Gold Associates. "I don't see this as heads above the competition, especially the Xoom, right now. Apple fans who want the latest will buy this or upgrade, but I don't see any overwhelmingly compelling capabilities that would make people sitting on the tablet fence go out and buy one."
Other experts echoed Gold's take.
"It's all very nice -- smaller, lighter faster, but there were no surprises," said Ezra Gottheil, an analyst with Technology Business Research. "Is it nicer? Yes. But it all was predictable, things that everyone was betting on, including competitors."
Stephen Baker of retail research firm NPD Group chimed in as well on the theme.
"It seemed like this time, everyone knew everything ahead of time," said Baker. "It's all incremental. But there are only so many ways you can surprisingly change things."
The iPad 2, which according to Apple features a dual-core processor, faster graphics and two built-in cameras, will go on sale March 11 in the U.S. at the same price points as the original tablet: The Wi-Fi model starts at $499, while the 3G device starts at $629.
Yesterday, Baker said Apple might preempt actual and potential tablet rivals by going aggressive on price, but the company didn't take his last-minute advice.
Nor did it stake out features that other tablets won't sport within months. "The specs are basically what everyone else is coming out with in three to four months," said Baker. "We're at a point where this set of features will be similar across every device, at least for this round."
But the fact that Apple didn't feel the need to drop the price or radically rework the iPad speaks volumes about its place in the tablet market, which the company essentially kick-started last year, selling nearly 15 million of the devices.
"They're clearly not seeing any constraints on the market at $500," said Gottheil. "At some point, they may drop the price to, say, $400, but they won't do that until they need to."
And Apple has an advantage because of less tangible elements that its competitors still lack. "They have the first second-generation [iPad] out there," Baker said, "and a year's worth of sales and experience with tablets."
Or maybe the talk of hardware is all just noise.
"Let's be honest," said Baker. "What Apple's move today points to is the fact that hardware is not all that important. The biggest differentiator between Apple and [other tablet makers] is iOS versus the others, its App Store versus everyone else. If Apple can continue to build the better experience, its success isn't hardware dependent."
While there may have been a lack of drama about the iPad 2 -- the name bloggers had stuck on the expected upgrade months ago, and that Apple co-opted -- it was balanced by Jobs' presence at the San Francisco event. It was his first public appearance since he announced in January that he was taking an indefinite medical leave, and it made up for the absence of a "One more thing" surprise of the kind that has made Apple's product launches famous.
"Steve Jobs doing the presentation, that was important," said Gottheil. Jobs received a standing ovation from the crowd of reporters and others at the invitation-only event.

HP touts greater capacity of new 11n Wi-Fi access points

Hewlett-Packard's new 802.11n Wi-Fi access points include two models that support three data streams, capable of yielding a data rate of 450Mbps per radio, or 900Mbps per access point. That translates into greater throughput, sustained over longer distances, compared with products that use two data streams.
HP says the high end of the new line can handle up to 50% more Wi-Fi clients or 50% greater bandwidth than 11n products with two data streams. All three of the new 400 series models, each with two radios, are aimed at the enterprise market. One is an entry-level 11n access point with two data streams; the other two, both with three streams, differ only in their antenna configuration: one has antennas built in under the covers, while the other has exterior fittings to mount directional antennas for greater range.
The new access points also activate a range of optional radio management features in the 802.11n standard. Together, these let the access point and client exchange more information about the RF environment, and let the access point make a range of adjustments for a stronger, better-quality connection for each client.
As part of the new product rollout, HP also announced updated firmware, version 5.5, for its Multi Service Mobility (MSM) controller, and the 3.10 version of the Mobility Manager network management application.
The 802.11n standard separates the data stream into substreams, each corresponding to a separate transmit-and-receive antenna pairing between the client and the access point. More streams and antenna pairs mean a greater data rate and greater resulting throughput, along with the ability to sustain that throughput over longer distances. HP is using the latest 3x3 and 2x2 chips from Atheros for the new products.
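The arithmetic behind those headline numbers is simple multiplication. The sketch below assumes the commonly quoted figure of 150Mbps per spatial stream, which corresponds to a 40MHz channel with the short guard interval; it is an illustration of how the rates scale, not HP's specification sheet:

```python
# Simplified illustration of how 802.11n peak PHY rates scale with spatial
# streams and radios. Assumes a 40MHz channel and 400ns short guard interval,
# the configuration behind the commonly quoted 150Mbps-per-stream figure.

PER_STREAM_MBPS = 150   # max 802.11n rate per spatial stream (40MHz, short GI)

def peak_rate(streams: int, radios: int = 1) -> int:
    """Peak PHY data rate in Mbps for a given stream count and radio count."""
    return PER_STREAM_MBPS * streams * radios

print(peak_rate(streams=2))            # 300 Mbps: two-stream access point, one radio
print(peak_rate(streams=3))            # 450 Mbps per radio, as in the three-stream models
print(peak_rate(streams=3, radios=2))  # 900 Mbps per dual-radio access point
```

Actual throughput is always lower than the peak PHY rate, which is consistent with the sustained figures of "well over" 200Mbps per radio that the Glendale beta site reports below.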
The result is a Wi-Fi network that's more reliable, and creates better quality connections for more clients, even if the 11n clients only have two or even one antenna.
And that's been the experience of one beta site for the new HP 400 series: Glendale Community College, with three campuses in Glendale, Ariz., about 11 miles northwest of Phoenix. All three sites have been using older HP WLAN gear since 2008, 98 access points all told. They are a mix of 802.11a/b/g units, with newer 11n products installed in some high-traffic areas; the 11n devices use Gigabit Ethernet links back to the LAN. The two remote locations use metro Ethernet WAN service back to the data center on the main campus.
About 5% of the 22,000 faculty, staff and students are regular Wi-Fi users, according to Joshua Krek, senior network administrator with the college's Office of Information Technology.
Last year, the college deployed a mix of all three new access points in high-use areas of the main campus, such as the student union building. There were eight in the beta test: the entry-level 430, the 460 with built-in radios, and the 466 with exterior antenna mounts, the latter for a mesh deployment covering some outside spaces.
Glendale's tests found that a single Wi-Fi client, a laptop, could easily see sustained throughput of "well over" 200Mbps per radio, or more than 400Mbps for each access point, for the 460 and 466 models, Krek says. That meant more throughput could be shared among the 20-30 clients typically associated with each access point.
And the throughput stayed high, and reliable, over much longer distances than before. The exterior mesh using the 466 models "still came close to 200Mbps" over a distance of about two football fields, roughly 750 feet. That was an almost ideal deployment: straight line-of-sight with no obstacles.
Like a growing number of WLAN vendors, HP has introduced with its newest 11n access points several radio optimization features that are optional in the 802.11n standard. These include:
- Beam forming: Based on information from the client radio, the access point can adapt its transmit beam signal for a specific, individual client, optimizing the connection.
- Band steering: The access point can detect whether the client radio can run in the 5GHz band; if so, the access point can shift that client from 2.4GHz to 5GHz automatically. The higher band has more non-overlapping channels, can offer a larger number of combined channels for maximum throughput, and has less interference than 2.4GHz. (A simplified sketch of this kind of steering decision follows the list.)
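The sketch below shows how such a band-steering decision might look in code. It is a generic illustration of the technique, not HP's implementation; the capability fields, signal threshold and load comparison are assumptions chosen for the example:

```python
# Generic illustration of a band-steering decision, not HP's implementation.
# The capability fields and the thresholds below are assumptions chosen
# purely for the example.
from dataclasses import dataclass

@dataclass
class Client:
    mac: str
    supports_5ghz: bool     # learned from the client's probe/association requests
    rssi_5ghz_dbm: int      # signal strength the AP sees from the client on 5GHz

def choose_band(client: Client, load_2_4ghz: int, load_5ghz: int) -> str:
    """Steer dual-band clients to 5GHz when signal strength and load allow it."""
    if not client.supports_5ghz:
        return "2.4GHz"     # single-band client: nothing to steer
    if client.rssi_5ghz_dbm < -75:
        return "2.4GHz"     # 5GHz signal too weak at this range
    if load_5ghz > load_2_4ghz + 10:
        return "2.4GHz"     # 5GHz radio already carrying far more clients
    return "5GHz"           # more channels, less interference

print(choose_band(Client("aa:bb:cc:dd:ee:ff", True, -60), load_2_4ghz=12, load_5ghz=5))
```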
In addition, the 466 model, with exterior antennas, can support what HP calls "concurrent radio operation" in the 5GHz band, meaning both radios can be set to run in that band, without interfering with each other.
All three of the new products can support full 11n features and performance over existing 802.3af power-over-Ethernet infrastructures, according to Roger Sands, director of mobility solutions for HP Networking.
Mobility Manager 3.10 includes a range of improvements, many of them focused on client tracking and trouble-shooting, according to Sands. Network administrators now can see a history of a specific client's locations, and correlate those with Wi-Fi performance metrics, for example.
The new access points, updated controller firmware, and new release of the network management application are all available now.
The E-Series Multi Service Mobility (MSM) 430 has a U.S. list price of $699; the MSM460 and 466, with three spatial streams, list for $999. All three come with HP's lifetime warranty. More information can be found online at HP.