
Wednesday, March 23, 2011

Microsoft sends Mozilla another cake for Firefox 4 release

Microsoft has once again sent a congratulatory cake to Mozilla, this time for the release of Firefox 4. This would be the third cake the software giant has sent; it's become a tradition that is celebrated upon major version releases of the open-source browser.
"This is a fine gesture," Frédéric Wenzel, a Mozilla software engineer, said in a statement. "Hat tip to our fellow browser makers in Redmond. Remember: Competition is good for business."
The tradition began in October 2006, when the Internet Explorer 7 development team sent a cake to Mozilla for successfully shipping Firefox 2. Given the state of the browser wars at the time, many joked about the cake being poisoned, while others teasingly suggested that Mozilla send a cake back along with the recipe, in reference to the fact that the organization produces open-source software. The IE8 development team sent another cake on June 17, 2008, when Firefox 3 was released.
Mozilla has announced that in 2011, there will be four major releases of its browser. That means that there is still Firefox 5.0, Firefox 6.0, and Firefox 7.0 to come. Firefox 5 is slated for a late June 2011 release. Did Mozilla consider Microsoft's marketing budget when it began planning to change its development cycle?

Microsoft rolling out 'NoDo' update for Windows Phone 7

After facing a delay to iron out a few issues, Microsoft has finally begun distributing the first full update for Windows Phone 7 devices. Dubbed NoDo, as in "No Donuts" -- in reference to Google's Android dessert naming scheme -- the update brings the much-awaited copy-and-paste functionality, as well as faster app and game performance, streamlined Marketplace search, improved Wi-Fi, Outlook, and MMS connectivity, and some tweaks to the Facebook integration.
The new update was supposed to be released in the first two weeks of March. But a series of glitches with a small patch in February, aimed at preparing Windows Phone 7 devices for future updates, led Microsoft to revise the schedule. Back then, a reportedly 'small number' of users -- mostly Samsung device owners -- were left with a bricked device. Microsoft stopped the update to fix the bugs but even the follow-up release caused some issues.
Not surprisingly, the company is taking a careful approach by slowly distributing this update. Based on numerous reports, users with unlocked phones are among the first to get the update, while phones that sport carriers' brands will possibly take as long as four weeks to receive the download notification. As for new handsets, like the recently announced HTC Arrive on Sprint, they will ship with NoDo already installed.
Microsoft is already working on the next big software version for its Windows Phone platform, codenamed Mango, which is expected by the end of 2011. The release should come with its own version number, most likely 7.5, and bring a host of new features, including multitasking, Internet Explorer 9, SkyDrive integration and Twitter people hub support. In the meantime, check the full list of improvements in NoDo from Microsoft's official page after the break.

Toshiba's Portege R835 ultraportable up for pre-order

Toshiba has begun accepting pre-orders for its newest ultraportable Portege laptop. Starting at $899, the R835-ST3N01 packs a dual-core Core i3-2310M processor with built-in graphics, 4GB of RAM, a 13.3-inch display with 1366 x 768 resolution, 640GB 5,400 RPM hard drive, Bluetooth 3.0, DVD burner, a 66-Wh battery rated for up to nine hours of run time, and everything you'd expect port-wise, including eSATA connectivity, USB 3.0, as well as HDMI. 

Spending an extra $30 upgrades you to a quicker Core i5-2410M CPU, and there are a couple of other closely spec'd options on Toshiba's website, but so far only the $899 R835-ST3N01 model appears to be up for pre-order. Whichever route you take, you get a good-looking, capable laptop weighing around 3.2 lbs and measuring 0.72 to 1.05 inches thick, featuring a crafted magnesium casing and a honeycomb rib structure for extra durability. 


The Portege R705 was very well received last year for being one of the lightest laptops you could find with both a roomy 13-inch display and a built-in optical drive. Now, refreshed with the latest Intel Sandy Bridge processors and improved integrated graphics, the 13-inch R835 is looking to be another winner from Toshiba.

Tuesday, March 22, 2011

Samsung: Galaxy Tab 10.1, Galaxy Tab 8.9 coming this summer

Samsung has officially announced two additions to its family of Galaxy Tabs: the Galaxy Tab 10.1 and the Galaxy Tab 8.9. The WiFi version of the 10.1 will arrive on June 8, 2011 in 16GB ($500) and 32GB ($600) flavors while the 8.9 will also be available in 16GB ($470) and 32GB ($570) versions. They join the original 7-inch Galaxy Tab, which has already been quite a success for the company.
The hardware giant is touting the duo as the world's thinnest mobile tablets, at just 8.6 millimeters thick. The Galaxy Tab 10.1 weighs 595 grams while the Galaxy Tab 8.9 weighs 470 grams. Both support HSPA+ network speeds of up to 21Mbps, Bluetooth, and 802.11 a/b/g/n Wi-Fi connectivity. They also feature a 3-megapixel rear camera and a 2-megapixel front camera, providing seamless 1080p HD video. The two devices are powered by a 1GHz dual-core application processor, but for more detailed specs check out the following images:

What the specs don't mention is that the two devices come preloaded with Readers Hub and Music Hub, giving consumers instant access to more than 2.2 million books, 2,000 newspapers (49 languages), 2,300 magazines (22 languages), and 13 million songs. The devices are also designed with Samsung's Social Hub, which aggregates email, instant messaging, contacts, calendar, and social network connections into a single interface.
Both devices feature Samsung's own TouchWiz user interface, which sits on top of Android 3.0 (codenamed Honeycomb). A Live Panel menu allows users to customize a variety of content on the home screen including digital pictures, favorite websites, and social network feeds. There's also a Mini Apps Tray for commonly used features such as task manager, calendar, and music player which can be launched while other major applications are also in use, including large file downloads and document editing.
"The Galaxy Tab 10.1 and 8.9 are remarkable examples of Samsung's constant innovation and show our dedication to designing premium tablets that fit the unique needs of consumers around the world," JK Shin, President and Head of Samsung's Mobile Communications Business, said in a statement. "By combining Samsung's innovations in design and display with our exciting new user experience, we've created a new class of products that will lead the tablet market."

Verizon preps LTE deployment in 59 more US cities

Verizon plans to expand its 4G LTE service to 59 more cities by the end of 2011, including Tallahassee, Little Rock, Charleston, Honolulu, and Milwaukee. The mobile broadband network launched last December in 39 markets and Verizon announced 49 more during January's Consumer Electronics Show. In all, the carrier should have at least 147 US cities covered by LTE this year (complete list here).

The list largely comprises major cities, but there are a handful of smaller areas such as Decatur, Alabama, and Fort Smith, Arkansas. Verizon says those markets represent its "commitment to reach deep into medium-sized cities and smaller communities by the end of 2011." The carrier vowed to fill its 3G footprint with LTE by 2013, covering 200 million people. We assume this deployment is still on track.


This announcement follows roughly one week after Verizon unveiled its first 4G LTE smartphone, the HTC Thunderbolt. The device runs Android 2.2 Froyo and features a 4.3-inch WVGA display, a 1GHz Snapdragon processor, HD video recording, an 8-megapixel rear camera and 1.3-megapixel front camera, and wireless DLNA support. Pricing starts at $249.99 with a two-year service contract.

Four lines of code to hop over The New York Times paywall

Dave Eurica, a Canadian reader of the online version of The New York Times, was rather annoyed to find that the website was blocking him from reading a lot of articles. He looked into the matter and found that the "paywall" erected by the publication was implemented in JavaScript. This means the blocked articles are actually loaded in the page's HTML behind the scenes.
Eurica wrote a bookmarklet that you can put on your bookmarks toolbar to get around the block. Anytime nytimes.com blocks you on a page, all you have to do is click it to show the usual content (it does nothing on any other website). The solution was written in just four lines of code, one of which is a comment:
//Prototype is already installed on NYTimes pages, so I'll use that:
$('overlay').hide();
$('gatewayCreative').hide();
$(document.body).setStyle( { overflow:'scroll' } );
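For the curious, those lines become a "bookmarklet" simply by being wrapped in a javascript: URL that you save as a bookmark. The sketch below shows the generic packaging recipe, not Eurica's exact build; the element IDs come straight from the snippet above.

```javascript
// Sketch: wrapping the paywall-hiding snippet into a one-click bookmarklet.
// A bookmarklet is just a javascript: URL; wrapping the code in an
// immediately-invoked function keeps its variables off the page's globals.
function makeBookmarklet(code) {
  return 'javascript:(function(){' + encodeURIComponent(code) + '})();';
}

// The three Prototype calls from the article, minus the comment line:
var snippet =
  "$('overlay').hide();" +
  "$('gatewayCreative').hide();" +
  "$(document.body).setStyle({overflow:'scroll'});";

var bookmarklet = makeBookmarklet(snippet);
// Paste the resulting string into a bookmark's URL field.
console.log(bookmarklet.slice(0, 60) + '...');
```

Clicking the saved bookmark then runs the snippet against whatever page is open, which is why it only has a visible effect on pages that actually contain those overlay elements.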
The New York Times subscription plan was officially announced last week. Readers who view more than 20 articles will have to pay $15 a month to access NYTimes.com and the paper's mobile app, $20 for the site and its iPad app, or $35 for all three platforms. Subscribers to the physical paper will receive unlimited access across all digital platforms except e-readers.
The block hasn't yet started for everyone: that's going to happen on March 28, 2011. The 20-article cap was first applied to Canada, however, as the site needed time to find unresolved bugs before the feature is slammed by US visitors. Apparently Canadians are just as interested in climbing over virtual walls as their American counterparts are.

Google gets a ridiculous patent for Google Doodles

After filing for it nearly 10 years ago (in April 2001), Google has been granted US Patent 7,912,915, which is titled "Systems and methods for enticing users to access a web site." The patent lists Google co-founder Sergey Brin as the inventor of the feature, which is now known as Google Doodles - the custom logos that the company has been putting on its home page to mark special occasions since the year 2000.
Here's the patent's abstract:
A system provides a periodically changing story line and/or a special event company logo to entice users to access a web page. For the story line, the system may receive objects that tell a story according to the story line and successively provide the objects on the web page for predetermined or random amounts of time. For the special event company logo, the system may modify a standard company logo for a special event to create a special event logo, associate one or more search terms with the special event logo, and upload the special event logo to the web page. The system may then receive a user selection of the special event logo and provide search results relating to the special event.
We're having a bit of trouble understanding how Google was granted this patent, but then again the US Patent system is in complete disarray. What is even harder to grasp is how Google will possibly enforce this patent or what it gains from having the IP rights to changing a company logo for special events.
Here are some recent "doodles" as seen on Google's homepage:
 
Thomas Edison's Birthday Doodle

Alienware M11x getting Sandy Bridge in early April

Dell is supposedly preparing to update its pint-sized gaming laptop sometime early next month. If the reports hold up, the Alienware M11x R3 will arrive with Intel's Sandy Bridge processors, quicker RAM and more storage. Processor options will include the Core i5-2537M (1.4GHz base with up to 2.3GHz speeds via Turbo Boost 2.0), i7-2617M (1.5GHz to 2.6GHz) and i7-2657M (1.6GHz to 2.7GHz). 

System memory will include 1GB, 2GB, 4GB and 8GB options but it will operate at 1333MHz DDR3 instead of 800MHz. Mechanical storage will still start at 250GB but the maximum capacity will be increased from 640GB to 750GB. Samsung's 256GB P810 SSD will be offered on the side. The third revision is also expected to receive options for 3G, LTE or WiMax mobile broadband connectivity. 


A few key specifications are still unknown, namely the graphics processor. Being a gaming machine, the chosen GPU could make or break the M11x's appeal. There is some concern that Dell will opt for Sandy Bridge's integrated graphics core instead of providing a discrete solution. The current system ships with a 1GB GeForce GT 335M, a mid-grade chip capable of running demanding titles, if only on low settings. 

It's also unclear whether the refresh will bring USB 3.0 as the existing model only comes with USB 2.0. The system will come in red, black and "SoftTouch" options, but we assume Dell hasn't made any drastic cosmetic changes. Pricing is unknown, but last year's Core i5-520UM model currently starts at $999 and we've always gotten the impression that Dell wants to keep the M11x reasonably affordable.

Verizon: we have no interest in Sprint

Verizon Wireless CEO Daniel Mead has declared that he has no interest in buying Sprint Nextel. Verizon, a joint venture of Verizon Communications and Vodafone Group, will lose its top position in the US wireless market due to a recently announced decision by AT&T to acquire T-Mobile.
"We're not interested in Sprint. We don't need them," Mead told Reuters ahead of the CTIA Wireless Conference. In regards to the AT&T/T-Mobile deal having to go through US regulatory approval, he said: "Anything can go through if you make enough concessions." The deal will likely be approved if the companies agree to certain conditions, and AT&T is expected to have to sell some assets.
Late last week, AT&T announced that it had entered into a definitive agreement to acquire T-Mobile USA from Deutsche Telekom in a cash-and-stock transaction valued at approximately $39 billion. Interestingly, Mead said he would not oppose AT&T's plans because he does not want his company to be distracted from its goal of being the most profitable US wireless operator.
Sprint Nextel CEO Dan Hesse is a lot more worried about AT&T's decision to acquire T-Mobile than Verizon's CEO appears to be. He said he's concerned that the deal will hurt his company and the industry, as the biggest two players strengthen their dominance. "I do have concerns that it would stifle innovation and too much power would be in the hands of two," Hesse said in a panel discussion at a cellphone conference in Orlando, Florida, according to The Associated Press.
Mead's cool stance is very odd. More often than not, companies are up in arms when their competitors join forces against them. It's possible that the Verizon CEO is banking on the merger slowing the two down while his company plows ahead.

Monday, March 21, 2011

LG Super-Multi N2A2 2TB NAS

Over the past few years we've reviewed a handful of NAS (Network Attached Storage) products and while most of them have been very impressive, they all shared one common pitfall: they were extremely pricey. So expensive, in fact, that they're little more than a pipe dream for the average user. 

So when LG told us they had a new 2-bay NAS device that costs less than $250 with a pair of 1TB hard drives included, we sat up and took notice. 


As we have repeatedly discovered when dealing with NAS products, there is much more to consider than just storage capacity and price. These devices are highly dependent on the hardware and software that drive them, and either can make or break the product. With that in mind, we're extremely eager to find out what makes the LG N2A2 tick. Let's take a closer look...

Software to predict 'March Madness' basketball winner

Fine, computers, you can beat us at chess and Jeopardy!, just please let us keep March Madness. With the US National Collegiate Athletic Association's basketball tournament starting today, contestants in the second annual March Madness Predictive Analytics Challenge are attempting to build software that can pick winning teams better than humans.
The contest pits machine against machine to find out which algorithm can correctly predict the outcome of the 64-team contest. Tournament brackets must be chosen entirely by computer algorithm, and no specific team-based rules, such as "always pick Duke over North Carolina", are allowed. All contestants are restricted to using the same data set - team and player statistics from the 2006 season until last month.
Contest organiser Danny Tarlow's own entry started out as a movie recommendation engine similar to those used on sites like Netflix. He says that predicting what movie a particular person would like to see is similar to predicting how well a basketball team's attack will do against their opponent's defence: both interactions are driven by unknown rules.
To predict the result of a basketball game, his algorithm chews through loads of regular season data and uses probabilities to find equations that fit the outcomes of each game. It then uses these equations to pick which teams will win in tournament match-ups. "The algorithm knows nothing about basketball or details about any team. It just sees the outcome of each game in the season, and it tries to discover latent characteristics that best explain the outcomes," he says.
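Tarlow's actual model isn't spelled out in the article, but the general idea he describes, learning latent per-team characteristics purely from game outcomes, can be illustrated with a minimal Bradley-Terry-style sketch. The teams and results below are invented for the example; this is the flavor of the approach, not his algorithm.

```javascript
// Illustrative only: learn one latent "strength" rating per team from game
// outcomes alone, then use the ratings to predict match-ups. The model never
// sees anything about basketball, only who beat whom.
var games = [
  { winner: 'A', loser: 'B' },
  { winner: 'A', loser: 'C' },
  { winner: 'B', loser: 'C' },
  { winner: 'A', loser: 'B' },
];

function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

function fitRatings(games, steps, lr) {
  var r = {};
  games.forEach(function (g) { r[g.winner] = 0; r[g.loser] = 0; });
  for (var i = 0; i < steps; i++) {
    games.forEach(function (g) {
      // Model: P(winner beats loser) = sigmoid(rating difference).
      // Gradient step nudges both ratings toward explaining the result.
      var p = sigmoid(r[g.winner] - r[g.loser]);
      r[g.winner] += lr * (1 - p);
      r[g.loser]  -= lr * (1 - p);
    });
  }
  return r;
}

var ratings = fitRatings(games, 500, 0.1);

function predictWinner(a, b) {
  return sigmoid(ratings[a] - ratings[b]) > 0.5 ? a : b;
}
```

After fitting, the learned ratings order the teams by how well they explain the season's results, so `predictWinner('A', 'C')` picks A, the team that won every game in the toy data.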
Other entries range from using genetic algorithms to evolve equations that can pick winners to more straightforward attempts to boil down a team's strengths and weaknesses to a single number, then pick the team with the higher number in each match-up.
Last year's contest had 10 entries, including a "pace" bracket that simply picked the higher-seeded team in each matchup. Six of the entries did better than this baseline, one even predicting underdog Butler University's surprising ascent to the final four.
Tarlow hopes for a better performance this year, but is well aware of the difficulty of predicting the outcome of an entire basketball tournament. "There's clearly a lot of luck that goes into having a successful bracket."
We'll know how the software programs fare soon - the round of 64 begins today.

Defending against botnet attacks – by fighting back

Online malcontents who try to take down web servers now face a new kind of defence system that can fight back. A common attack known as distributed denial of service (DDoS) knocks websites offline by flooding them with traffic from a horde of infected computers, or botnet. Yuri Gushin and Alex Behar of the internet security firm Radware, based in Tel Aviv, Israel, say they have turned this attack back on its perpetrators.
Most current DDoS defences work by blocking connections from attacking computers or throttling data rates to let only a certain amount of traffic through. But these methods can't handle the larger attacks that are increasingly common. "Today's technologies are not cutting it," says Gushin.
The pair's more sophisticated technique manipulates an attacker's connection in order to make botnet computers work harder. By intentionally ignoring part of the connection request, they are able to trick the attacker's computer into making a very slow connection to the server as it continues to try to make contact. This lasts for around five minutes. When an attacking botnet computer is slowed down in this way it will automatically try to send new connection requests, badly affecting its performance. Eventually the botnet computers making the attack will be forced to give up, depending on the instructions given to them by the botmaster who launched the attack.
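The article doesn't give implementation details, but the arithmetic behind the tarpit idea is easy to sketch. A toy back-of-the-envelope model (all numbers invented): an attacker with a fixed pool of sockets completes far fewer requests when each connection is held open for minutes instead of milliseconds.

```javascript
// Toy model of the tarpit defence described above: each attacking socket can
// only start a new request once its previous connection finishes, so holding
// connections open throttles the whole botnet. Numbers are invented.
function requestsPerMinute(sockets, secondsPerConnection) {
  return sockets * (60 / secondsPerConnection);
}

var botSockets = 1000;
var normal = requestsPerMinute(botSockets, 0.05);   // a quick ~50 ms request
var tarpitted = requestsPerMinute(botSockets, 300); // held ~5 minutes, as in the article

console.log(normal);    // → 1200000
console.log(tarpitted); // → 200
```

Under these assumptions the same botnet delivers four orders of magnitude less attack traffic, which is why forcing slow connections can "turn the tide" without blocking anyone outright.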
This approach proved successful last year, when Gushin and Behar helped defend against attacks perpetrated by Anonymous, the loose collective of internet activists that took up digital arms to fight for WikiLeaks. "We were able to really turn the tide on the attack," says Behar.

AMD: DirectX "getting in the way" of PC gaming graphics

AMD believes the disparity between PC gaming graphics and console gaming graphics isn't nearly as great as it should be, despite the huge hardware advantage the PC holds over consoles. Despite the hardware giant's good relationship with Microsoft, Richard Huddy, AMD's worldwide developer relations manager for the company's GPU division, blames the software giant.
"It's funny," Huddy told bit-tech. "We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way." Huddy says that one of the most common requests he gets from game developers is to "make the API go away."
This quote comes hot on the heels of a related statement from id Software co-founder John Carmack. While Huddy offers one perspective from the hardware side of things, and says that game developers agree with him, Carmack recently gave a different opinion from the game development angle: despite id being an OpenGL house, he stated that DirectX is a better API than OpenGL.
While Huddy said nothing about OpenGL, he clearly has not talked to Carmack. "I certainly hear this in my conversations with games developers, and I guess it was actually the primary appeal of Larrabee to developers – not the hardware, which was hot and slow and unimpressive, but the software – being able to have total control over the machine, which is what the very best games developers want," Huddy said. "By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft – no doubt at all. Wrapping it up in a software layer gives you safety and security but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate."

AT&T to acquire T-Mobile for $39 billion

AT&T has announced that it has entered into a definitive agreement to acquire T-Mobile USA from Deutsche Telekom in a cash-and-stock transaction currently valued at approximately $39 billion. With all of the rumors of Sprint and T-Mobile joining up, this may be a little surprising, but strategically it makes more sense since the two carriers have similar networks.
The agreement has been approved by the Boards of Directors of both companies. Deutsche Telekom will receive an equity stake in AT&T that gives it an ownership interest of approximately 8 percent. A Deutsche Telekom representative will join the AT&T Board of Directors. The deal is expected to close, pending regulatory approval, within the next 12 months.
While AT&T certainly wants T-Mobile for its subscribers, it's also clearly interested in grabbing its network assets for LTE (Long Term Evolution) deployment, improving network quality for both companies' customers, and gaining more of the limited wireless spectrum. With this transaction, AT&T is committing to a significant expansion of robust 4G LTE deployment to 95 percent of the US population to reach an additional 46.5 million Americans beyond current plans – including rural communities and small towns.
"This transaction represents a major commitment to strengthen and expand critical infrastructure for our nation's future," Randall Stephenson, AT&T Chairman and CEO, said in a statement. "It will improve network quality, and it will bring advanced LTE capabilities to more than 294 million people. Mobile broadband networks drive economic opportunity everywhere, and they enable the expanding high-tech ecosystem that includes device makers, cloud and content providers, app developers, customers, and more. During the past few years, America's high-tech industry has delivered innovation at unprecedented speed, and this combination will accelerate its continued growth. This transaction delivers significant customer, shareowner and public benefits that are available at this level only from the combination of these two companies with complementary network technologies, spectrum positions and operations. We are confident in our ability to execute a seamless integration, and with additional spectrum and network capabilities, we can better meet our customers' current demands, build for the future and help achieve the President's goals for a high-speed, wirelessly connected America."

RSA's SecurID hacked

RSA says hack won't allow "direct attack" on SecurID tokens Security firm RSA has been the victim of an "extremely sophisticated" attack that has resulted in the exfiltration of certain private information, announced Executive Chairman Art Coviello in an open letter published yesterday. The company also filed a note with the SEC, warning of possible risks due to the attack. Since 2006, RSA has been part of EMC. Ars Technica
'Pruned' microchips are leaner and meaner If you had to use a commuting bicycle in a race, you would probably set about removing the kickstand, fenders, racks and lights to make the thing as fast and efficient as possible. When engineers at Houston's Rice University are developing small, fast, energy-efficient chips for use in devices like hearing aids, it turns out they do pretty much the same thing. Gizmag
AMD preparing 3.7GHz Phenom II X4 980 New details about AMD's roadmap continue to surface. According to our information, AMD will keep updating its existing product range alongside the upcoming Bulldozer processors. In addition to the Phenom II X4 975 Black Edition, currently the fastest model in the family at 3.6GHz, a Phenom II X4 980 Black Edition running at 3.7GHz is also being prepared. DonanımHaber (translated)
AT&T shames unauthorized phone tetherers, gives ultimatum AT&T has begun cracking down on smartphone customers—particularly iPhone users—who have been secretly tethering their smartphones to their laptops without paying for a tethering plan. The company sent text messages to the offending users, followed by an e-mail, saying they've been identified as taking advantage of the feature without paying up. Ars Technica
Dell offers unlocked Streak for $99 with purchase of a new PC Dell's Streak 5 tabletphone hasn't held our interest much lately, even with Android 2.2 on board, but Dell's got a new deal that may be too good to pass up. Engadget
Analysis: Is Anonymous a real political movement or a lame gag? The mysterious hackers of Anonymous have been more busy than usual this week. But are they a legitimate movement or a bad joke? TGDaily
Radiation dose chart This is a chart of the ionizing radiation dose a person can absorb from various sources. xkcd
Widespread freezing issue with 17-inch MacBook Pro 2011? Apple forum
"Mario" - SXSW 2011 Film Bumper YouTube (embedded below)

Saturday, March 19, 2011

The best new desktops for business

For a business buyer of desktops there are five major concerns: performance, energy, space efficiency, budget and support. Different businesses will have specific requirements from their desktop PCs; this article will look at performance and low power PCs suitable for businesses.
The performance category matters most for CAD (computer-aided design), graphical work such as photo and film manipulation, publishing, and any other role requiring large amounts of memory and processing power on the user's primary machine. Some roles, such as programming, require powerful desktops to compile code quickly for testing, but this can be offloaded to a shared processing core while the work itself is done on less powerful machines. In the performance segment, the new Sandy Bridge processors are a major step forward for desktop PCs and are becoming available once more following the recall of the first batch, which had issues with port deterioration. On-chip graphics help: if a separate graphics card is not required then costs are lower and maintenance is easier, and with some language extensions it is possible to take advantage of the additional processing cores in Sandy Bridge's graphics to carry out many more calculations. Apple's iMac is a powerful machine with Apple's strong appeal for visual designers and artists; powered by a 3.06GHz dual-core Intel Core 2 Duo processor with an ATI Radeon HD graphics card, it's a fast all-in-one machine. Alternatively, for a Windows PC, the Dell XPS 8300 offers an extremely powerful machine at a good price. Available for £560 excluding VAT, it sports the most powerful of the new Sandy Bridge line-up in its unoverclocked form, the i7-2600: a quad-core 3.4GHz chip with some built-in overclocking ability for single-threaded tasks, as well as temporary overclocking of the whole chip to use the latent heat capacity of the heatsink. This is combined with a useful 4GB of RAM and up to 4TB of storage.
Low-power desktop PCs are the choice of many environmentally conscious businesses, both to save money through significant power savings (especially in power-efficient data centres, where computation per watt is becoming the name of the game) and to demonstrate commitment to the ideals in their mission statements. As demand has grown in this area, a number of good options have become available, and the situation is only set to improve, in a big way in fact, as mobile phone processors, designed from the very beginning to use minimal power, become powerful enough, and their operating systems mature enough, to begin migrating into laptops and desktop computers. Current market options include the tiny Fit PC2i. Barely larger than an external hard drive, it can drive a widescreen display via DVI and has built-in Wi-Fi, USB ports and a microSD card reader, with an Intel Atom Z530 1600MHz chip doing the processing alongside 2GB of RAM and a 160GB hard disk. Power use ranges from 6 watts at low CPU load up to 8 watts at full power. The price is quite steep, between £245 and £545 excluding VAT: visually it is striking and extremely space-efficient, and it will make an impression on anyone who sees it, but it is expensive for what you get. More powerful but still energy-efficient desktops come in the form of Lenovo's Blue Sky computers. Able to run on as little as 45 watts, on either an AMD Athlon 64-bit dual-core or an AMD Sempron, with hard drives from 80GB up to 750GB and 4GB of RAM, they are fully powered desktop computers roughly the size of a phonebook, with a low-speed fan for quiet operation, making them ideal for crowded environments.

Google's Content-Farm Algorithm Yields Bitter Harvest

Google recently deployed a new algorithm meant to lower the rankings of so-called content farms -- sites that churn out low-quality content built only to catch Google rankings and drive advertising. In many cases, the algorithm has done its job and Google searches are now cleaner. But some legitimate websites say they've been caught in the crossfire and unfairly demoted.

Late last month, Google (Nasdaq: GOOG) deployed a new algorithm intended to improve the quality of its search results, and as some critics feared, the results have in several cases hurt legitimate websites.
The algorithm was meant to clamp down on website owners gaming the system to raise their standings in search results.
Perhaps two of the most prominent accused system-gamers in recent months were J.C. Penney and Overstock.com. Both allegedly got multiple sites to link back to them to improve their site rankings, though both companies have denied this was a deliberate act backed by top management.
Prominent as they are, J.C. Penney and Overstock were basically one-off players whose nuisance was abated when they were called out by Google. A far worse problem is that of content farmers -- sites that basically churn out junk or copy content in a targeted manner to game Google's website ranking system.
These content farms have plagued Google for years, leading to several attempts by the Internet search giant to weed them out.
The new algorithm, nicknamed "Farmer Update," is the latest such attempt.

The Farming of Content

What constitutes a content farm is open to debate.
A general consensus is that the label applies to websites that lift content almost entirely from other sites. However, it is also applied to sites that churn out patterns of words designed to satisfy Google's algorithms, using search engine optimization (SEO) techniques.
One example of using SEO techniques is to create content based on topics currently hot on the Web. For example, a story about actress Elizabeth Taylor that may be trending now would spur content farms to create content about the actress, mentioning her name as frequently as possible.
Overdoing or misusing SEO techniques is called "black hat SEO" in the business. However, the difference between legitimate and black hat SEO can sometimes be a matter of opinion.
It's sometimes difficult to mechanically distinguish between unwanted garbage and legitimate sites using any of these techniques -- churning out content or using SEO. That possibly led to the "Farmer Update" algorithm sweeping up a few legitimate sites in its hunt for content farmers.
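As a rough, hypothetical illustration of why this is hard to do mechanically, consider the simplest possible keyword-stuffing signal: the share of the text taken up by its single most frequent word. The detector and the threshold below are invented for illustration only and bear no relation to Google's actual ranking code.

```python
from collections import Counter
import re

def keyword_density(text):
    """Map each word to its share of the total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {word: count / total for word, count in Counter(words).items()}

def looks_stuffed(text, threshold=0.10):
    """Flag text where any single word exceeds `threshold` of all words.

    The 10 percent cutoff is arbitrary; a real system weighs far richer
    signals, which is exactly why legitimate pages can get caught.
    """
    return any(share > threshold for share in keyword_density(text).values())

spam = "Elizabeth Taylor news. Elizabeth Taylor photos. Elizabeth Taylor bio."
print(looks_stuffed(spam))  # the repeated name dominates, prints True
```

A page that legitimately covers one topic in depth can trip the same naive signal, which mirrors the collateral-damage problem the article describes.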

Fluctuations in the Harvest

"Content farms are obviously the prime offenders, but there are concepts and approaches thin-content sites use that spill over to all kinds of sites," Adam Audette, CEO of AudetteMedia, pointed out.
Product pages with text supplied by manufacturers that are repeated across "dozens, hundreds even" of e-commerce sites, duplicate content that's created from mass syndication of news, and comparison shopping networks are examples of thin-content sites, Audette told TechNewsWorld.
Legitimate victims reportedly include Cult of Mac, a popular Apple-focused blog; onedayonejob.com; and Complete Review. These sites apparently saw a sharp drop in Google search rankings just after the algorithm went into effect, though Cult of Mac's standing was later reportedly restored.
One site, Mahalo, was hit so hard by the new algorithm that founder Jason Calacanis reportedly laid off 10 percent of its staff.
Calacanis did not respond to requests for comment by press time.

Reaping What You Sow

Still, it's almost a given that some legitimate sites will get clobbered by Google's algorithm.
"We've seen collateral damage happen with just about every large Google update," AudetteMedia's Audette said. "It's part of the Google landscape."
Overall, however, the new algorithm seems to have managed to weed out some of the bad players. Searchmetrics' analysis of 39 content farm-style domains showed they suffered an average fall of 57 percent in performance.
"Suite101.com, blippr.com and answerbag.com are definite examples of content farm-style domains," Horst Joepen, Searchmetrics' CEO, told TechNewsWorld. All were affected by the change in Google's algorithm.
However, sites such as Wikihow.com, Answers.yahoo.com, Instructables.com, Howstuffworks.com and eHow.com gained in terms of absolute visibility.
Absolute visibility refers to the absolute change in the index value, Joepen said. This is calculated to indicate an estimated number of visitors to a site daily. "Wikihow, for example, recorded an approximately additional 200,000 visitors per day from their organic search results," he added.
News portals such as MSN.com, Mashable.com, ZDnet.com and Wired.com also saw their search result rankings improve after Farmer Update was introduced.
"From where we sit on the SEO side, there isn't much collateral damage in this release," AudetteMedia's Audette said. "In several cases, our clients are faring better after the release of the algorithm."

A Knight in Rusty Armor

Perhaps the "Farmer Update" algorithm isn't designed to take out all content farms plaguing Google.
"Google's calculation may be to keep the two or three largest farms in the mix while cutting off the smaller ones," Mark Ballard, a senior analyst at the Rimm-Kaufman Group, told TechNewsWorld.
"This way, they preserve a good chunk of the content farm traffic and its revenue, while making their results appear less cluttered," Ballard added.
That may not sit well with owners of legitimate websites, who suffer from the impact of content farmers on their search rankings.

How to Survive Being Collateral Damage

Owners of legit sites that have been hammered by Farmer Update should assess their historical practices, strengths and weaknesses and see if there's anything in their profiles that could be deemed against Google's webmaster guidelines, AudetteMedia's Audette suggested. Then, they should file a re-inclusion request with Google.
"Low-quality pages on one part of a site can impact the overall ranking of that site," Google spokesperson Jake Hubert pointed out.
Publishers who feel they were wrongly impacted can post in Google's webmaster forums, Hubert added.
Legit website owners should also look to diversify.
"Google represents the lion's share of revenue potential," Audette stated. "However, I think marketers need to diversify and start looking beyond Google, to YouTube, Bing, and social media," he added.

Ubisoft Attacks Music Gaming With a Real Ax

With the demise of "Guitar Hero," it began to look like "Rock Band" had cornered the music video game market. But along comes "Rocksmith," an upcoming title that allows the user to play along using a real guitar plugged into the video game console. It's certainly a departure from the simplistic controllers used by other games, but will requiring players to jam with a real instrument limit its popularity?

The October release of Electronic Arts (Nasdaq: ERTS)' "Rock Band 3" turned a lot of heads with its so-called Pro guitar controller that looks and feels more like a real guitar than any previous title in the series. But now another publisher, Ubisoft, is taking things a step further, having just revealed an upcoming game called "Rocksmith" that will actually let players plug in a real guitar to play along.
The game will come with a special USB cable that connects to the quarter-inch jack found on any electric guitar. Other than that, few of the game's details have been disclosed, but that hasn't prevented speculation from flowing freely.
Precarious Timing
Presently, the future of the music video game genre is murky. It was just a matter of weeks ago that Activision shut down its legendary "Guitar Hero" franchise. The "Guitar Hero" brand apparently just couldn't compete with "Rock Band," which has become the de facto band simulation game.
The introduction of a new music game focused on playing guitar seems almost suicidal in that context, Jordan Cressman, an analyst at Market Communications, told TechNewsWorld. "This is like opening up a Chinese restaurant in place of a Japanese restaurant that just went out of business. It may seem like a completely different endeavor, but will it be all that more enticing in the eyes of consumers?

"The closure of 'Guitar Hero,' which was for years one of Activision's most valuable brands, means one thing: Consumers have chosen what they want out of a musical instrument simulation game, and they want 'Rock Band.' It's going to be very difficult for a new company to come in and convince them otherwise, especially if they see the real guitar aspect as nothing more than a gimmick," said Cressman.
The shuttering of Activision's "Guitar Hero" studio may have come at an unfortunate time for Ubisoft, but the fact is that development on "Rocksmith" has likely been underway for quite a while. Back in 2009, Ubisoft posted a job listing for a lead designer in what it referred to as "an exciting new cross-platform music-based game." That could very well have been "Rocksmith."

What Ubisoft Can Bring to the Table

Aside from a few games for the Nintendo DS, Ubisoft has virtually no experience creating guitar-based music titles. However, the company has recently gained acclaim for its dance games like "Just Dance" and "Michael Jackson: The Experience," so it has learned some of the tools of the trade when it comes to making rhythm-based music games.
The game promises to have music from such artists as The Animals, David Bowie, Nirvana, and The Rolling Stones, and its debut trailer says it's for all audiences, "whether you've performed on a stage or never played at all."
However, Chris Morris, a freelance video game journalist, is skeptical. "Ask anyone who's tried to play guitar after mastering the plastic 'Rock Band' controller -- playing a real guitar is difficult. It can take years of training to be able to competently perform a rock song," Morris told TechNewsWorld.
"Rocksmith will most likely not be for the casual gamer, but it will open up a new market," he added. "There are 'Rock Band' players who want to learn how to play the guitar, and there are real guitarists who want to have a better guitar-playing simulation in a game. Rocksmith will cater to these crowds."

Built for a Limited Audience?

The game is unlikely to have the same blockbuster success as "Rock Band," Morris said, since most casual gamers won't have the time, desire, and skill to learn the complex finger maneuvering required to play a real guitar.
"Most people who play 'Rock Band' are quite happy with the experience they get, and don't have any aspiration to break out an authentic Gibson five-stringer."

America's Perilous Patchwork of Privacy Laws

The United States, unlike many other developed nations, does not have a federal consumer data protection law. Instead, a patchwork of laws apply in different situations. We have laws that apply to medical records, other laws that apply to banking records, still others that apply to data collected by the government, and so on. And some of those laws are stronger than others.

As a concept, the notion of online privacy seems to rank right up there with the Tooth Fairy.
Facebook has declared that all posts by members on their walls are public property; Google (Nasdaq: GOOG) keeps getting into trouble with various governments over the data its Street View cars collect; and you can forget about your Tweets being private -- the Library of Congress is recording them.
"Consumers can't expect much privacy in online services like Google, Facebook and Twitter," Rainey Reitman, activism director at the Electronic Frontier Foundation, told TechNewsWorld.
There are few laws protecting consumers on the Web, Reitman pointed out. Meanwhile, law enforcement "continues to seek ways to expand their online surveillance powers."
For example, a federal magistrate judge in Virginia has ruled that the government can collect the private records of three Twitter users as part of its investigation into WikiLeaks, and that it can prevent those users and the public from seeing some of the documentation submitted to justify obtaining those records.
This could set a precedent that will let the government secretly amass information related to individuals' communications over the Internet, the EFF has cautioned.

Federal Thud and Blunder

Even when the federal government tries to introduce new services, new problem areas open up.
For example, the United States Department of Transportation has come up with the Next Generation 911 Project in an attempt to let consumers contact emergency services through multiple means, including text messaging, photos and email.
However, the EFF pointed out in comments filed with the Federal Communications Commission that the proposal might provide data such as medical histories to first responders. That's a well-intentioned move that could backfire because, as the EFF stated, 911 calls are subject to public disclosure in most parts of the country. This could conceivably make details about a person's illness or medical history publicly available to anyone who asks for them.

Gimme Shelter

"We can expect very little privacy when we visit Google or Facebook," Darren Hayes, a professor at Pace University's Seidenberg School of Computer Science and Information Systems in New York City, told TechNewsWorld.
For example, the Google search engine saves the searches users make. This could be used to identify the searchers.
Google's end user license agreement "clearly states that they are allowed to read through your Gmail messages," and your emails may end up on Google even if you don't have an account with the Internet giant because they may have been forwarded by the recipient, Hayes remarked.
Facebook has a similar privacy policy and apparently believes it's entitled to use members' images, Hayes stated.

The Yin and Yang of Being Online

It seems nobody's nailed down just how much control we should, or want to, have over our personal information on the Internet.
For example, users often put an astonishing amount of personal information on social networking sites to which they belong, or on their personal websites. At the same time, many of them scream in outrage when they're tracked online.
Consumer objections to online tracking have led the major browser vendors to work on plans to implement so-called do-not-track features into their products. This will let consumers indicate when they don't want to be observed online.
For example, Microsoft's (Nasdaq: MSFT) Internet Explorer 9 browser, which was released earlier this week, includes beefed-up controls that let users control what information about their behavior they're willing to share with others.
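The do-not-track mechanism these browser makers are converging on is deliberately simple: the browser attaches a `DNT: 1` HTTP header to each request, and honoring it is left entirely to the receiving site. A minimal server-side check might look like the following sketch; the function name is ours, and only the header semantics come from the proposal.

```python
def tracking_allowed(headers):
    """Return False when the client has opted out of tracking.

    Under the do-not-track proposal, "DNT: 1" expresses an opt-out,
    while an absent header means the user stated no preference.
    Header names are compared case-insensitively, as HTTP requires.
    """
    normalized = {name.lower(): value for name, value in headers.items()}
    return normalized.get("dnt") != "1"

print(tracking_allowed({"DNT": "1"}))  # opted out, prints False
print(tracking_allowed({}))            # no preference, prints True
```

Nothing in the protocol forces a site to perform such a check, which is why do-not-track ultimately depends on advertiser cooperation or regulation.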
Consumer resentment at online tracking is fueled to a large extent by targeted online ads -- a poll conducted jointly by Gallup and USA Today in December found that 67 percent of respondents don't think advertisers should be allowed to match ads to their specific interests based on websites they've visited.
However, consumers' outrage is perhaps overblown -- if you leave tracks on public sites, which many free online communities and services are, you can hardly complain when someone collates that information.
"Surfing the Internet should be viewed as no different than walking down Main Street," Daren Orzechowski, a partner at the law firm White & Case, suggested.
"The problem is that people do not think this way when using a computer in their home," Orzechowski remarked.

What Is Privacy, Anyway?

Privacy "is a right to be free from unauthorized observation or intrusion," Orzechowski told TechNewsWorld.
However, the scope of that right is defined by the law. While Europe has "very strict" laws around data privacy and protection, the United States framework is "much more relaxed" for now, Orzechowski said.
That then raises the question of what is the scope of the right to privacy.
"In the constitutional area, where the government is involved, the law is much clearer," Orzechowski said. "In the private sector, it is not, and that is the issue being debated."
New privacy laws or regulations may emerge in the next year or two, Orzechowski added.
In the meantime, consumers would do well to remember that the Constitution doesn't contain any guarantee of privacy.
"I think there is an unsupported view by the general public that there is a broad right to privacy, when in reality any such rights, to the extent they actually exist, are not as broad as people think," Orzechowski warned.

Where Is the Law?

Part of the problem with protecting our privacy is that the United States, unlike other developed nations, does not have a federal consumer data protection law, the EFF's Reitman pointed out.
Instead, we have a patchwork of laws that apply in different situations. We have laws that apply to medical records, other laws that apply to banking records, still others that apply to data collected by the government, and so on. Some of those laws are stronger than others, Reitman said.
Another part of the problem is that these laws are based on an outdated model that associates consumers' privacy rights with demonstrable financial harm.

Critics Poke Holes in Android vs. iPhone Browser Test

Blaze.io has concluded that Android's embedded browser is about 50 percent faster than the one found in Apple's iOS mobile operating system, based on a series of tests the company performed. However, critics have questioned the test's methodology as well as Blaze's motives in conducting and publicizing the study.

The browser in Google's (Nasdaq: GOOG) Android mobile operating system is more than 50 percent faster than the browser found in Apple's (Nasdaq: AAPL) iOS, according to Web performance company Blaze.io.
Blaze tested the embedded browsers in Android 2.3 (aka "Gingerbread") and iOS 4.3. These were WebView and UIWebView, respectively.
The tests were conducted against websites of Fortune 1,000 corporations.
Also among Blaze's findings: although both Google and Apple claim to have sped up JavaScript in Android 2.3 and the browser in iOS 4.3, respectively, those improvements had no impact on browsing speed.
However, shortly after the results were publicized, information surfaced suggesting the tests could be flawed.

Details of the Tests

Blaze used an iPhone 4 running iOS 4.3 and a Google Nexus S smartphone running Android 2.3 for its tests.
"We chose the Nexus S since it's the poster boy for Android 2.3," Guy Podjarny, Blaze's chief technical officer, told LinuxInsider.
The methodology Blaze outlined appears impressive. It loaded Web pages from the sites of the Fortune 1,000 companies three times on each device, on different days, and used the median load times as a basis for comparison.
Results with load times greater than 40 seconds or less than 400 milliseconds were discarded, because these indicate network and server errors, Blaze said.
The tests were conducted over Wi-Fi in an area with good reception. They were run at night to eliminate noise and maintain consistent results.
Measurements were conducted using a custom app developed by Blaze that uses the platform's embedded browser. This loads a page on demand and measures how long that takes. Load times were calculated using the "Document Complete" callback from the browser, which Blaze contends is a standard way of measuring a Web page's load time.
To distinguish mobile sites from non-mobile ones, Blaze loaded the same 1,000 pages through Internet Explorer 8 and compared the number of resources required to load the page on the iPhone versus the number required for IE 8.
If the desktop browser required 30 or more additional resources, the site was designated as mobile, meaning it had a customized version for mobile device access.
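The filtering and classification steps described above can be sketched in a few lines of Python. This is a reconstruction from the article's description, not Blaze's actual tooling; times are in milliseconds, and the function names are ours.

```python
from statistics import median

# Blaze's stated outlier bounds: anything under 400 ms or over 40 s
# indicates a network or server error, so those samples are discarded.
MIN_MS, MAX_MS = 400, 40_000

def site_load_time(samples_ms):
    """Median load time across repeated loads of one page, ignoring
    samples outside the valid range; None if no valid sample remains."""
    valid = [t for t in samples_ms if MIN_MS <= t <= MAX_MS]
    return median(valid) if valid else None

def is_mobile_site(desktop_resources, phone_resources):
    """Blaze's heuristic: a site counts as mobile if the desktop browser
    fetched at least 30 more resources than the phone did, implying the
    phone was served a lighter, customized page."""
    return desktop_resources - phone_resources >= 30

print(site_load_time([2100, 2500, 90_000]))  # error sample dropped, prints 2300.0
print(is_mobile_site(75, 40))                # 35 extra resources, prints True
```

Per-site medians like these would then be aggregated across the Fortune 1,000 set to produce headline figures such as the 52 percent average reported below.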

The Test Results

The tests showed that the Android browser was 52 percent faster on average than the iOS browser. Also, Android was faster on 84 percent of the sites tested.
Blaze also found that, although Apple and Google had optimized the JavaScript engines in iOS 4.3 and Android 2.3 and boasted of the resulting performance improvements, browsing speed wasn't really different when it came to loading Web pages.
Blaze used real websites in its tests because custom-created benchmarks, such as the SunSpider JavaScript benchmark that's normally used, don't reflect the actual user experience.
That's to be expected, because the SunSpider benchmark tests the core JavaScript language only, according to its Web page.
However, when tested against mobile-specific sites, Android 2.3 was only 3 percent faster than Safari on iOS, Blaze said. Mobile sites tend to be smaller and lighter than regular Web pages.

Putting Out the Fire

However, the tests are perhaps not quite what they're made out to be.
"I'm skeptical because I think Blaze have an ulterior motive in conducting tests, seeing that they optimize Web performance," Mark Beccue, a senior analyst at ABI Research, told LinuxInsider.
The results shouldn't be taken as set in stone, because the speed of native browsers on mobile devices is affected by the processing chips used, Webkit, and the network, Beccue pointed out. "The Android browser's speed is particularly hardware-dependent because the operating system is licensed to different hardware vendors," he added.
Finally, the Nexus S is not the best test platform because "it's not a widely used phone," Beccue cautioned.
Meanwhile, The Loop points out that using an embedded browser is not the same as using the official browser.
UIWebView, which Blaze tested, is based on Safari but didn't receive any of the updates Safari did in iOS 4.3, The Loop claimed.
Blaze did not respond to follow-up calls to discuss these issues.

But Really, Who Cares?

The difference in speed may not be all that important, suggested Rob Enderle, principal analyst at the Enderle Group.
"On this last cycle, other things have started to take precedence over speed -- safety and the ability to take advantage of the new multi-core processors," Enderle told LinuxInsider. "Once you realize that speed problems generally come from the network or plug-ins you may decide to focus more on other aspects of the browser, such as safety, in order to differentiate your product."
Further, the browser is not a focus of Apple's, Enderle pointed out.
"Apple feel the browser is something they have to have, but not something that differentiates the product positively, nor is it something that helps lock the customer into the Apple ecosystem," Enderle stated.
For Google, on the other hand, the browser is "critical to what they provide, and their future is Web-based, so they invest a great deal in making their browser competitive," Enderle remarked.