Sunday, July 31, 2011

Study: Dumb People Use Internet Explorer


Here comes the flame war. According to a new report, dumb people are more likely to use Internet Explorer than smart people. It's a finding so apparently defamatory that the company responsible for the statement is allegedly being threatened with a lawsuit by inflamed Internet Explorer aficionados.

Online psychometric testing company AptiQuant, based in Canada, turned its analytical skills to a group of more than 100,000 individuals in an effort to determine the IQ scores associated with users of various Web browsers. Over a period of around four weeks, the company administered a Wechsler Adult Intelligence Scale (WAIS) test to users looking for free online IQ assessments, then recorded the results and browsers used for all participants above the age of 16.

Across the board, the average IQ scores for users of Internet Explorer versions 6 through 9 were all lower than the IQ scores recorded for Firefox, Chrome, Safari, Camino, and Opera users. Humorously enough, those using Internet Explorer with the Chrome Frame plug-in installed actually ranked third in IQ scores among this browser list. Opera users reported the highest average IQ score – hovering around the 120 to 130 range, which sits well above the WAIS test's population mean of 100 (and standard deviation of 15).
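
To put those averages in perspective, a WAIS-normed score converts to a population percentile in a couple of lines of Python. This is our own illustrative arithmetic, using only the standard library, not anything from AptiQuant's report:

```python
from statistics import NormalDist

# WAIS scores are normed to a mean of 100 and a standard deviation of 15.
wais = NormalDist(mu=100, sigma=15)

# Where would the reported Opera-user averages fall in the general
# population, assuming normally distributed scores?
for score in (120, 125, 130):
    print(f"IQ {score}: percentile {wais.cdf(score) * 100:.0f}")
# IQ 120: percentile 91   -> roughly the top 9% of the population
# IQ 125: percentile 95
# IQ 130: percentile 98   -> roughly the top 2%
```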

AptiQuant's report notes that the only statistically significant difference in IQ scores occurred between Internet Explorer users and their counterparts. There was no significant difference in IQ scores among users of the non-IE browsers, even though these users, in aggregate, reported a higher average IQ score than IE users.
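
The report doesn't spell out its test procedure, but the standard tool for a claim like this is a two-sample test on the raw scores. Here is a minimal sketch with made-up numbers (assuming NumPy and SciPy are available); it shows why, at this sample size, even a modest gap in means comes out "statistically significant":

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)

# Made-up stand-ins for the report's raw data: IE users centred a few
# points below everyone else, both groups with an SD of about 15.
ie_scores    = rng.normal(loc=85,  scale=15, size=5_000)
other_scores = rng.normal(loc=105, scale=15, size=5_000)

# Welch's t-test: does not assume equal variances between the groups.
t_stat, p_value = stats.ttest_ind(ie_scores, other_scores, equal_var=False)
print(f"t = {t_stat:.1f}, p = {p_value:.3g}")
# With thousands of test-takers per group, a gap of a few points in the
# means produces p << 0.05 -- hence "statistically significant".
```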

"In addition, the results were compared to another unreleased study of a similar nature undertaken in year 2006. The comparison clearly suggests that more people on the higher side of IQ scale have moved away from Internet Explorer in the last 5 years," reads the report.

Although AptiQuant does get a little heavy-handed against Internet Explorer in its report, suggesting that the "nuisance" browser should be "eradicated," the company has been quick to note that its findings only indicate a one-way relationship between IQ scores and browser use. That caution may be prompted, in part, by the alleged threats of a lawsuit against the company by upset Internet Explorer fans.

"I just want to make it clear that the report released by my company did not suggest that if you use IE that means you have a low IQ, but what it really says is that if you have a low IQ then there are high chances that you use Internet Explorer," said CEO Leonard Howard.




Friday, July 29, 2011

Save your friends from outdated email



You’ve probably already improved the lives of your friends and family members by helping them switch to Gmail, but what about that one friend who still hasn’t made the switch? It’s time to take a stand and stage an intervention. 


How to stage an intervention: follow these three simple steps

1. Select from your contacts

2. Create a customized email

3. Send to your friend



ShareMeNot: Firefox plugin takes the tracking out of social media buttons


Did you know that embedded social media buttons allow Facebook, Google, LinkedIn, and others to track your online browsing activities on every site that includes one of these buttons, even if you never click the buttons and (in some browsers) even if you have third-party cookies disabled?




Students in a University of Washington Computer Science project have created “ShareMeNot,” a Firefox add-on that defangs social media buttons like the Facebook “Like” button (and others) so that they don’t transmit any information about your browsing habits to these services until (and unless) you click on them. That means that merely visiting a page with a Like or a Tweet or a +1 button doesn’t generate a data trail for the companies that operate those services, but you still get the benefit of the buttons: if you click them, they still work. Smart.

ShareMeNot is a Firefox add-on designed to prevent third-party buttons (such as the Facebook “Like” button or the Twitter “tweet” button) embedded by sites across the Internet from tracking you until you actually click on them. Unlike traditional solutions, ShareMeNot does this without completely removing the buttons from the web experience.

http://sharemenot.cs.washington.edu/
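
ShareMeNot itself is browser-extension JavaScript; purely to illustrate the blocking idea described above, here is a toy request filter in Python. The tracker list, function names, and API are our own invention, not the add-on's actual code. The essential trick is state: requests to social-button hosts are blocked by default and allowed only once the user has actually clicked the corresponding button.

```python
from urllib.parse import urlparse

# Hypothetical hosts serving tracking-capable social buttons.
TRACKER_HOSTS = {"facebook.com", "platform.twitter.com",
                 "plus.google.com", "platform.linkedin.com"}

clicked_hosts = set()  # hosts the user has explicitly clicked this session

def allow_request(url, is_user_click=False):
    """Allow a button host's request only if the user has clicked its button."""
    host = urlparse(url).hostname or ""
    tracked = any(host == h or host.endswith("." + h) for h in TRACKER_HOSTS)
    if not tracked:
        return True                    # ordinary resource: load normally
    if is_user_click:
        clicked_hosts.add(host)        # the click is the user's opt-in
    return host in clicked_hosts       # passive loads stay blocked

# Merely rendering a page: the Like button's request is blocked...
print(allow_request("https://facebook.com/plugins/like.php"))   # False
# ...but after an actual click, the same request goes through and works.
print(allow_request("https://facebook.com/plugins/like.php",
                    is_user_click=True))                        # True
```

A real add-on also has to replace or reactivate the blocked button in the page, which a sketch like this glosses over.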

Thursday, July 28, 2011

A Touch Mouse’s Tale

How do you take a concept from research to product? In the case of the Microsoft Touch Mouse, it took a collection of prototypes, collaboration between transatlantic teams, and a lot of user testing. It also helps when the research that launched the project won the best-paper award during the Association for Computing Machinery’s 22nd Symposium on User Interface Software and Technology.

Mouse 2.0: Multi-Touch Meets the Mouse, a joint effort between Microsoft Research Redmond, Microsoft Research Cambridge, and Microsoft’s Applied Sciences Group, introduced five research prototypes, each exploring a different touch-sensing strategy that influenced the design of different mouse form factors and their interaction possibilities. The research featured extensive user feedback, as well as practical comparisons of different techniques for enabling multitouch on the desktop. The prototypes included three camera-imaging approaches, multiple optical sensors, and the use of capacitive sensors on a curved surface.

Members of the Mouse 2.0 research team expressed the hope that they would be able to refine their prototypes, both ergonomically and in terms of their sensing capabilities, and make a deeper exploration of the interaction techniques specific to this new class of input devices.

The researchers soon got an opportunity to refine their prototypes. Microsoft Hardware decided to get behind the research, and a team was formed to bring a multitouch mouse to market.

New Possibilities for the Humble Mouse

Hrvoje Benko, researcher with the Adaptive Systems and Interaction group at Microsoft Research Redmond, has worked on both the Mouse 2.0 research and the Microsoft Touch Mouse product-development project. He recalls one of the key product decisions: selecting from five prototypes the one that would be the launching point for the new device.

“In the end,” Benko says, “we selected the prototype using capacitive touch sensing to track the position of multiple fingers on its surface. This approach offered the most consistency and flexibility in terms of how we could mount and integrate the sensor, which is important in a small form factor. Plus, unlike camera-based tracking, there are no issues with ambient light, so you reduce the calibration issues. It’s a much more controllable sensor.”

Microsoft Touch Mouse
Microsoft Touch Mouse: the final form factor.

Although the choice of prototype simplified some of the technical issues, there were still plenty of challenges when it came to refining the mouse to the point where it was ready for consumer use. The design of the final form factor required sculpting and testing of hundreds of models. The team also examined user interactions and evaluated the kinds of gestures that made sense, developing an entire gestural set focused on enhancing window manipulation and management. At the same time, core technologies, such as firmware and hardware for capacitive sensing, had to be built and optimized for this specific form factor and device functionality.

“The gesture-recognition software is the brains behind all these interactions,” says John Miller, software architect with the Cambridge Innovation Development team at Microsoft Research Cambridge. “Our gestures are multitouch and designed to amplify your experience with Windows 7. So they are optimized for window management: docking, moving, minimizing and maximizing, going backward and forward on your webpage, switching between tasks, and so on.”

Getting the Right Touch

Benko and Miller agree that one of the toughest problems they tackled was the requirement that users should be able to operate the device using classic point-and-click interactions, as well as the newly developed set of multitouch gestures. The mouse form itself added complications: The shape encourages a user to rest both palm and curled fingers on the entire touch-sensitive surface, creating constant contact.

“That made everything much, much harder,” Benko smiles ruefully. “Instead of making palm rejection and other issues easier, it added a few more challenges. But at the end of the day, our goal was to have a comfortable, great-looking mouse that people enjoy using, with a nice look and feel that support the gestures, so it was definitely worth the effort.”

Unlike touch-screen devices, on which one or two clear touches make user commands easy to interpret, a small sensor surface and the nature of mouse usage create an entirely different set of problems.

“If you have a touch-sensitive phone,” Miller explains, “you interact by touching the screen, and as soon as you’re done, your finger lifts off the screen. We have completely different issues with the mouse. We have a device that not only has to support gesture touches, but also has to deal with times when the user is just holding it.

Gestures for controlling Touch Mouse
A core technical challenge: developing a gesture set that enables clear differentiation between various types of user contact with the touch surface.

“Next, fingers can be very close together when making contact. To the sensor, they can appear as one finger rather than multiple fingers. But if you want to have reliable gesture recognition, you need a way to differentiate between one, two, or three fingers. We had to develop technology that enhances signal processing and reliably tracks contacts.

“And here is one more example,” Miller continues. “Everybody holds the mouse in a slightly different way. Some people hold their fingers flat on the mouse, and people with very small hands will hold the device differently than people with very large hands. So the mouse does not make contact the same way for all users, and they are all going to be performing these gestures in a slightly different way. As a result, there’s a lot more ‘noise’ to handle than from a touch-screen phone or a Tablet PC. We had to deal with a lot more data.”
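
To make the merged-fingers problem concrete: counting contacts on a capacitive grid is, at its core, a connected-components problem. The toy sketch below is our own illustration, not the Touch Mouse's actual firmware; it thresholds a grid of capacitance readings and flood-fills to count distinct blobs, and shows how two adjacent fingers collapse into a single contact, exactly the ambiguity the team's enhanced signal processing had to resolve.

```python
# Toy finger-counting on a capacitance grid (illustrative only; the real
# pipeline must also split blobs a naive flood fill would merge).
THRESHOLD = 5  # hypothetical activation level for "finger present"

def count_contacts(grid):
    """Count connected regions of above-threshold cells (4-connectivity)."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    contacts = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= THRESHOLD and not seen[r][c]:
                contacts += 1
                stack = [(r, c)]          # flood-fill this blob
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and grid[y][x] >= THRESHOLD and not seen[y][x]):
                        seen[y][x] = True
                        stack.extend([(y+1, x), (y-1, x), (y, x+1), (y, x-1)])
    return contacts

# Two touching fingers read as one blob -- the merging Miller describes:
merged   = [[0, 9, 9, 9, 0],
            [0, 9, 9, 9, 0]]
separate = [[9, 9, 0, 9, 9],
            [9, 9, 0, 9, 9]]
print(count_contacts(merged), count_contacts(separate))  # 1 2
```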

To mitigate some of these problems, the team set a design goal that gestures should be both intuitive and distinct—the kind that would be hard for a user to perform by accident. This helped simplify the job of the recognition software.

They also developed a tool that recorded sensor data while human testers were using the mouse for actions such as pointing and clicking, multitouch gesturing, and grabbing and releasing the mouse.

“We ended up with data examples of good gestures for mouse usage and unintentional movements,” Benko says, “and this helped us conceive strategies for distinguishing between intentional gestures and incidental movements. It’s what allowed us to develop an engine that’s able to recognize some movements and ignore others.”
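
The article doesn't describe the recognition engine's internals, but the recorded-data approach suggests its general shape: reduce each contact to a few motion features over a short time window, then separate coherent, deliberate movement from resting-hand drift. A heavily simplified sketch follows; the features and thresholds are our own invention, standing in for parameters the real engine would tune against the recorded corpus of gestures and incidental movements.

```python
import math

# Hypothetical thresholds -- the real engine was tuned on recorded user data.
MIN_TRAVEL = 12.0   # minimum fingertip travel (sensor units) to count as a swipe
MAX_SPREAD = 30.0   # fingers in one gesture should move in similar directions

def classify(tracks):
    """tracks: list of finger paths, each a list of (x, y) samples in a window.

    Returns "gesture" for coherent, deliberate movement and "incidental"
    for the small, uncoordinated drift of a hand resting on the mouse.
    """
    directions = []
    for path in tracks:
        (x0, y0), (x1, y1) = path[0], path[-1]
        dx, dy = x1 - x0, y1 - y0
        if math.hypot(dx, dy) >= MIN_TRAVEL:
            directions.append(math.degrees(math.atan2(dy, dx)))
    if not directions:
        return "incidental"              # nothing moved far enough
    spread = max(directions) - min(directions)
    # (ignores angle wrap-around at +/-180 degrees; fine for a sketch)
    return "gesture" if spread <= MAX_SPREAD else "incidental"

# Two fingers swiping up together vs. a resting hand drifting slightly:
swipe = [[(0, 0), (0, 8), (1, 16)], [(5, 0), (5, 9), (6, 17)]]
rest  = [[(0, 0), (1, 1)], [(5, 0), (4, -1)]]
print(classify(swipe), classify(rest))   # gesture incidental
```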

Collaboration Delivers a Quality Product

The Microsoft Touch Mouse project is unusual compared with other hardware-development projects, because it is not simply about hardware. Rather, it is a product that combines multiple disciplines in a tightly integrated way, a task that would have been impossible without close collaboration between multiple Microsoft Research and Microsoft hardware-development teams in different locations.

Decisions about the final product, for example, involved testing and evaluation of different prototypes and features by all parties.

Hrvoje Benko and John Miller
Touch Mouse collaborators Hrvoje Benko (left) and John Miller.

“There were a lot of concepts from the original research,” Benko says, “and some of those we decided to leave out. That doesn’t mean they were bad ideas, just that we were being very careful about our choices. It’s how making a product works: You assess the pros and cons of every choice. Both the research and hardware teams were focused on nailing down the core experience, to make sure that everything we included was critical and didn’t distract from the user’s task. Our goal was to deliver a delightful, fluid desktop experience.”

Even though the multitouch-mouse project officially belonged to the hardware team, Microsoft Research remained integral to the development.

“The original Mouse 2.0 paper was just the starting point,” Miller says. “The research efforts didn’t stop there. They continued in tandem with product development. There was a lot of additional research from different parties before we could turn the multitouch-mouse concept into a device that consumers can buy off the shelf.”

For Benko and Miller, one of the most rewarding aspects of this project has been the close collaboration between the hardware team and Microsoft Research in both Cambridge and Redmond. It went beyond technology transfer and was absolutely critical to delivering a successful transition from research prototype to consumer product.

The Microsoft Touch Mouse proves that quality research doesn’t have to address technologies that are many years away from commercialization. Sometimes, it’s about exploring new possibilities. There’s always room for a better mousetrap—make that, a better mouse.



Microsoft's MS-DOS is 30 today

MS-DOS is 30 years old today. Well, kind of. On 27 July 1981, Microsoft gave the name MS-DOS to the disk operating system it acquired on that day from Seattle Computer Products (SCP), a hardware company owned and run by a fellow called Rod Brock.

SCP developed what it at various times called QDOS and 86-DOS to run on a CPU card it had built based on Intel's 8086 processor.

MS-DOS 1.19

Command line: MS-DOS 1.19 still running after all these years

The company had planned to use Digital Research's CP/M-86 operating system, then still in development. But, having released the card in November 1979 - it shipped with an 8086-compatible version of Microsoft's Basic language interpreter-cum-operating system - and reached April 1980 without CP/M-86 becoming available to bundle, SCP decided it had to create its own OS for the card.

Enter, in August 1980, QDOS. It really did stand for Quick and Dirty Operating System. That's actually what it was: a basic but serviceable OS good for coding and running programs written in 8086 assembly language - the x86 instruction set. It was written by SCP's Tim Paterson, who had joined the company as a programmer a couple of years previously and began work on it in April 1980.


Some observers later claimed that QDOS too closely resembled CP/M for comfort. Paterson himself would later say that QDOS' design criteria specifically included the ability to support programs written for CP/M and compiled for the 8086. That's not at all surprising given that SCP undoubtedly saw QDOS as a temporary stand-in until Digital Research (DR) shipped CP/M-86.

The picture we have today is muddied by claims that IBM originally wanted to use CP/M-86 in its first personal computer. IBM and DR famously failed to come to terms that would allow CP/M-86 to be bundled with the PC, and IBM turned to Microsoft for an alternative. Digital Research founder Gary Kildall, who died in 1994, would later allege that Microsoft's product was a rip-off, fuelling plagiarism claims that Paterson has always denied - he says he duplicated CP/M's published interface, not its code.

CP/M-86

The competition: CP/M-86 in action
Source: Wikipedia

Update: My fellow Reg hack Andrew Orlowski points out that, no matter what Paterson says, a US court ruled against the programmer in a defamation lawsuit he brought against publisher Little, Brown for claiming that the origins of QDOS were not clear-cut.

Back in 1980, Paterson continued to evolve QDOS through the year, the OS being renamed 86-DOS - it was now evidently no longer viewed as a rough-and-ready stand-in - between September and December 1980. Accounts differ as to when the name - and the OS' status - was switched, but December is the date Paterson himself gave during a Softtalk interview published just a few years later.

'Hi, it's Microsoft. Say, can we license your OS?'

It's at this point that Microsoft re-enters the picture, acquiring from SCP a licence to market and sell 86-DOS, paying $25,000 for the privilege. Microsoft was now working with IBM in place of DR - the two had been partners since November 1980 - to supply the operating system for the hardware giant's first personal computer, but it kept IBM's identity hidden from SCP and Paterson until it acquired the OS in its entirety the following year.

"We all had our suspicions that it was IBM that Microsoft was dealing with," Paterson would later say, "but we didn't know for sure."

MS-DOS Advert

Microsoft would later advertise MS-DOS' claimed superiority to CP/M-86

Microsoft had been in contact with SCP ever since the latter asked to use its Basic, so it would have been aware of SCP's work on QDOS, the operating system's design goals and its convenient compatibility with CP/M-86. Microsoft would have seen how closely QDOS matched the product it had been commissioned to supply to IBM, and its ties with SCP would have helped it gain that initial re-distribution licence.

You can read a copy of the 86-DOS Programmer's Manual (PDF) here.

By July 1981, Microsoft had sufficient understanding of IBM's plans - and the vision to conceive of what the personal computer market might become - to consider not merely licensing 86-DOS but buying it outright from SCP, for a further $50,000 - $75,000 in total, $180,000 (£112,000) in today's money. SCP was allowed to continue to offer the OS with its own hardware. Paterson had already quit SCP, in April 1981, to join Microsoft the following month.

Seattle Computer Products DOS diskettes

Seattle Computer Products' DOS
Source: Ty's Computer Interest Website

"So on 27 July, 1981, the operating system became Microsoft's property," Paterson said in the 1983 Softtalk interview. "According to the deal, Seattle Computer can still see the source code, but is otherwise just another licensee. I think both companies were real happy. The deal was closed just a few weeks before the PC was announced. Microsoft was quite confident."

In August 1981, Big Blue introduced what would eventually become known as the IBM PC, though it was originally the 5150. It was based on the Intel 8088 CPU, a lesser - but cheaper - version of the 8086 that used an 8-bit external bus rather than the 8086's 16-bit bus.

Paterson came with his operating system, and stayed with Microsoft for a year while 86-DOS was honed into MS-DOS 1.0, released as a standalone product early in 1982. He left in March 1982, after the completion of MS-DOS 1.25, but would later return (twice) to Microsoft, where he would go on to work on Visual Basic. He eventually formed his own hardware company, Paterson Technology, though his blog now lists his status as retired.

MS-DOS 3.2 box

Microsoft boxes up MS-DOS 3.2
Source: Hugepedia

Now 55, Paterson continues to blog about QDOS' development, emphasising the reasons for its CP/M friendliness yet stressing its under-the-hood differences.

MS-DOS triumphant

From July 1981, SCP continued to sell the operating system it had created, now calling it Seattle DOS and bundling it with its hardware products. It continued to do so until 1985, by which time it was clear buyers wanted systems, and cheap ones - whether from IBM or the many 'cloners' who'd released products compatible with its technology.

MS-DOS Advert

Microsoft advertises DOS in 1983
Source: Fraggle UK at Flickr

Brock now sought to sell his rights to MS-DOS, a scheme with which Microsoft was not best pleased and said its agreement with SCP did not permit. Brock sued, and the case went to trial in the last few months of 1986. Brock and Microsoft quickly came to an out-of-court arrangement, however: Brock sold his licence to Microsoft for $925,000, leaving the software giant in complete ownership of the OS.

Through this time, Microsoft was releasing version after version of MS-DOS, each mirrored by a release of IBM's IBM-DOS and, later, PC-DOS, as its take on the OS came to be called.

Other versions appeared, tweaked by PC manufacturers using Microsoft's OEM kit to more closely fit the specifics of their hardware. Many would run software developed for the IBM PC, others would not, though they would run generic MS-DOS-compatible applications.

CP/M-86 was eventually released, in 1981, and subsequently offered by DR as a third-party alternative to MS-DOS. As you can see from the ad above, Microsoft saw it as a threat. DR's OS was bundled with a number of IBM PC rivals, from the likes of Apricot and Siemens.

You can view the source code for CP/M-86 - and other versions of the OS - here.

In May 1988, CP/M-86 was effectively re-released as DR-DOS and pitched more directly as an alternative to MS-DOS itself than to IBM's PC-DOS.

DR-DOS found many supporters but failed to dent Microsoft's market share. Microsoft quickly established the technique of announcing new MS-DOS features well ahead of their appearance - an approach previously seen as one that could only kill sales of the current version. Instead, it kept buyers away from rival offerings, and it's now a common tactic among highly competitive tech companies.

MS-DOS 6

MS-DOS gets upgraded, kind of

Meantime, MS-DOS continued to evolve, gaining a graphical user interface of sorts with version 4.0, disk compression tech with version 6.0, and FAT32 support with version 7.1.

Version 4.0 should have been the final release - even Microsoft said so, announcing in 1987 that "DOS is dead" and that we should all be using OS/2, jointly developed by IBM and Microsoft, though the latter stepped away from it when Windows 3.0 became a huge success. That's another story.

Microsoft's work on DOS eventually took the OS to version 8.0, the release used for Windows XP boot discs. With that release, on 14 September 2000, MS-DOS development formally came to an end, though significant work stopped some years earlier with MS-DOS 5.0 when it ceased to be offered as a standalone product. ®



Is Apple declaring war on DVDs?

Apple is pulling the DVD drive from its Mac mini. Are we moving towards a future of only streaming video?

In its new product announcement last week, Apple rolled out a lot of new features – including significantly faster processors and greater expandability for its MacBook Air and Mac mini lineups. But Cupertino also quietly took something out of its lineup (besides the vanilla MacBook, that is): the Mac mini is now missing its DVD drive.


They say you need at least two data points to draw a trend, and now we have them: the MacBook Air has never had an optical drive, and now the Mac mini’s has disappeared as well. That likely indicates the company is eyeing a future in which media doesn’t come on a DVD – or a CD-ROM or Blu-ray disc, for that matter.

For a lot of companies – and a lot of users – the move to a discless world makes a lot of sense. It’s easier for both parties to deal in digital downloads – as opposed to the comparatively byzantine process of burning software to physical media, packaging it, and shipping it around. And the exclusion of an optical drive allows computers to be that much smaller, lighter, and less expensive.

This isn’t the first time Apple’s been in this position, either. Back in 1998, the company introduced the original iMac without a floppy drive, pulling the plug on a technology that was still considered standard. (In hindsight, that was probably a good call, though Apple’s move caused quite an outcry at the time.)

For a lot of people, though, it really is too soon to ditch the discs. Let’s assume that Apple will continue to remove optical drives throughout its laptop and desktop lines, as it did with the floppy drive: this is probably an unwelcome scenario to anyone hoping to watch a DVD on an airplane.

There’s also the home theater crowd to consider. The previous-generation Mac mini, with its flexible display options and DVD drive, gained acclaim as a near-perfect media player (just hook it up to an HDTV and you’re good to go!). But now that the drive is gone, the retribution from home-theater enthusiasts has been swift. Over at tech site Engadget, Nilay Patel cited the lack of DVD support as his biggest gripe with the machine, saying, "The Mac mini looks like it'd be the ideal home theater PC … [but] having access to Hulu, Boxee, iTunes and Netflix is just half of the story -- there aren't too many HTPC owners that never pay their local Redbox a visit."

There’s no reason to think that Apple will bring back optical drives in the future, which means it’s also unlikely that it’ll ever introduce Blu-ray drives in Macs. Steve Jobs famously referred to Blu-ray as “a bag of hurt” back in 2008, and it’s worth pointing out that when Lion, the next iteration of the Mac OS X operating system, arrives in a physical format in August (it’s download-only for now), it’ll be on a USB stick, not a disc.

What’s your take on Apple’s move? Have you moved on from optical media already, or do you have a collection of discs that must now sit unplayed? Let us know in the comments section.



Mac mini review

For those familiar with last year's Mac mini, what you're peering at above isn't likely to strike you as jarring. Heck, it may even seem somewhat vanilla at this point. In truth, Apple did exceedingly little in terms of design changes with the mid 2011 Mac mini, but given the relatively recent cosmetic overhaul, it's not like we were genuinely expecting anything above a top-to-bottom spec bump. And that, friends, is exactly what we've received. The mini remains quite the curious beast in Cupertino's line -- it's the almost-HTPC that living room junkies are longing for, yet it's still a country mile from being the headless mid-tower that Apple steadfastly refuses to build. It's hardly a PC for the simpleton (given that it's on you to hunt down a mouse, keyboard and monitor), and it's actually taking a giant leap backwards on one particularly important front. Care to hear more? You'll find our full review just past the break.

Hardware and design
Make no mistake about it -- the mini is just gorgeous to look at. As with the prior model, this 2.7 pound slab of aluminum looks nicer than its price tag indicates, and it honestly feels more like a decoration than a computer. It's sized at 7.7 x 7.7 x 1.4 inches, exactly the same as its predecessor, and outside of the chromed Apple logo on the top, a matte black strip of ports on the rear and a similarly hued lid on the bottom, it's a clean sweep of brushed silver. It'll sit nicely on its edge for those contemplating a vertical installation, but the protruding lid on the bottom makes it a little less elegant for those applications.

Speaking of the rear, the dozen connectors found there aren't cosmetically different than those on the last build. From left to right, you'll find an AC input, gigabit Ethernet jack, FireWire 800 port, HDMI (full-size), Thunderbolt, four USB 2.0 sockets, an SDXC slot, an audio input and a 3.5mm headphone port. Funny enough, last year's DisplayPort socket looks identical to this year's Thunderbolt connector, and not surprisingly, DisplayPort monitors and peripherals will happily fit themselves in with no adapters needed. For what it's worth, Apple does include an HDMI-to-DVI adapter, but oddly, no Thunderbolt dongle. Sure, we know those cables are laced in gold, but what better way to encourage adoption of a new I/O port than to toss in an appendage for newcomers? Even a DisplayPort / Thunderbolt-to-HDMI or DVI cable would've been greatly appreciated -- making it simple to hook up dual displays right from the get-go would have seriously tickled our fancy.


Tinkerers are bound to love that bottom lid... and then grow frustrated by what's underneath; a simple twist reveals a WiFi module, cooling fan, two SODIMM slots and plenty of other, not-easily-accessible components. Our test unit came with a pair of 1GB memory modules, but even the greenest DIYer could swap those out with more sizable ones -- a couple of snaps and a tug is all it took. Unfortunately, we're still miffed at Apple's decision to keep the HDD away from a user's fingertips. If we had our druthers, the RAM wouldn't be the only thing that's just a few clips away, but alas, we're stuck with what we've got.

We shouldn't have to chide Intel and Apple (and whoever else wants to claim responsibility) for not having USB 3.0 on Macs in the year 2011, but regretfully, we are. A foursome of USB 2.0 ports are cute, but when sub-$400 netbooks are boasting SuperSpeed USB ports... well, let's just say it's about time Apple took notice. Unfortunately, Steve Jobs still seems to think that the newest iteration of the world's most popular port isn't going anywhere fast, so we're apt to see Thunderbolt pushed as the true USB 2.0 replacement. That doesn't mean we have to like it, though.


Given that it's the only new port onboard, it's worth mentioning that Thunderbolt is a fantastic addition to the array. The ability to daisy-chain monitors and peripherals off of it enables the bantam desktop to play grown-up in a few key ways. It'll handle vast display resolutions (up to 2,560 x 1,600; the HDMI socket tops out at 1,920 x 1,200) and outlandish storage solutions, and thanks to the revised CPU, it can more easily handle 'em with poise (more on that in a bit). It's also worth pointing out that the power supply is still internalized (huzzah!), leaving you with nary a power brick to fiddle with. Let's all breathe a simultaneous sigh of relief, cool?

Performance
We tested out the base mini -- a $599 rig with a 2.3GHz dual-core Core i5, 2GB of 1333MHz DDR3 memory, a 500GB (5,400RPM) hard drive and Intel's HD Graphics 3000 processor with 288MB of DDR3 SDRAM, which is shared with main memory. All things considered, that's a halfway decent spread for an MSRP that's $100 less than the base model of 2010, but alas, there's no optical drive to pay for, either. Whisking about Lion and handling mundane tasks (we're looking at you, Office) was a breeze, though we confess to getting a little impatient when waiting for heavier applications to load for the first time. Bootup routinely took right around 45 seconds from off to usable, and there's no question that an SSD swap would do wonders for the general snappiness of the system.


We also noticed a bit of slowdown after having Photoshop, Word, Firefox, Chrome, TweetDeck and Lightroom open for around three hours. We're pinning that on the lowly 2GB of RAM; granted, we were intentionally pushing it, but those hoping to get creative work done on a mini will certainly want to invest in a few more gigs (and a speedier disk drive). Thankfully, 2GB proved plenty when playing back 1080p files, YouTube HD clips and anything we could find in Boxee / Hulu.

On the gaming front, the results were downright impressive. We fired up Half-Life 2: Episode 2, turned the details to "High" and cranked the resolution to 1,920 x 1,200 to natively fill our 24-inch panel. The result? A consistent 31 frames per second. Granted, that title isn't exactly the newest in the stack, but this at least confirms that light-duty gaming with your favorites from yesteryear is indeed possible. Turning to XBench and Geekbench -- staples in the world of OS X benchmarking -- we found similarly impressive stats. This particular system scored 291.21 (overall) / 228.84 (CPU) / 400.30 (Thread Test) on the former, while notching 5,919 on the latter. For comparison's sake, the mid 2010 Mac mini scored 3,385 on Geekbench, proving that the Core i5-infused newcomer is leaps and bounds more powerful in terms of raw number crunching.

The so-called HTPC factor...
Like it or not (Apple's firmly on the 'not' side from what we can gather), the Mac mini looks like it'd be the ideal home theater PC. It's tiny, beautiful, and it supports insanely high resolutions and just about any HDTV / monitor you could think of. It's also a dream come true for heavy Boxee users and iPhone owners; just toss up the overlay and allow your phone to handle the controls. It couldn't be simpler, and if you're able to find an easy solution like this that negates the need for a dedicated mouse and keyboard, you might just be in heaven. It's also worth noting that regardless of how hard we pushed this thing, it simply refused to get even a notch above 'warm,' and the fan noise was practically inaudible from ten feet out.

But here's the rub. While we're able to forgive the mini for not having room for a TV tuner (internally, at least), the sudden and unwarranted departure of the optical drive is downright baffling. We know -- too many people will simply write this off without a second thought, rationalizing it as Apple just killing off something that's on the way out, but it's a decision that we wholeheartedly disagree with. Losing the floppy drive when you have a smattering of other options is one thing; but spiking the optical drive? On a desktop computer? It's a terrible, terrible decision, and the truly ludicrous part is that Apple didn't even shrink the chassis to make up for it. As much as Apple would love to have you believe that nothing worthwhile will ever ship on a physical disc again, the HTPC argument alone rebukes that. Having access to Hulu, Boxee, iTunes and Netflix is just half of the story -- there aren't too many HTPC owners that never pay their local Redbox a visit.


Last year's mini could easily play back any DVD rental (read: the only reasonable way to get newer movies at home), install applications that shipped on physical discs, rip your CD collection, and even burn backup content and homemade movies. For whatever reason, Apple has decided that you won't need to do any of that with this year's mini, and the only consolation prize is a $100 discount at the register. Gee, thanks for the option. In reality, Apple spiraled off in the wrong direction here. Instead of downgrading the mini from optical drive to slotless, it should've swallowed its misplaced disdain for Blu-ray and finally offered the clear next-gen format victor as a build-to-order option. We can pay $600 (!) to swap in a 256GB SSD in what amounts to a mid-level desktop with no expansion options, but we can't pay $100 to throw in a Blu-ray drive in what's obviously a made-for-HTPC machine? It's not only senseless, it's laughable.


In case it's not crystal clear, Apple has made it effectively impossible for us to recommend this as a media PC, but those dead-set on making it one will be glad to find that multichannel audio output is supported over HDMI, and finding the proper resolution to fit one's TV is a lesson in simplicity. So, for those content with a streaming-only HTPC option, this one's about as gorgeous as they come, but we'd definitely recommend a phone-based remote option. Apple doesn't make a combination mouse / keyboard, and even the best of those tend to feel awkward in use. In all honesty, HTPC diehards are better off dropping $99 on an Apple TV and bidding the hassle adieu -- without an optical drive, we're struggling to see why one would pay an extra $500 for something that'll never leave the den.

Wrap-up
It's not often that Apple products take a turn for the worse when a new revision comes out, but there's no question that the design of 2010's mini is superior to the design of this guy. Sure, the revised edition is a heck of a lot more powerful and $100 cheaper, but it's in the same infelicitous spot that it's always been in: by the time you invest in a halfway decent keyboard, mouse and monitor, you're pushing $850+ for a mid-level machine with a sluggish hard drive, the bare minimum amount of RAM that we'd recommend for Lion, no USB 3.0 and no optical drive. For whatever reason, Apple's made the new mini even less useful than the last, and while a Benjamin off the sticker is appreciated, it hardly puts it in a new class in terms of value.

On the upside, OS X Lion is a superbly polished operating system, and the mini itself is easily the most stunning SFF PC on the market today. It's also eerily quiet, power efficient and cool, and it's everything the average college student or studio apartment dweller needs. Handling 1080p multimedia, basic video / photo editing and even gaming is no problem, but we just can't get over the paradoxes here. Apple dumbs down the back panel so the DIYers among us can't access the hard drive, but selling a computer without three essential peripherals (monitor, keyboard and mouse) ensures that the target market will be one that's at least remotely familiar with technobabble.


In isolation, the Mac mini is a fine computer. It's quick on its feet, and it's happy both beside your TV or in the office. As with all Macs, there's an elusive premium that comes with the overall software experience, and those placing a high value on OS X and the bundled iLife suite may find the compromises here acceptable. But imagining how stellar this bundle of joy could have been with a Blu-ray drive (or any drive) is an impossible vision to shake. Perhaps it's just getting more difficult to logically recommend a Mac desktop, particularly one that's underpowered for serious AV work and near impossible to upgrade. Apple has fine-tuned its laptop options in such a way that makes the revamped mini look underwhelming -- grandiose thoughts of an entry-level MacBook Air docked to a (reasonably priced) 27-inch Thunderbolt Display continue to find their way into our brains.

If you're still fixated on the beauty here, our honest recommendation is to pick up last year's model as it inevitably drops in price (and in turn, increases in value). We've been looking long and hard for an ideal use-case for this guy, and sadly, we've yet to find it.

Google+ traffic falls as users spend less time on site

Last week, comScore reported that Google+ hit 20 million unique visitors.
STORY HIGHLIGHTS
  • Total visits to Google+ declined about 3% to 1.79 million in the U.S.
  • Average time on the site was down 10% from 5 minutes, 50 seconds
  • Google+'s traffic had enjoyed a steady climb since its June 28 debut

(Mashable) -- After a running start, Google+'s growth may be slowing down a bit. A report from Experian Hitwise found both traffic and users' average time on the social network fell last week in the U.S.

Total visits to Google+ declined about 3% to 1.79 million in the U.S. for the week ending July 23 compared to the previous seven days, according to the research company.

The site received 1.86 million visits the prior week. Average time on the site was down 10%, from 5 minutes, 50 seconds to 5 minutes, 15 seconds.
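
For what it's worth, the percentages can be reproduced from the raw figures; a quick check in Python (our arithmetic, not Hitwise's):

```python
# Sanity-checking the reported week-over-week drops (our arithmetic).
visits_prior, visits_now = 1.86e6, 1.79e6
print(f"visits: down {(visits_prior - visits_now) / visits_prior:.1%}")
# -> down 3.8%, which the report rounds to "about 3%"

secs_prior = 5 * 60 + 50   # 5 minutes, 50 seconds
secs_now   = 5 * 60 + 15   # 5 minutes, 15 seconds
print(f"time on site: down {(secs_prior - secs_now) / secs_prior:.0%}")
# -> down 10%, matching the report
```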

Matt Tatham, a rep for Experian Hitwise, was careful not to overplay the findings. "This is not a huge drop," he says. The company extrapolates its figures from data from ISPs and from an opt-in panel of about 2.5 million users.

The report comes after Google+'s traffic enjoyed a steady climb since its June 28 debut. Last week, comScore reported that the network hit 20 million unique visitors.

Some were so enamored with Google+ that they closed out their Facebook accounts and moved all their activity to the new network.

What do you think? Has the novelty of Google+ worn off or is this just a blip on the road to world domination? Let us know in the comments.



Samsung Galaxy S II Sales Hit 5M Ahead of U.S. Debut

Samsung said it has sold 5 million Galaxy S II handsets in 85 days of availability in South Korea, Japan and some European countries, according to Yonhap News.

That speedy clip comes after Samsung sold 3 million of the Android 2.3 "Gingerbread" devices in 55 days and 1 million in less than a month in Korea alone.

The white-hot sales bode well for Samsung, which just launched the Galaxy S II in China but has yet to release the much-ballyhooed phones through carriers such as AT&T and Verizon Wireless in the U.S. The big mystery is: when will the new devices launch stateside?

Shin Jong-kyun, president of Samsung's mobile business and digital imaging, said at a media briefing July 19 the company would launch the Galaxy S II phones in the U.S. this August.

However, Samsung's U.S. contingent told eWEEK July 20: "Samsung Mobile politely declines to comment on the upcoming availability of the Galaxy S II in the U.S."

Meanwhile, Boy Genius Report snagged images of AT&T's Galaxy S II slider smartphone.

Other Galaxy S II handsets have proven to be thinner and lighter than the Galaxy S predecessors that sold over 10 million units in the U.S. in 2010. The new handsets also use 4.3-inch Super AMOLED Plus (Super active-matrix organic LED Plus) screens and are powered by 1.2GHz processors.

The phones also include an 8-megapixel rear-facing camera that captures video in 1080p, as well as a 2MP front-facing camera.

Samsung is believed to be targeting an August U.S. launch to get on retail shelves ahead of Apple's highly anticipated iPhone 5, which is expected to launch this September or October.

Samsung's urgency to get out the door in the U.S. may be appropriate. Experian's PriceGrabber said 35 percent of nearly 3,000 U.S. consumers surveyed online said they would buy the iPhone 5 upon its release.

Samsung may have sold between 18 million and 21 million smartphones worldwide from April through June, compared with 16.7 million for Nokia and 20.3 million iPhones, according to research firm Strategy Analytics via Bloomberg.



Nintendo Cuts Profit Forecast and Price of Key Product

TOKYO — In a striking reversal of fortune for the world’s largest videogame maker, Nintendo drastically cut its annual profit outlook and said it would discount its new 3DS handheld device as it struggles to stem a flow of users to casual online games.

Nintendo said Thursday that it had tumbled to a loss of 25.5 billion yen for the three months to June 30, as sales plunged 50 percent from a year earlier. The loss prompted Nintendo to lower its annual profit forecast 82 percent to 20 billion yen ($257 million) for the year to March, down sharply from a previous estimate of 110 billion yen. The company also slashed its annual sales forecast by 18 percent to 900 billion yen.

Nintendo had been banking on the 3DS, its first major new gaming system since its wildly popular Wii home console, to lock in a new generation of fans and bolster profits eroded by maturing sales. The handheld machine lets users play games that appear in 3-D, without the need for the clunky glasses that accompany most current 3-D technology.

But sales of the 3DS — which went on sale in February in Japan and in March in other parts of the world — have fallen short of expectations, hurt partly by the device’s unfortunate release date just before the devastating earthquake that struck Japan in March. The Kyoto-based company said it had sold just 710,000 units of the portable console in the three months to June, bringing the total number of units sold to 4.32 million. The company had said it was aiming to sell that many devices in just the first weeks following the product’s rollout.

The lackluster sales also reflect increasing competition for video game companies from new players that are redefining the industry. Apple has sold more than 200 million devices like the iPhone and iPad that let users choose from tens of thousands of games to download for a few dollars, or for free. Smartphones that run Google’s Android operating system also run simple, downloadable games. Casual games played within social networks like Facebook have taken off, too, as membership of those networks grows.

The new crop of casual games has added to the traditional rivalry between gaming systems developed by the videogames sector’s traditional top three: Nintendo, Sony and Microsoft. Nintendo has dominated the last generation of game consoles, selling over 146 million DS handheld game machines and 86 million Wii home consoles. Nintendo’s Japanese rival, Sony, is set to introduce a new portable game machine called the PlayStation Vita later this year, while Nintendo plans to sell a new home console in 2012.

But for now, players complain of a lack of game titles for the 3DS, a problem that plagues most new gaming systems. Nintendo said Thursday that two flagship titles for the 3DS — Super Mario 3D Land and Mario Kart — would go on sale in November and December, releases that are expected to improve sales of the device. But unless more consumers start buying the 3DS soon, third-party developers could be scared away from making games for the device, leading to a vicious cycle of fewer games released and fewer 3DS units sold.

Nintendo is hoping that a steep price cut will help kick-start sales. The 3DS will cost 15,000 yen in Japan from Aug. 11, down 40 percent from the original price of 25,000 yen, the company announced Thursday. In the United States, the price will drop the following day to $169.99 from $249.99. The company said that it expected the price cut to help it meet a previous sales forecast of 16 million 3DS machines by the end of March.
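
The scale of those cuts lines up with the underlying figures; a quick verification in Python (our own arithmetic, using the numbers reported above):

```python
# Verifying the cuts reported above (our arithmetic; figures from the article).
old_forecast, new_forecast = 110e9, 20e9   # yen, full year to March
print(f"profit forecast cut: {(old_forecast - new_forecast) / old_forecast:.0%}")
# -> 82%

old_jp, new_jp = 25_000, 15_000            # yen
print(f"Japan price cut: {(old_jp - new_jp) / old_jp:.0%}")    # -> 40%

old_us, new_us = 249.99, 169.99            # US dollars
print(f"US price cut: {(old_us - new_us) / old_us:.0%}")       # -> 32%
```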

In a letter posted online, Satoru Iwata, Nintendo’s president and chief executive, offered a profuse apology to Nintendo users, saying that lowering prices so soon after a game machine’s release was a painful move.

“Never in Nintendo’s history have we lowered prices to such an extent, less than half a year since the product launch,” Mr. Iwata said. “But we have judged that unless we move decisively now, there is a high possibility that we will not see many of our customers enjoying a Nintendo 3DS.”