Monday, August 15, 2011

Windows Phone 7 App Store Still Lags Behind Apple, Android

Microsoft has bragged about its growing app store for Windows Phone 7 devices, but the store's offerings are still puny compared with Apple's or Google's. The Windows Phone 7 Marketplace has roughly a tenth as many apps as the Android Market and roughly a thirtieth as many as Apple's App Store. Will Microsoft ever catch up?

Microsoft says that of the 11,500 apps now in the Windows Phone 7 Marketplace, 7,500 are paid apps, submitted by around 36,000 developers since the store's introduction. Some analysts predict Windows Phone 7 will boom in the coming years, overtaking the iPhone by 2015, but the outlook for its app store, at least, is not that bright.

For example, Apple reached the 100,000 apps milestone for its store in 15 months. Microsoft will be hard-pressed to hit that mark by the time it celebrates the 15-month anniversary of the Windows Phone Marketplace in January 2012.

The gap between Microsoft's and Apple's app stores grows even wider when you look at the more than 350,000 apps in Apple's store: the App Store carries more than 30 times as many apps as Microsoft's.

Even Google, which has lately been activating more Android devices than Apple sells iPhones, is having trouble catching up with the iOS App Store. The Android Market has around 150,000 apps, fewer than half as many as iOS. The contrast between Google and Apple is starker still on tablets, where Android has only dozens of Honeycomb apps against more than 60,000 for the iPad. Even so, the Windows Phone 7 Marketplace has roughly one-thirteenth as many apps as the Android Market.
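Taking the article's own counts at face value, the multiples quoted above are easy to check:

```python
# App counts cited above (mid-2011 figures).
wp7_apps = 11_500
android_apps = 150_000
ios_apps = 350_000

print(f"Apple vs Windows Phone 7:   {ios_apps / wp7_apps:.1f}x")
print(f"Android vs Windows Phone 7: {android_apps / wp7_apps:.1f}x")
print(f"Apple vs Android:           {ios_apps / android_apps:.1f}x")
```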

Meanwhile, Microsoft is taking the playground route in its fight against Apple's App Store supremacy: telling on them to the authorities. Microsoft started the fight in January, when it asked the U.S. Patent and Trademark Office to deny Apple's 2008 trademark application for the term "App Store." Apple is fighting back against claims that the term is too generic to trademark, and says Microsoft should know better.


Friday, August 5, 2011

Mr Bean blitzes BBC's Top Gear track


Car-loving comedian Rowan Atkinson has become the fastest celebrity driver on the popular motoring show Top Gear.

Making his long-awaited debut on the latest episode of the show overnight, Atkinson lapped the track more than half a second faster than the previous record holder.

Atkinson's lap of 1:42.2 bested fellow comedian John Bishop's 1:42.8, and was significantly faster than the previous high-profile record holder, Tom Cruise, whose time was 1:44.2.

"It is remarkable because when Bishop did that time and it was so much faster than Tom Cruise we thought it would never be beaten," host Jeremy Clarkson said on the show.

Atkinson is an amateur racer in historic events in the UK, and his driving style - unspectacular but smooth and quick - drew praise from Clarkson.

"What is interesting is all your times were very consistent, as is the mark of a great racing driver," Clarkson said. "Turns out you're in the wrong career."

The Mr Bean actor's passion for cars is well known, but this was his first appearance on Britain's most famous car show. The publicity-shy star said he had been reluctant to appear because, as a competitive racer, he knew people would expect him to be quick.

He currently races an historic Ford Falcon but has previously raced an Aston Martin V8 Zagato and a Renault 5 GT Turbo.

He has written for several leading car magazines and is very particular about choosing the right car to match his various characters. For his latest movie Johnny English Reborn he had Rolls-Royce build a custom car with a unique V16 engine.

His collection of road cars includes a McLaren F1, the one-time fastest car in the world, which he had lent to Top Gear for a previous episode.

He was also famously involved in an accident with the car when he ran into the back of a Rover.

He revealed on the show he has driven more than 65,000km in the three-seater V12-powered machine.

Face-matching with Facebook profiles: How it was done


Facebook's online privacy woes are well-known. But here's an offline one: its massive database of profile photos can be used to identify you as you're walking down the street.

A Carnegie Mellon University researcher today described how he assembled a database of about 25,000 photographs taken from students' Facebook profiles. Then he set up a desk in one of the campus buildings and asked willing volunteers to peer into Webcams.

The results: facial recognition software put a name to the face of 31 percent of the students after, on average, less than three seconds of rapid-fire comparisons.

In a few years, "facial visual searches may become as common as today's text-based searches," says Alessandro Acquisti, who presented his work in collaboration with Ralph Gross and Fred Stutzman at the Black Hat computer security conference here.

As a proof of concept, the Carnegie Mellon researchers also developed an iPhone app that can take a photograph of someone, pipe it through facial recognition software, and then display on-screen that person's name and vital statistics.
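Under the hood, that identification step is essentially a nearest-neighbor search over face "embeddings" - numeric feature vectors extracted from each photo. Here is a minimal sketch using made-up four-dimensional vectors and an arbitrary distance threshold; real systems use learned embeddings with far more dimensions:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, gallery, threshold=0.6):
    """Return the name of the closest enrolled face, or None if no
    gallery vector is within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, vec in gallery.items():
        d = euclidean(probe, vec)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Toy gallery of "enrolled" profile-photo embeddings (invented values).
gallery = {
    "alice": (0.1, 0.9, 0.3, 0.5),
    "bob":   (0.8, 0.2, 0.7, 0.1),
}

print(identify((0.12, 0.88, 0.31, 0.52), gallery))  # close to alice's vector
print(identify((0.5, 0.5, 0.5, 0.9), gallery))      # no confident match
```

The threshold is what trades off false positives against misses; as the article notes, the larger the gallery, the more likely some unrelated vector falls under it by chance.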

This has "ominous risks for privacy" says Acquisti, an associate professor of information technology and public policy at the Heinz College at Carnegie Mellon University. Widespread facial recognition tied to databases with real names will erode the sense of anonymity that we expect in public, he said.

Another test compared 277,978 Facebook profiles (the software found unique faces in about 40 percent) against nearly 6,000 profiles extracted from an unnamed dating Web site.

About 1 in 10 of the dating site's members--nearly all of whom used pseudonyms--turned out to be identifiable.

Facebook isn't the only source of profile data, of course. LinkedIn or Google+ might work. But because of its vast database and its wide-open profile photos, Facebook was the obvious choice. (Facebook's privacy policy says: "Your name and profile picture do not have privacy settings.")

Facial recognition technology, which has been developing in labs for decades, is finally going mainstream. Face.com opened its doors to developers last year; the technology is built into Apple's Aperture software and Flickr. Google bought a face-recognition company in the last few weeks, and Facebook's automated photo-tagging has drawn privacy scrutiny.

In the hands of law enforcement, however, face recognition can raise novel civil liberties concerns. If university researchers can assemble such an extensive database with just Facebook, police agencies or their contractors could do far more with DMV or passport photographs--something that the FBI has been doing for years. (The U.S. Army partially funded the Carnegie Mellon research.)

Acquisti is the first to admit that the technology isn't perfect. It works best with frontal face photos, not ones taken at an angle. The larger the database becomes, the more time comparisons take, and the more false-positive errors arise.

On the other hand, face recognition technology is advancing quickly, especially for nonfrontal photos. "What we did on the street with mobile devices today will be accomplished in less intrusive ways tomorrow," he says. "A stranger could know your last tweet just by looking at you."


Sunday, July 31, 2011

Study: Dumb People Use Internet Explorer


Here comes the flame war. According to a new report, dumb people are more likely to use Internet Explorer than smart people. It's a finding so apparently defamatory that the company responsible for the statement is allegedly being threatened with a lawsuit by inflamed Internet Explorer aficionados.

Online psychometric testing company AptiQuant, based in Canada, turned its analytical skills on a group of more than 100,000 individuals in an effort to determine the IQ scores associated with users of various Web browsers. Over a period of around four weeks, the company administered a Wechsler Adult Intelligence Scale (WAIS) test to users looking for free online IQ assessments, then recorded the result and browser used for every participant above the age of 16.

Across the board, the average IQ scores for users of Internet Explorer versions 6 through 9 were all lower than those recorded for Firefox, Chrome, Safari, Camino, and Opera users. Humorously enough, those running Internet Explorer with the Chrome Frame plug-in actually ranked third among the browsers listed. Opera users reported the highest average score, hovering in the 120-to-130 range - well above the WAIS test's population mean of 100 (with a standard deviation of 15).

AptiQuant's report notes that the only statistically significant difference in IQ scores was between Internet Explorer users and everyone else. There was no significant difference among users of the non-IE browsers, even though those users, in aggregate, reported a higher average IQ than IE users.
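"Statistically significant" here refers to a comparison of group means; one standard tool for that is Welch's two-sample t-statistic. A minimal sketch with invented summary numbers (the report's raw data is not reproduced here):

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t-statistic for two independent samples
    described by their summary statistics."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Hypothetical group summaries - illustrative only, not AptiQuant's data.
t = welch_t(mean1=99.0,  sd1=15.0, n1=20_000,   # IE users
            mean2=110.0, sd2=15.0, n2=20_000)   # non-IE users
print(round(t, 1))
```

With samples this large, even a modest gap in means produces a huge t-value, which is why a difference can be "significant" without being dramatic.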

"In addition, the results were compared to another unreleased study of a similar nature undertaken in year 2006. The comparison clearly suggests that more people on the higher side of IQ scale have moved away from Internet Explorer in the last 5 years," reads the report.

Although AptiQuant gets a little heavy-handed against Internet Explorer in its report, suggesting the "nuisance" browser should be "eradicated," the company has been quick to note that its findings indicate only a one-way relationship between IQ scores and browser use - a caution perhaps prompted, in part, by the alleged threats of a lawsuit from upset Internet Explorer fans.

"I just want to make it clear that the report released by my company did not suggest that if you use IE that means you have a low IQ, but what it really says is that if you have a low IQ then there are high chances that you use Internet Explorer," said CEO Leonard Howard.

For more from David, follow him on Twitter @TheDavidMurphy.

Friday, July 29, 2011

Save your friends from outdated email

You’ve probably already improved the lives of your friends and family members by helping them switch to Gmail, but what about that one friend who still hasn’t made the switch? It’s time to take a stand and stage an intervention. 

How to stage an intervention: follow these three simple steps.

1. Select a friend from your contacts

2. Create a customized email

3. Send it to your friend

ShareMeNot: Firefox plugin takes the tracking out of social media buttons

Did you know that buttons like these allow Facebook, Google, LinkedIn, and others to track your online browsing activities on every site that includes one of these buttons, even if you never click the buttons and (in some browsers) even if you have third-party cookies disabled?

Students in the University of Washington's Computer Science department have created "ShareMeNot," a Firefox add-on that defangs social media buttons like the Facebook "Like" button (and others) so that they don't transmit any information about your browsing habits to these services until (and unless) you click on them. That means merely visiting a page with a Like, Tweet, or +1 button (like this one) doesn't generate a data trail for the companies that operate those services, but you still get the benefit of the buttons: if you click them, they still work. Smart.

ShareMeNot is a Firefox add-on designed to prevent third-party buttons (such as the Facebook “Like” button or the Twitter “tweet” button) embedded by sites across the Internet from tracking you until you actually click on them. Unlike traditional solutions, ShareMeNot does this without completely removing the buttons from the web experience.
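The core idea can be sketched as a request filter: block widget requests to known tracker domains until the user clicks that widget, which whitelists its domain. The domain list and matching below are illustrative, not ShareMeNot's actual implementation:

```python
from urllib.parse import urlparse

# Domains whose embedded widgets can track page visits
# (an illustrative subset, not ShareMeNot's real list).
TRACKER_DOMAINS = {"facebook.com", "twitter.com", "google.com", "linkedin.com"}

def should_block(request_url, clicked_domains):
    """Block third-party widget requests to tracker domains, unless the
    user has clicked that widget (which whitelists its domain)."""
    host = urlparse(request_url).hostname or ""
    domain = ".".join(host.split(".")[-2:])  # crude registrable-domain guess
    return domain in TRACKER_DOMAINS and domain not in clicked_domains

clicked = set()
print(should_block("https://www.facebook.com/plugins/like.php?href=x", clicked))
clicked.add("facebook.com")  # user clicks the Like button
print(should_block("https://www.facebook.com/plugins/like.php?href=x", clicked))
```

A real extension would hook the browser's request pipeline and strip cookies rather than decide from URLs alone, but the until-clicked gating is the same.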

Thursday, July 28, 2011

A Touch Mouse’s Tale

How do you take a concept from research to product? In the case of the Microsoft Touch Mouse, it took a collection of prototypes, collaboration between transatlantic teams, and a lot of user testing. It also helps when the research that launched the project won the best-paper award during the Association for Computing Machinery’s 22nd Symposium on User Interface Software and Technology.

Mouse 2.0: Multi-Touch Meets the Mouse, a joint effort between Microsoft Research Redmond, Microsoft Research Cambridge, and Microsoft’s Applied Sciences Group, introduced five research prototypes, each exploring a different touch-sensing strategy that influenced the design of different mouse form factors and their interaction possibilities. The research featured extensive user feedback, as well as practical comparisons of different techniques for enabling multitouch on the desktop. The prototypes included three camera-imaging approaches, multiple optical sensors, and the use of capacitive sensors on a curved surface.

Members of the Mouse 2.0 research team expressed the hope that they would be able to refine their prototypes, both ergonomically and in terms of their sensing capabilities, and make a deeper exploration of the interaction techniques specific to this new class of input devices.

The researchers soon got an opportunity to refine their prototypes. Microsoft Hardware decided to get behind the research, and a team was formed to bring a multitouch mouse to market.

New Possibilities for the Humble Mouse

Hrvoje Benko, researcher with the Adaptive Systems and Interaction group at Microsoft Research Redmond, has worked on both the Mouse 2.0 research and the Microsoft Touch Mouse product-development project. He recalls one of the key product decisions: selecting from five prototypes the one that would be the launching point for the new device.

“In the end,” Benko says, “we selected the prototype using capacitive touch sensing to track the position of multiple fingers on its surface. This approach offered the most consistency and flexibility in terms of how we could mount and integrate the sensor, which is important in a small form factor. Plus, unlike camera-based tracking, there are no issues with ambient light, so you reduce the calibration issues. It’s a much more controllable sensor.”

Microsoft Touch Mouse
Microsoft Touch Mouse: the final form factor.

Although the choice of prototype simplified some of the technical issues, there were still plenty of challenges when it came to refining the mouse to the point where it was ready for consumer use. The design of the final form factor required sculpting and testing of hundreds of models. The team also examined user interactions and evaluated the kinds of gestures that made sense, developing an entire gestural set focused on enhancing window manipulation and management. At the same time, core technologies, such as firmware and hardware for capacitive sensing, had to be built and optimized for this specific form factor and device functionality.

“The gesture-recognition software is the brains behind all these interactions,” says John Miller, software architect with the Cambridge Innovation Development team at Microsoft Research Cambridge. “Our gestures are multitouch and designed to amplify your experience with Windows 7. So they are optimized for window management: docking, moving, minimizing and maximizing, going backward and forward on your webpage, switching between tasks, and so on.”
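Conceptually, the recognizer's output feeds a dispatch table from gestures to window-management commands. The gesture and action names below are hypothetical, purely to illustrate the structure, and are not the Touch Mouse's actual gesture set:

```python
# Hypothetical gesture-to-action table (names invented for illustration).
GESTURE_ACTIONS = {
    ("one_finger", "flick_up"):    "maximize_window",
    ("one_finger", "flick_down"):  "minimize_window",
    ("one_finger", "flick_left"):  "dock_window_left",
    ("thumb", "flick_left"):       "browser_back",
    ("three_finger", "flick_up"):  "task_switcher",
}

def dispatch(fingers, motion):
    """Map a recognized gesture to an action; anything unrecognized
    falls through to ordinary pointing."""
    return GESTURE_ACTIONS.get((fingers, motion), "pointer_move")

print(dispatch("one_finger", "flick_up"))
print(dispatch("two_finger", "wiggle"))  # unrecognized: normal pointing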

Getting the Right Touch

Benko and Miller agree that one of the toughest problems they tackled was the requirement that users should be able to operate the device using classic point-and-click interactions, as well as the newly developed set of multitouch gestures. The mouse form itself added complications: The shape encourages a user to rest both palm and curled fingers on the entire touch-sensitive surface, creating constant contact.

“That made everything much, much harder,” Benko smiles ruefully. “Instead of making palm rejection and other issues easier, it added a few more challenges. But at the end of the day, our goal was to have a comfortable, great-looking mouse that people enjoy using, with a nice look and feel that support the gestures, so it was definitely worth the effort.”

Unlike touch-screen devices, on which one or two clear touches make user commands easy to interpret, a small sensor surface and the nature of mouse usage create an entirely different set of problems.

“If you have a touch-sensitive phone,” Miller explains, “you interact by touching the screen, and as soon as you’re done, your finger lifts off the screen. We have completely different issues with the mouse. We have a device that not only has to support gesture touches, but also has to deal with times when the user is just holding it.

Gestures for controlling Touch Mouse
A core technical challenge: developing a gesture set that enables clear differentiation between various types of user contact with the touch surface.

“Next, fingers can be very close together when making contact. To the sensor, they can appear as one finger rather than multiple fingers. But if you want to have reliable gesture recognition, you need a way to differentiate between one, two, or three fingers. We had to develop technology that enhances signal processing and reliably tracks contacts.

“And here is one more example,” Miller continues. “Everybody holds the mouse in a slightly different way. Some people hold their fingers flat on the mouse, and people with very small hands will hold the device differently than people with very large hands. So the mouse does not make contact the same way for all users, and they are all going to be performing these gestures in a slightly different way. As a result, there’s a lot more ‘noise’ to handle than from a touch-screen phone or a Tablet PC. We had to deal with a lot more data.”
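The finger-counting problem Miller describes boils down to finding connected regions of above-threshold readings on the capacitive grid: each region is one contact. A toy sketch (grid size, threshold, and readings are all made up):

```python
def count_contacts(grid, threshold=3):
    """Count connected regions of above-threshold capacitance readings
    in a small sensor grid - each region is one finger contact."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]

    def flood(r, c):
        stack = [(r, c)]
        while stack:
            y, x = stack.pop()
            if 0 <= y < rows and 0 <= x < cols and not seen[y][x] and grid[y][x] >= threshold:
                seen[y][x] = True
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]

    contacts = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and not seen[r][c]:
                contacts += 1
                flood(r, c)
    return contacts

# Two fingers close together, separated by one low-reading column:
frame = [
    [0, 5, 5, 0, 6, 6],
    [0, 5, 5, 0, 6, 6],
    [0, 0, 0, 0, 0, 0],
]
print(count_contacts(frame))  # two distinct contacts
```

The hard case Miller describes is when that separating column of low readings disappears and two fingers merge into one blob, which is why the team needed better signal processing than a plain threshold.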

To mitigate some of these problems, the team set a design goal that gestures should be both intuitive and distinct—the kind that would be hard for a user to perform by accident. This helped simplify the job of the recognition software.

They also developed a tool that recorded sensor data while human testers were using the mouse for actions such as pointing and clicking, multitouch gesturing, and grabbing and releasing the mouse.

“We ended up with data examples of good gestures for mouse usage and unintentional movements,” Benko says, “and this helped us conceive strategies for distinguishing between intentional gestures and incidental movements. It’s what allowed us to develop an engine that’s able to recognize some movements and ignore others.”
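One simple way to use such labeled recordings - purely an illustration, not the team's actual engine - is a nearest-centroid classifier over summary features of each movement:

```python
def centroid(samples):
    """Mean vector of a list of equal-length feature tuples."""
    dims = len(samples[0])
    return tuple(sum(s[i] for s in samples) / len(samples) for i in range(dims))

def classify(features, centroids):
    """Label a movement by its nearest class centroid. The features here
    are an assumed pairing of (path_length_mm, duration_ms)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(features, centroids[label]))

# Made-up recordings: deliberate flicks are long and quick; incidental
# finger shuffles while gripping the mouse are short and slow.
training = {
    "gesture":    [(30.0, 150.0), (28.0, 120.0), (35.0, 180.0)],
    "incidental": [(3.0, 400.0), (5.0, 600.0), (2.0, 500.0)],
}
centroids = {label: centroid(samples) for label, samples in training.items()}

print(classify((32.0, 140.0), centroids))  # looks like a deliberate flick
print(classify((4.0, 550.0), centroids))   # looks like incidental contact
```

A production recognizer would use many more features and a stronger model, but the principle is the same: learn from labeled examples which movements to act on and which to ignore.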

Collaboration Delivers a Quality Product

The Microsoft Touch Mouse project is unusual compared with other hardware-development projects, because it is not simply about hardware. Rather, it is a product that combines multiple disciplines in a tightly integrated way, a task that would have been impossible without close collaboration between multiple Microsoft Research and Microsoft hardware-development teams in different locations.

Decisions about the final product, for example, involved testing and evaluation of different prototypes and features by all parties.

Hrvoje Benko and John Miller
Touch Mouse collaborators Hrvoje Benko (left) and John Miller.

“There were a lot of concepts from the original research,” Benko says, “and some of those we decided to leave out. That doesn’t mean they were bad ideas, just that we were being very careful about our choices. It’s how making a product works: You assess the pros and cons of every choice. Both the research and hardware teams were focused on nailing down the core experience, to make sure that everything we included was critical and didn’t distract from the user’s task. Our goal was to deliver a delightful, fluid desktop experience.”

Even though the multitouch-mouse project officially belonged to the hardware team, Microsoft Research remained integral to the development.

“The original Mouse 2.0 paper was just the starting point,” Miller says. “The research efforts didn’t stop there. They continued in tandem with product development. There was a lot of additional research from different parties before we could turn the multitouch-mouse concept into a device that consumers can buy off the shelf.”

For Benko and Miller, one of the most rewarding aspects of this project has been the close collaboration between the hardware team and Microsoft Research in both Cambridge and Redmond. It went beyond technology transfer and was absolutely critical to delivering a successful transition from research prototype to consumer product.

The Microsoft Touch Mouse proves that quality research doesn’t have to address technologies that are many years away from commercialization. Sometimes, it’s about exploring new possibilities. There’s always room for a better mousetrap—make that, a better mouse.

Microsoft's MS-DOS is 30 today

MS-DOS is 30 years old today. Well, kind of. On 27 July 1981, Microsoft gave the name MS-DOS to the disk operating system it acquired on that day from Seattle Computer Products (SCP), a hardware company owned and run by a fellow called Rod Brock.

SCP developed what it at various times called QDOS and 86-DOS to run on a CPU card it had built based on Intel's 8086 processor.

MS-DOS 1.19

Command line: MS-DOS 1.19 still running after all these years

The company had planned to use Digital Research's CP/M-86 operating system, then still in development. But, having released the card in November 1979 - it shipped with an 8086-compatible version of Microsoft's Basic language interpreter-cum-operating system - and having reached April 1980 without CP/M-86 becoming available to bundle, SCP decided it had to create its own OS for the card.

Enter, in August 1980, QDOS. The name really did stand for Quick and Dirty Operating System, and that's exactly what it was: a basic but serviceable OS good for coding and running programs written in 8086 assembly language - the x86 instruction set. It was written by SCP's Tim Paterson, who had joined the company as a programmer a couple of years earlier and began work on the OS in April 1980.

Some observers later claimed that QDOS too closely resembled CP/M for comfort. Paterson himself would later say that QDOS' design criteria specifically included the ability to support programs written for CP/M and compiled for the 8086. That's not at all surprising, given that SCP undoubtedly saw QDOS as a temporary stand-in until Digital Research (DR) shipped CP/M-86.

The picture we have today is muddied by claims that IBM originally wanted to use CP/M-86 in its first personal computer. IBM and DR famously failed to come to terms that would allow CP/M-86 to be bundled with the PC, and IBM turned to Microsoft for an alternative. Digital Research founder Gary Kildall, who died in 1994, would later allege that Microsoft's product was a rip-off, fuelling plagiarism claims that Paterson has always denied; he maintains he reverse engineered it.


The competition: CP/M-86 in action
Source: Wikipedia

Update My fellow Reg hack Andrew Orlowski points out that, no matter what Paterson says, the US court ruled against the programmer in a defamation lawsuit he brought against publisher Little Brown for claiming the origins of QDOS were not clear-cut.

Back in 1980, Paterson continued to evolve QDOS through the year, the OS being renamed 86-DOS - it was now evidently no longer viewed as a rough-and-ready stand-in - between September and December 1980. Accounts differ as to when the name - and the OS' status - was switched, but December is the date Paterson himself gave during a Softtalk interview published just a few years later.

'Hi, it's Microsoft. Say, can we license your OS?'

It's at this point that Microsoft re-enters the picture, acquiring from SCP a licence to market and sell 86-DOS, paying $25,000 for the privilege. Microsoft was now working with IBM in place of DR - the two had been partners since November 1980 - to supply the operating system for the hardware giant's first personal computer, but it kept IBM's identity hidden from SCP and Paterson until it acquired the OS in its entirety the following year.

"We all had our suspicions that it was IBM that Microsoft was dealing with," Paterson would later say, "but we didn't know for sure."

MS-DOS Advert

Microsoft would later advertise MS-DOS' claimed superiority to CP/M-86

Microsoft had been in contact with SCP ever since the latter asked to use its Basic, so it would have been aware of SCP's work on QDOS, the operating system's design goals and its convenient compatibility with CP/M-86. Microsoft would have seen how closely QDOS matched the product it had been commissioned to supply to IBM, and its ties with SCP would have helped it gain that initial re-distribution licence.

You can read a copy of the 86-DOS Programmer's Manual (PDF) here.

By July 1981, Microsoft had sufficient understanding of IBM's plans - and the vision to conceive of what the personal computer market might become - to consider not merely licensing 86-DOS but buying it outright from SCP, for a further $50,000 - $75,000 in total, $180,000 (£112,000) in today's money. SCP was allowed to continue to offer the OS with its own hardware. Paterson had already quit SCP, in April 1981, to join Microsoft the following month.

Seattle Computer Products DOS diskettes

Seattle Computer Products' DOS
Source: Ty's Computer Interest Website

"So on 27 July, 1981, the operating system became Microsoft's property," Paterson said in the 1983 Softtalk interview. "According to the deal, Seattle Computer can still see the source code, but is otherwise just another licensee. I think both companies were real happy. The deal was closed just a few weeks before the PC was announced. Microsoft was quite confident."

In August 1981, Big Blue introduced what would eventually become known as the IBM PC, though it was originally the 5150. It was based on the Intel 8088 CPU, a lesser - but cheaper - version of the 8086 that used an 8-bit external bus rather than the 8086's 16-bit bus.

Paterson came with his operating system, and stayed with Microsoft for a year while 86-DOS was honed into MS-DOS 1.0, released as a standalone product early in 1982. He left in March 1982, after the completion of MS-DOS 1.25, but would later return (twice) to Microsoft, where he would go on to work on Visual Basic. He eventually formed his own hardware company, Paterson Technology, though his blog now lists his status as retired.

MS-DOS 3.2 box

Microsoft boxes up MS-DOS 3.2
Source: Hugepedia

Now 55, Paterson continues to blog about QDOS' development, emphasising the reasons for its CP/M friendliness while stressing its under-the-hood differences.

MS-DOS triumphant

From July 1981, SCP continued to sell the operating system it had created, now calling it Seattle DOS and bundling it with its hardware products. It continued to do so until 1985, by which time it was clear buyers wanted complete systems, and cheap ones - whether from IBM or the many 'cloners' whose products were compatible with IBM's technology.

MS-DOS Advert

Microsoft advertises DOS in 1983
Source: Fraggle UK at Flickr

Brock now sought to sell his rights to MS-DOS, a scheme with which Microsoft was not best pleased and said its agreement with SCP did not permit. Brock sued, and the case went to trial in the last few months of 1986. Brock and Microsoft quickly came to an out-of-court arrangement, however: Brock sold his licence to Microsoft for $925,000, leaving the software giant in complete ownership of the OS.

Through this time, Microsoft was releasing version after version of MS-DOS, each mirrored by a release of IBM's IBM-DOS and, later, PC-DOS, as its take on the OS came to be called.

Other versions appeared, tweaked by PC manufacturers using Microsoft's OEM kit to more closely fit the specifics of their hardware. Many would run software developed for the IBM PC, others would not, though they would run generic MS-DOS-compatible applications.

CP/M-86 was eventually released, in 1981, and subsequently offered by DR as a third-party alternative to MS-DOS. As you can see from the ad above, Microsoft saw it as a threat. DR's OS was bundled with a number of IBM PC rivals, from the likes of Apricot and Siemens.

You can view the source code for CP/M-86 - and other versions of the OS - here.

In May 1988, CP/M-86 was effectively re-released as DR-DOS and pitched more directly as an alternative to MS-DOS itself than to IBM's PC-DOS.

DR-DOS found many supporters but failed to dent Microsoft's market share. Microsoft quickly established the technique of announcing new MS-DOS features well ahead of their appearance - an approach previously thought capable only of killing sales of the current version. Instead, it kept buyers away from rival offerings, and it has since become a common tactic among highly competitive tech companies.


MS-DOS gets upgraded, kind of

Meantime, MS-DOS continued to evolve, gaining a graphical user interface of sorts with version 4.0, disk compression tech with version 6.0, and FAT32 support with version 7.1.

Version 4.0 should have been the final release - even Microsoft said so, announcing in 1987 that "DOS is dead" and that we should all be using OS/2, jointly developed by IBM and Microsoft, though the latter stepped away from it when Windows 3.0 became a huge success. That's another story.

Microsoft's work on DOS eventually took the OS to version 8.0, the release used for Windows XP boot discs. With that release, on 14 September 2000, MS-DOS development formally came to an end, though significant work stopped some years earlier with MS-DOS 5.0 when it ceased to be offered as a standalone product. ®

Is Apple declaring war on DVDs?

Apple is pulling the DVD drive from its Mac mini. Are we moving towards a future of only streaming video?

In its new product announcement last week, Apple rolled out a lot of new features - including significantly faster processors and greater expandability for its MacBook Air and Mac mini lineups. But Cupertino also quietly took something out of its lineup (besides the vanilla MacBook, that is): the Mac mini is now missing its DVD drive.


They say you need at least two data points to draw a trend, and now we have them: the MacBook Air has never had an optical drive, and now that the mini's has disappeared as well, it likely means the company is eyeing a future in which media doesn't come on a DVD - or a CD-ROM or Blu-ray disc, for that matter.

For a lot of companies – and a lot of users – the move to a discless world makes a lot of sense. It’s easier for both parties to deal in digital downloads – as opposed to the comparatively byzantine process of burning software to physical media, packaging it, and shipping it around. And the exclusion of an optical drive allows computers to be that much smaller, lighter, and less expensive.

This isn’t the first time Apple’s been in this position, either. Back in 1998, the company introduced the original iMac without a floppy drive, pulling the plug on a technology that was still considered standard. (In hindsight, that was probably a good call, though Apple’s move caused quite an outcry at the time.)

For a lot of people, though, it really is too soon to ditch the discs. Let’s assume that Apple will continue to remove optical drives throughout its laptop and desktop lines, as it did with the floppy drive: this is probably an unwelcome scenario to anyone hoping to watch a DVD on an airplane.

There’s also the home theater crowd to consider. The previous generation Mac Mini, with its flexible display options and DVD drive, gained acclaim as a near-perfect media player (just hook it up to an HDTV and you’re good to go!). But now that the drive is gone, the retribution from home-theater enthusiasts is swift. Over at tech site Engadget, Nilay Patel cited the lack of DVD support as his biggest gripe with the machine, saying, "The Mac mini looks like it'd be the ideal home theater PC … [but] having access to Hulu, Boxee, iTunes and Netflix is just half of the story -- there aren't too many HTPC owners that never pay their local Redbox a visit."

There’s no reason to think that Apple will bring back optical drives in the future, which means it’s also unlikely that it’ll ever introduce Blu-ray drives in Macs. Steve Jobs famously referred to Blu-ray as “a bag of hurt” back in 2008, and it’s worth pointing out that when Lion, the next iteration of the Mac OS X operating system, arrives in a physical format in August (it’s download-only for now) it’ll be on a USB stick, not a disc.

What’s your take on Apple’s move? Have you moved on from optical media already, or do you have a collection of discs that must now sit unplayed? Let us know in the comments section.

Mac mini review

For those familiar with last year's Mac mini, what you're peering at above isn't likely to strike you as jarring. Heck, it may even seem somewhat vanilla at this point. In truth, Apple did exceedingly little in terms of design changes with the mid 2011 Mac mini, but given the relatively recent cosmetic overhaul, it's not like we were genuinely expecting anything above a top-to-bottom spec bump. And that, friends, is exactly what we've received. The mini remains quite the curious beast in Cupertino's line -- it's the almost-HTPC that living room junkies are longing for, yet it's still a country mile from being the headless mid-tower that Apple steadfastly refuses to build. It's hardly a PC for the simpleton (given that it's on you to hunt down a mouse, keyboard and monitor), and it's actually taking a giant leap backwards on one particularly important front. Care to hear more? You'll find our full review just past the break.

Hardware and design
Make no mistake about it -- the mini is just gorgeous to look at. As with the prior model, this 2.7 pound slab of aluminum looks nicer than its price tag indicates, and it honestly feels more like a decoration than a computer. It's sized at 7.7 x 7.7 x 1.4 inches, exactly the same as its predecessor, and outside of the chromed Apple logo on the top, a matte black strip of ports on the rear and a similarly hued lid on the bottom, it's a clean sweep of brushed silver. It'll sit nicely on its edge for those contemplating a vertical installation, but the protruding lid on the bottom makes it a little less elegant for those applications.

Speaking of the rear, the dozen connectors found there aren't cosmetically different from those on the last build. From left to right, you'll find an AC input, gigabit Ethernet jack, FireWire 800 port, HDMI (full-size), Thunderbolt, four USB 2.0 sockets, an SDXC slot, an audio input and a 3.5mm headphone port. Funny enough, last year's DisplayPort socket looks identical to this year's Thunderbolt connector, and not surprisingly, DisplayPort monitors and peripherals will happily fit themselves in with no adapters needed. For what it's worth, Apple does include an HDMI-to-DVI adapter, but oddly, no Thunderbolt dongle. Sure, we know those cables are laced in gold, but what better way to encourage adoption of a new I/O port than to toss in an appendage for newcomers? Even a DisplayPort / Thunderbolt-to-HDMI or DVI cable would've been greatly appreciated -- making it simple to hook up dual displays right from the get-go would have seriously tickled our fancy.

Tinkerers are bound to love that bottom lid... and then grow frustrated by what's underneath; a simple twist reveals a WiFi module, cooling fan, two SODIMM slots and plenty of other, not-easily-accessible components. Our test unit came with a pair of 1GB memory modules, but even the greenest DIYer could swap those out for more sizable ones -- a couple of snaps and a tug is all it took. Unfortunately, we're still miffed at Apple's decision to keep the HDD away from a user's fingertips. If we had our druthers, the RAM wouldn't be the only thing that's just a few clips away, but alas, we're stuck with what we've got.

We shouldn't have to chide Intel and Apple (and whoever else wants to claim responsibility) for not having USB 3.0 on Macs in the year 2011, but regretfully, we are. A foursome of USB 2.0 ports are cute, but when sub-$400 netbooks are boasting SuperSpeed USB ports... well, let's just say it's about time Apple took notice. Unfortunately, Steve Jobs still seems to think that the newest iteration of the world's most popular port isn't going anywhere fast, so we're apt to see Thunderbolt pushed as the true USB 2.0 replacement. That doesn't mean we have to like it, though.

Given that it's the only new port onboard, it's worth mentioning that Thunderbolt is a fantastic addition to the array. The ability to daisy-chain monitors and peripherals off of it enables the bantam desktop to play grown-up in a few key ways. It'll handle vast display resolutions (up to 2,560 x 1,600; the HDMI socket tops out at 1,920 x 1,200) and outlandish storage solutions, and thanks to the revised CPU, it can more easily handle 'em with poise (more on that in a bit). It's also worth pointing out that the power supply is still internalized (huzzah!), leaving you with nary a power brick to fiddle with. Let's all breathe a simultaneous sigh of relief, cool?

We tested out the base mini -- a $599 rig with a 2.3GHz dual-core Core i5, 2GB of 1333MHz DDR3 memory, a 500GB (5,400RPM) hard drive and Intel's HD Graphics 3000 processor with 288MB of DDR3 SDRAM, which is shared with main memory. All things considered, that's a halfway decent spread for an MSRP that's $100 less than the base model of 2010 -- though, of course, there's no optical drive to pay for, either. Whisking about Lion and handling mundane tasks (we're looking at you, Office) was a breeze, though we confess to getting a little impatient when waiting for heavier applications to load for the first time. Bootup routinely took right around 45 seconds from off to usable, and there's no question that an SSD swap would do wonders for the general snappiness of the system.

We also noticed a bit of slowdown after having Photoshop, Word, Firefox, Chrome, TweetDeck and Lightroom open for around three hours. We're pinning that on the lowly 2GB of RAM; granted, we were intentionally pushing it, but those hoping to get creative work done on a mini will certainly want to invest in a few more gigs (and a speedier disk drive). Thankfully, 2GB proved plenty when playing back 1080p files, YouTube HD clips and anything we could find in Boxee / Hulu.

On the gaming front, the results were downright impressive. We fired up Half Life 2: Episode 2, turned the details to "High" and cranked the resolution to 1,920 x 1,200 to natively fill our 24-inch panel. The result? A consistent 31 frames per second. Granted, that title isn't exactly the newest in the stack, but this at least confirms that light-duty gaming with your favorites from yesteryear is indeed possible. Turning to XBench and Geekbench -- staples in the world of OS X benchmarking -- we found similarly impressive stats. This particular system scored 291.21 (overall) / 228.84 (CPU) / 400.30 (Thread Test) on the former, while notching 5,919 on the latter. For comparison's sake, the mid 2010 Mac mini scored 3,385 on Geekbench, proving that the Core i5-infused newcomer is leaps and bounds more powerful in terms of raw number crunching.

The so-called HTPC factor...
Like it or not (Apple's firmly on the 'not' side from what we can gather), the Mac mini looks like it'd be the ideal home theater PC. It's tiny, beautiful, and it supports insanely high resolutions and just about any HDTV / monitor you could think of. It's also a dream come true for heavy Boxee users and iPhone owners; just toss up the overlay and allow your phone to handle the controls. It couldn't be simpler, and if you're able to find an easy solution like this that negates the need for a dedicated mouse and keyboard, you might just be in heaven. It's also worth noting that regardless of how hard we pushed this thing, it simply refused to get even a notch above 'warm,' and the fan noise was practically inaudible from ten feet out.

But here's the rub. While we're able to forgive the mini for not having room for a TV tuner (internally, at least), the sudden and unwarranted departure of the optical drive is downright baffling. We know -- too many people will simply write this off without a second thought, rationalizing it as Apple just killing off something that's on the way out, but it's a decision that we wholeheartedly disagree with. Losing the floppy drive when you have a smattering of other options is one thing, but spiking the optical drive? On a desktop computer? It's a terrible, terrible decision, and the truly ludicrous part is that Apple didn't even shrink the size of the chassis to make up for it. As much as Apple would love to have you believe that nothing worthwhile will ever ship on a physical disc again, the HTPC argument alone rebukes that. Having access to Hulu, Boxee, iTunes and Netflix is just half of the story -- there aren't too many HTPC owners that never pay their local Redbox a visit.

Last year's mini could easily play back any DVD rental (read: the only reasonable way to get newer movies at home), install applications that shipped on physical discs, rip your CD collection, and even burn backup content and homemade movies. For whatever reason, Apple has decided that you won't need to do any of that with this year's mini, and the only consolation prize is a $100 discount at the register. Gee, thanks for the option. In reality, Apple spiraled off in the wrong direction here. Instead of downgrading the mini from optical drive to slotless, it should've swallowed its misplaced disdain for Blu-ray and finally offered the clear next-gen format victor as a build-to-order option. We can pay $600 (!) to swap in a 256GB SSD in what amounts to a mid-level desktop with no expansion options, but we can't pay $100 to throw in a Blu-ray drive in what's obviously a made-for-HTPC machine? It's not only senseless, it's laughable.

In case it's not crystal clear, Apple has made it effectively impossible for us to recommend this as a media PC, but those dead-set on making it one will be glad to find that multichannel audio output is supported over HDMI, and finding the proper resolution to fit one's TV is a lesson in simplicity. So, for those content with a streaming-only HTPC option, this one's about as gorgeous as they come, but we'd definitely recommend a phone-based remote option. Apple doesn't make a combination mouse / keyboard, and even the best of those tend to feel awkward in use. In all honesty, HTPC diehards are better off dropping $99 on an Apple TV and bidding the hassle adieu -- without an optical drive, we're struggling to see why one would pay an extra $500 for something that'll never leave the den.

It's not often that Apple products take a turn for the worse when a new revision comes out, but there's no question that the design of 2010's mini is superior to the design of this guy. Sure, the revised edition is a heck of a lot more powerful and $100 cheaper, but it's in the same infelicitous spot that it's always been in: by the time you invest in a halfway decent keyboard, mouse and monitor, you're pushing $850+ for a mid-level machine with a sluggish hard drive, the bare minimum amount of RAM that we'd recommend for Lion, no USB 3.0 and no optical drive. For whatever reason, Apple's made the new mini even less useful than the last, and while a Benjamin off the sticker is appreciated, it hardly puts it in a new class in terms of value.

On the upside, OS X Lion is a superbly polished operating system, and the mini itself is easily the most stunning SFF PC on the market today. It's also eerily quiet, power efficient and cool, and it's everything the average college student or studio apartment dweller needs. Handling 1080p multimedia, basic video / photo editing and even gaming is no problem, but we just can't get over the paradoxes here. Apple dumbs down the back panel so the DIYers among us can't access the hard drive, but selling a computer without three essential peripherals (monitor, keyboard and mouse) ensures that the target market will be one that's at least remotely familiar with technobabble.

In isolation, the Mac mini is a fine computer. It's quick on its feet, and it's equally happy beside your TV or in the office. As with all Macs, there's an elusive premium that comes with the overall software experience, and those placing a high value on OS X and the bundled iLife suite may find the compromises here acceptable. But imagining how stellar this bundle of joy could have been with a Blu-ray drive (or any drive) is an impossible vision to shake. Perhaps it's just getting more difficult to logically recommend a Mac desktop, particularly one that's underpowered for serious AV work and near impossible to upgrade. Apple has fine-tuned its laptop options in such a way that makes the revamped mini look underwhelming -- grandiose thoughts of an entry-level MacBook Air docked to a (reasonably priced) 27-inch Thunderbolt Display continue to find their way into our brains.

If you're still fixated on the beauty here, our honest recommendation is to pick up last year's model as it inevitably drops in price (and in turn, increases in value). We've been looking long and hard for an ideal use-case for this guy, and sadly, we've yet to find it.