Tuesday, July 8, 2008

Experiencing Vista Slowdown? Move to Linux

Experiencing a Vista slowdown? The solution is Linux, for performance and speed...!


Open Source Projects

Best open source software projects...

12 categories and 120 open source projects


The categories are:

Best Project
Best Project for the Enterprise
Best Project for Educators
Most Likely to Be the Next $1B Acquisition
Best Project for Multimedia
Best Project for Gamers
Most Likely to Change the World
Best New Project
Most Likely to Be Accused of Patent Violation
Most Likely to Get Users Sued
Best Tool or Utility for SysAdmins
Best Tool or Utility for Developers

Remote sign out and info to help you protect your Gmail account

Your email account can contain a lot of personal information, from bank alerts to love letters, that you don't always want other people to see. We understand how important your Gmail account is to you, so we're adding a new layer of information and control. With this new feature, you can now track your recent sessions, and you can also sign yourself out remotely.

If you are anything like me, you probably sign in to Gmail from multiple computers. I, for example, occasionally sign in to my Gmail account from a friend's house when I need to check an important email. Usually I remember to sign out, but every once in a while I wonder if I really did. Now I no longer have to wonder.


Friday, July 4, 2008

A series of experimental features including a game called Old Snakey have been added to Google Mail

Google is currently rolling out the new features to English-speaking users of Google Mail and the most popular ones will stand a chance of becoming permanent.

To find the new features you'll need to sign in, click on the Settings tab and then the Labs tab. You can then choose whether to enable or disable them.

Google product manager Keith Coleman outlined some of the new, experimental features.

"Some of them we've found really useful, like Quick Links, which lets you save searches and any other views in Gmail," he said.

Coleman said that those that prove popular will be kept, while the ones that aren't will be retired. There are 13 new features in all.

Five Best Windows Maintenance Tools

You download, create, delete, and move around countless files and endless piles of data on your PC every day. While your PC would ideally handle all of this data for you, it doesn't take long before you end up with a disorganized, cluttered computer. Hit the jump for an overview of the five best Windows maintenance tools, then cast your vote for the best of the bunch.
CCleaner
Any application called CCleaner where the first 'C' stands for Crap has to be good, right? Okay, maybe not, but this one is, thoroughly cleaning out your web browser, Recycle Bin and temporary files, registry, unnecessary third-party application trash, and oh so much more. Running CCleaner on your system promises to free up space, keep your computer running smoothly, and protect your privacy. It's also very fast and very easy to use.
Revo Uninstaller
Revo Uninstaller is a feature-rich replacement for the Windows default Add or Remove Programs feature (or Uninstall Programs in Vista). Why use Revo Uninstaller instead? Because it doesn't just uninstall a program; it also removes all traces of the program from your system where the basic uninstaller may not. Revo also boasts a Hunter Mode for uninstalling apps by dragging a cross-hair onto the app you want to uninstall, whether it's the app's system tray icon or just a shortcut. It even helps manage your startup applications.
Auslogics Disk Defrag
Auslogics Disk Defrag is a fast and effective disk defragmenter intended to replace the Windows default. It's quick and easy to use, and Auslogics DD provides a nicely formatted report of the latest defrag, including a motivating "Defragmentation has increased this computer performance by X%" message.
JkDefrag
JkDefrag is an open-source disk defragmenting tool. Unlike Auslogics DD, JkDefrag boasts command line support to allow you to set up your disk defrags on a schedule. Perhaps even better, JkDefrag has an installable screensaver that will automatically start defragging your hard drive whenever the screensaver is launched—it even displays the defrag process. Out of the box JkDefrag isn't the most attractive application you'll ever use, but when teamed with the previously mentioned JkDefrag GUI, it's just as attractive and easy to use as any defragger you'll find.
Spybot - Search & Destroy
Even if you're using one of the best antivirus applications available, you may still end up with some form of malware on your computer. Spybot-S&D will ensure that it isn't there for long. Apart from removing spyware, adware, dialers, keyloggers, and trojans, Spybot-S&D can also cover your usage tracks—like browsing or file history—to give you enhanced privacy on your computer.
Now that you've seen the most popular five Windows maintenance tools as chosen by your fellow readers, it's time to vote for the best.

Most data on enterprise networks rarely gets accessed after it is written to network storage

Statistically speaking, most data on enterprise networks rarely gets accessed after it is written to network storage, according to researchers from NetApp Inc. and the University of California (UC). Evidently, we are too busy writing new data to go back over old data.

Andrew Leung, a computer science researcher at UC, presented the findings at the USENIX conference in Boston last week. Given those results, organizations might want to consider moving much of their data to slower but less expensive storage units since it rarely gets accessed, he said.

The team studied the traffic that flowed through NetApp's enterprise file servers, which manage more than 22T of material relating to all aspects of the company's business operations.
Leung said the study is the first large-scale examination of network traffic patterns. "How people have been deploying network file systems has been changing over the past five to 10 years," he said. "They are being used more commonly for different kinds of things. So what we would like to know is how this affects the workloads of the network."

During the three-month period that the network was under scrutiny, more than 90 percent of the material on the servers was never accessed. The researchers captured packets encoded using the Common Internet File System protocol, which Microsoft Windows applications use to save data via a network. About 1.5T of data was transferred.

"Compared to the full amount of allocated storage on the file servers, this represents only 10 percent of data," Leung said. "[This] means that 90 percent of the data is untouched during this three-month period."

Moreover, among the files that were opened, 65 percent were only opened once. And most of the rest were opened five or fewer times, though about a dozen files were opened 100,000 times or more.

"What this suggests, in general, is that files are infrequently re-accessed," Leung said.
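The headline numbers above (90 percent of data untouched, 65 percent of opened files opened only once) are just simple ratios over per-file access counts. The sketch below recomputes figures of that shape from a made-up sample; the function name and the sample data are illustrative assumptions, not the NetApp/UC dataset or the researchers' actual tooling.

```python
# Illustrative only: summary statistics of the kind reported in the study,
# computed from hypothetical per-file open counts (NOT the real dataset).

def summarize(access_counts):
    """access_counts: one open count per file on the server.

    Returns (percent of files never opened,
             percent of opened files opened exactly once)."""
    total_files = len(access_counts)
    opened = [c for c in access_counts if c > 0]
    untouched_pct = 100.0 * (total_files - len(opened)) / total_files
    opened_once_pct = 100.0 * sum(1 for c in opened if c == 1) / len(opened)
    return untouched_pct, opened_once_pct

# 90 of 100 files never opened; of the 10 opened, 7 opened exactly once
sample = [0] * 90 + [1] * 7 + [3, 5, 2]
untouched, once = summarize(sample)
print(f"untouched: {untouched:.0f}%, opened once: {once:.0f}%")
# prints "untouched: 90%, opened once: 70%"
```

On this toy sample the same two percentages fall out directly, which is all the study's headline figures are: counts of cold files and once-opened files over the totals.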
The team also observed that the ratio of data being read from storage versus the amount of data written to storage has changed from what had been seen in previous studies. Bytes written compared to bytes read by a ratio of about 2-1. "Past read-write ratios saw read-to-write ratios of 4-1 or higher," Leung added.

Developers of file systems might want to take into consideration the fact that their creations are spending almost as much time writing data as reading data. "The workloads are becoming more write-oriented, so the decrease in read-only traffic and the increase in write traffic suggests that file systems want to be more write-oriented," Leung said.

File server vendors also might want to consider re-jiggering their pre-fetching and caching algorithms to improve performance, given those findings. "If we know that files aren't frequently re-accessed, what this suggests is that [caching] algorithms may not be the best for network file systems" because the material cached will probably not get retrieved, he said.
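Leung's point about caching can be made concrete with a toy simulation (my own sketch, not the researchers' methodology): an LRU cache is useless when every file is opened only once, because nothing is ever requested a second time, while the same cache performs well on a workload that re-reads a small hot set.

```python
# Toy sketch: LRU cache hit rate under an "open once" workload versus a
# hot-set workload. Workload shapes are assumptions for illustration.
from collections import OrderedDict

def lru_hit_rate(requests, capacity):
    """Simulate an LRU cache of `capacity` entries; return the hit rate."""
    cache, hits = OrderedDict(), 0
    for key in requests:
        if key in cache:
            hits += 1
            cache.move_to_end(key)          # mark as most recently used
        else:
            cache[key] = True
            if len(cache) > capacity:
                cache.popitem(last=False)   # evict least recently used
    return hits / len(requests)

# 1000 distinct files, each opened once: no request can ever hit the cache
print(lru_hit_rate(list(range(1000)), capacity=100))  # prints 0.0

# the same 100 files opened 10 times each: only the first pass misses
print(lru_hit_rate(list(range(100)) * 10, capacity=100))  # prints 0.9
```

The contrast between 0 percent and 90 percent hit rates under the same cache is exactly why a workload that is 90 percent never-reaccessed data leaves little for conventional caching to exploit.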

Speaking to Government Computer News after the presentation, Leung described the 10 percent of data that was being re-accessed. Typically, it is in the file format most closely associated with the user's job. Architects might use computer-aided design files, while developers use source-code files. Also, files that are higher up in a file path or closer to the user's home directory tend to be accessed more often than those buried deeper down in a hierarchy of subfolders.

More than 75 percent of the files being opened were very small -- less than 20K each -- although another 12 percent were more than 5G each.

Free Web app security scanner

Google claims that Ratproxy is quick and less intrusive than other security scanners
Google has released for free one of its internal tools used for testing the security of Web-based applications.
Ratproxy, released under an Apache 2.0 software license, looks for a variety of coding problems in Web applications, such as errors that could allow a cross-site scripting attack or cause caching problems.
"We decided to make this tool freely available as open source because we feel it will be a valuable contribution to the information security community, helping advance the community's understanding of security challenges associated with contemporary web technologies," wrote Google's Michal Zalewski on a company security blog.

Ratproxy -- released as version 1.51 beta -- is quick and less intrusive than other scanners in that it is passive and does not generate a high volume of attack-simulating traffic when running, Zalewski wrote. Active scanners can cause problems with application performance.

The tool sniffs content and can pick out snippets of JavaScript from style sheets. It also supports SSL (Secure Socket Layer) scanning, among other features.

Since it runs in a passive mode, Ratproxy highlights areas of concern that "are not necessarily indicative of actual security flaws. The information gathered during a testing session should be then interpreted by a security professional with a good understanding of the common problems and security models employed in web applications," Zalewski wrote.

Google has posted an overview of Ratproxy as well as a download link to the source code. Code licensed under the Apache 2.0 license may be incorporated in derivative works, including commercial ones, but the origin of the code must be acknowledged.

Weak web application security continues to embarrass companies, potentially causing the loss of customer or financial data.

A 2006 survey by the Web Application Security Consortium found that 85.57 percent of 31,373 sites were vulnerable to cross-site scripting attacks, 26.38 percent were vulnerable to SQL injection and 15.70 percent had other faults that could lead to data loss.

As a result, security vendors have moved to fill the need for better security tools, with large technology companies acquiring smaller, specialized companies in the field.
In June 2007, IBM bought Watchfire, a company that focused on Web application vulnerability scanning, data protection and compliance auditing. Two weeks later, Hewlett-Packard said it would buy SPI Dynamics, a rival of Watchfire whose software also looks for vulnerabilities in Web applications as well as performing compliance audits.

The worst time to send an email is between 5pm and 6pm on Thursdays

The worst time to send an email is between 5pm and 6pm on Thursdays, research has found.

Web performance monitoring firm Epitiro said that an estimated seven per cent of emails are either severely delayed or lost altogether during this 'rush hour'.
This is due to the number of people using the web at that particular time, Epitiro said.

And it's not just the day of the week or time of day that can affect email delivery times - the time of year is also a factor too.

Epitiro found that in the autumn months, email delivery times got slower, with eight per cent taking longer than three minutes to arrive.

"Email delivery speed is a very important factor in internet communications, not least for the credibility of an online business," said Gavin Johns of Epitiro.

"When customers shopping online are told to await email confirmation of a purchase, they expect this to arrive in no more than a couple of minutes. If an email takes longer than three minutes to arrive, many customers will worry that something has gone wrong," Johns continued.

Thursday, July 3, 2008

Accessing Information is not Acquiring Knowledge

It used to be traditional to blame bad decisions on a lack of foresight and risk-averse leadership. Both reasons have at their core a lack of information that cumulatively led to the bad decision being taken in the first place.

But with the advent of advanced computing power, and the networking enabled by the Internet, this excuse of a lack of information no longer holds water. The information is there, collected in sometimes repetitive, overlapping cycles.

The issue is therefore no longer a lack of information, but one of accessing and finding the right information at the right time, delivered to the right target. The challenge is to match an information need with an appropriate resource.

This challenge focuses our attention on two key aspects:

(1) Accessing information - covering the technology dimension. How do we ensure that a decision-maker, in need of information to choose between alternatives and take a decision, is provided with that information? The question is one of technology - how can a decision-maker access information quickly and efficiently?

(2) Finding information - covering the management dimension. With the advent of the Internet comes a new expression - information overload - an overwhelming volume of information delivered without sufficient justification for its dissemination. How can information be managed better - packaged better - to facilitate effective decision-making? Does a decision-maker need a 200-page folder detailing an entire activity in order to decide to initiate it? Or is a one-pager with a bulleted list giving the salient points sufficient?

Knowledge is a construct that is created in the mind of the user, as a result of the cycle of accessing, processing and understanding information.

But providing and ensuring access to information will not complete the knowledge cycle. Quite clearly, it is the opportunity given to the user to add value to information that leads to the generation of knowledge and understanding. Thus, along with providing access to information lies the need to create a two-way flow of opportunities to generate knowledge.

On one hand are value-adding opportunities for the user to contribute experiences, insights and related information to the information being accessed. On the other are opportunities to contextualize and localize the information being accessed to the environment within which the user works.

It is this value-adding, interactive give-and-take that leads to the generation of real knowledge.
Hari Srinivas