Linux FUD Pattern #9: Microsoft will sue you if you use Linux
Warning! Using Linux will expose you to legal action by Microsoft! At least that’s what some would like for you to believe. Many months of news articles have focused on this issue, which is why it is on my Top 10 List of Linux FUD Patterns. Users beware!
I’ll See You In Court!
Nothing instills fear like a lawsuit, and nothing prevents Microsoft from filing one against Linux contributors, distributors and users. The fact is, in the United States, you can file a civil suit against anyone for just about anything. Of course, court cases must have some basis in reality or they will never see a day in court, and there is also the risk of the plaintiff being counter-sued for bringing a frivolous lawsuit.
The legal threat posed by Microsoft is not so open-ended. Barring specific actions such as breach of contract, the legal issue that worries (potential) Linux users the most is patent infringement. This isn’t your run-of-the-mill negligence case either; patent infringement is a matter of federal law.
Patent law is codified in Title 35 of the United States Code. §271(a) begins by setting a broad scope of application for infringement: making, using, offering to sell, or selling a patented invention without authority. That pretty much covers all contributing programmers, users and both commercial and non-profit distributors.
Of course, there are conditions that nullify an infringement claim. The most obvious and most important is Prior Art, also known as novelty. §100-§105 describe the patentability of inventions, and §102 specifies some of the conditions under which a patent is not valid, including prior use of an invention by another party. Also, a defendant named in an infringement case may be able to prove that he is actually an “earlier inventor” of a method as described in §273(b), which renders the patent unenforceable against that defendant.
Microsoft vs. Linux
Microsoft has claimed that Linux violates approximately 235 patents. The company has reportedly “chosen” to not sue, and the rationale for this choice has been the topic of much speculation. Microsoft has not revealed the details of the violations, including the identifying numbers of the violated patents.
Lack of merit in the claim is probably the reason most people believe Microsoft has not filed – in other words, Microsoft is bluffing. Perhaps Microsoft knows that the patents are not enforceable for one reason or another, but it also knows full well that it retains power derived from fear so long as it can make threats that sound credible. If the claim does lack merit, that power would diminish rapidly once a case is brought against the first defendant. Either the patents would be found to be unenforceable (e.g. prior art would be proven), or legal action against one defendant would prompt the Linux community as a whole to adapt quickly. Details of the suit would provide the vital information required to ensure that Linux complies with all patents going forward.
Besides using fear as a way to dissuade conversion to Linux or to encourage conversion away from it, another possible strategy might be to besiege Linux. By presenting a constant threat and keeping the Linux community guessing, Microsoft may be trying to drain the time and other resources of the Linux community. Court costs aside, doing patent research and verifying that no rights were violated takes time and can be expensive. Also, Linux developers may spin their wheels fixing problems that might not actually exist, giving Microsoft more time to improve competitive features in its own OS.
Even if no action is planned, Microsoft cannot allow itself to gain a reputation for not defending its own patents. I have heard in the past that a failure to defend a patent may be considered abandonment or an implied license, but I cannot find any information in the legal literature to support this claim. Some may be confusing patents with trademarks in this regard – failure to use or defend a trademark against infringement may result in the loss of trademark registration. Nonetheless, it would not behoove Microsoft strategically to allow the abuse of legitimate patents.
Don’t Worry, Use Linux
Here are some good reasons why Linux users should not worry too much about being sued.
Cost-Benefit. The decision for a company to file suit is ultimately a business decision, which means that the benefits of any legal action would have to outweigh the costs. Lawsuits are not cheap and the payoff for suing individuals for a few hundred dollars each for lost profits would probably not be worth the trouble. Defendants must be named, which means Microsoft would have to specifically identify Linux users, requiring a lot of paid hours of research.
Damage to brand. Suing those whom you wish to be your customers is probably a very bad idea. Not only does it alienate those being sued, but it looks very bad in the eyes of other customers. Ultimately, it might cost Microsoft more in lost profits than what it was able to recover through lawsuits. Apple, IBM and Sun, on the other hand, may be very happy with this outcome indeed!
Prior art. As mentioned above, the use of an invention prior to the grant of a patent exempts the defendant. Much of Linux is based on other Unix variants and I’m certain the code looks very similar. DOS appeared on the scene in 1980-1981 and Windows became available for the first time around 1985. The first Unix was written in 1969. Don’t forget that Microsoft released an x86 Unix variant called Xenix in the 1980s, but eventually sold the rights to this OS to the Santa Cruz Operation, predecessor of the ne’er-do-well SCO Group.
Of course, contributors and distributors are much easier targets on all of these points, but if it were just that easy, I’d think we’d have seen some major court action by now.
Linux FUD Pattern #8: Linux will void your warranty
Will the use of Linux void the manufacturer’s warranty of your computer hardware? This is one fear that prevents some people from making the leap to Linux, which is why it is on my Top 10 List of Linux FUD Patterns. The short answer is, it depends; however, there are steps that you can take to increase your probability of receiving service under a warranty.
What Is A Warranty?
A warranty is a seller’s obligation to provide a remedy when a product fails to meet the conditions of the warranty. The conditions and the remedies are specific to the warranty for a product, though some warranties are legally implied and need not be explicitly expressed. The Federal Trade Commission provides a very informative page describing warranties.
Conditions under which a buyer may exercise the right to receive a remedy usually concern attributes of product quality. Express warranties are an incentive to the buyer, because it shows that the seller is willing to stand behind its products and protect the consumer from unintended defects that arise during manufacturing or during normal use. The definition of normal (or “intended”) use may be specified in the conditions.
In the United States, Article 2 of the Uniform Commercial Code governs warranties, both express and implied. Sellers are legally limited in the extent to which they can disclaim warranties. Specific statutes are established at the state level. Moreover, the Magnuson-Moss Warranty Act of 1975 was enacted to make warranties more readily understood, but its application is limited to consumer (read: household) products.
Why Not Honor Warranty Claims?
Why would a company not want to honor a claim made against its product warranty? In a word: cost.
Warranties are considered to be contingent liabilities for financial accounting purposes. At the time of sale, a reasonable estimate of warranty costs can often be made. This also means that until the costs either are realized (e.g. warranty work is performed) or expire (e.g. at the end of the warranty period), the obligation to replace or repair the product in question impacts the financial health of the seller or manufacturer. Depending on the product, such obligations can be significant. Financially, expiration is much better than realization because it does not impact cash.
Moreover, the cost of troubleshooting and repairing a system with a nonstandard OS installed is higher than that for a standard configuration, because time must be spent either learning to work within the unfamiliar operating environment or working around it. It’s also much easier to determine when a problem is not a result of manufacturing or normal use when the technician is working within a known environment as it has been (pre)installed by the seller. If you replaced the software used by your automobile’s internal computer with a variety of your own, do you really think a dealership or even an independent mechanic is prepared (much less willing) to assist?
What’s worse, if a component of a product provides some sort of control over the use of the product for the purpose of maintaining or extending its useful life, then the replacement or modification of that component may cause harm to the product as a whole. This is obviously not a manufacturing defect and is unlikely to be considered “normal use”. The extent to which an OS fits this description depends on what functions the OS provides (e.g. temperature control).
Read Your Warranty!
Ultimately, the answer to the question lies in the language of the warranty itself. A statement of warranty is a legal document and the one shipped with your new PC was probably written with or by a lawyer. The specific conditions and remedies are contained therein. READ YOUR WARRANTY! This will be the primary source of coverage information should you decide to take a dishonored claim to the courts. If it is that important to you, read the warranty before you buy the computer and only buy a computer with a favorable warranty.
Rest assured, the company will probably steer clear of violations of implied warranty, which means that they will probably not refuse to replace items that pose grave safety hazards, such as exploding laptop batteries. Dealing with your non-standard OS is much less costly than a court settlement with your home insurance company or your estate. A motherboard or power supply that stops working altogether is not a grave safety issue and claims regarding these issues are subject to more scrutiny.
HP Case Study
An exegesis of each and every warranty provided by every PC manufacturer over time is far beyond what I can do here. But, since a Web search for ‘Linux’ and ‘warranty’ readily retrieves stories about Hewlett-Packard, and since my family has two HP laptops in the household currently, I decided to do a little research on their warranty specifically. Here’s what I found, followed by an account of my own experience with HP support.
The HP warranty is published online, so rather than quote what is on my warranty card, I thought it might be more useful for the reader to have access to the warranty disclosed publicly. In the first paragraph of the “Limited Warranty” section, the application of the warranty conditions is expressly limited to hardware products and specifically excludes software and non-branded peripherals. The section continues to explain the HP guarantee, the customer’s entitlement to receive hardware warranty service, and the conditions for repair or replacement.
The “Software Limited Warranty” section near the bottom of the warranty page explains that HP’s obligations are limited to defects in the removable media (i.e. floppies, CDs, DVDs, etc.) shipped with the product, and then, for only a period of 90 days. Of course, the chance that an average PC customer is actually going to use the recovery or installation CDs for a preloaded PC within 90 days, especially for the express purpose of testing the media for defects, is pretty remote – good thing the expectations weren’t set too high for software.
That section also explicitly disclaims support for “freeware operating systems and applications.” Yes, Linux is Open Source and not freeware, but then, the actual verbiage of the paragraph refers to “software provided under public license by third parties” and that would include an aftermarket installation of any GPL software.
So far, so good. Hardware is supported and software isn’t. Uh oh…
There is a possible out for HP in the “Customer Responsibilities” section. For “best possible support”, the customer must be able to “run HP diagnostics and utilities” and even allow HP to monitor using “system and network diagnosis and maintenance tools”. No doubt, these are compiled for Windows only. “I’m sorry, we cannot fix a problem that we cannot diagnose. Good bye.”
This doesn’t mean that HP will not support you, but it does provide them with a logical and reasonable excuse not to do so. Indeed, HP reportedly clarified in early 2007 that the installation of Linux does not affect the warranty of the hardware so long as the software is not the cause of the problem being fixed.
Here is a case in point. In recent months, HP issued a recall of specific models of the Pavilion laptop due to a BIOS problem. As I understand it, the problem had something to do with the computer’s ability to regulate temperature, so units would overheat. Battery problems and other component failures were extreme symptoms. My wife’s laptop was one of the models listed, so I called tech support to schedule a repair. During the course of the conversation, I told the representative that I would remove the hard drive prior to shipment, primarily because it contained sensitive data (which was true). I also mentioned that Linux was installed and that the hard drive would be of little value in the repair process. The rep said that removal of the hard drive was acceptable. The unit was fixed and returned without incident.
How To Protect Your Warranty
Based on my experience and research, here are a few things that you can do to help ensure warranty service:
Troubleshoot the problem. If you are tech-savvy enough to run Linux, you probably know a thing or two about computers. Troubleshooting problems is a science, not an art, and the more you isolate the problem to a specific component, the more leverage you have with the warranty organization.
Buy a second hard drive. The only evidence of a Linux install is on the hard disk (unless you’ve replaced the Windows case badge with a Linux one, of course). When you buy a new PC, set up the preinstalled system, register it, remove the hard drive and store it in a safe place should you need to run vendor-supplied diagnostics. Buy and install a second hard drive for your Linux install and go to town!
Retain possession of your hard drive. Do what I did and tell them that you will be sending the unit in for repair sans hard drive. Data security is a big deal, even more so if the data in question is your employer’s data! Besides, the worthy repair facilities have their own diagnosis disks to use in lieu of a customer’s drive. You need not mention Linux at all. Of course, you cannot expect them to repair or replace a hard drive if you do not furnish the broken one.
Play it safe. Do not use software or perform other system tweaks that have the potential to harm hardware if you want the warranty honored. It is very unlikely that a standard Linux distro will cause such harm, but Linux does provide much more software access to hardware components than do other consumer operating systems. You may be surprised how easy it can be for the experts to determine how a component burnt out and the probable reasons as to why.
|<< Go To Part 7||Part 9 Coming Soon >>|
This article contains information on warranties, but does not contain legal advice. Opinions expressed herein belong solely to the author. If you have a warranty issue that may necessitate legal action, please contact a lawyer.
Linux FUD Pattern #7: Linux software is always behind the curve
For those in the United States, I wish many happy returns to you on this Tax Day, April 15th, 2008. I finished compiling my tax return about a week ago…on paper. It’s not all that painful, really, and I actually prefer doing it on paper. I’ve tried tax software in the past, but going through the motions of reading the rules each year to see what’s changed and performing the calculations by hand give me a sense of control over what I am reporting and more confidence in the results. I do not do it as some form of corporal mortification nor does it have anything to do with the lack of tax software for Linux…ah, and that last point is a great segue into the next item on my Top 10 List, Linux software is always behind the curve.
The lack of personal tax software, like the lack of commercial games, is sometimes cited as a clear indication that Linux is not yet ready for ‘prime time’. To be honest, I haven’t the foggiest idea as to why this is such a ‘tell-tale’ sign. I mean, look at what is being asked for. Given the relative number of Linux users out there, writers of commercial tax software are unwilling to incur the costs of writing, testing, packaging and shipping their programs to office-supply stores nationwide – the demand is just too low.
An Open Source alternative is possible, but there are several reasons why this will probably not occur. First, programmers are usually not tax lawyers, and tax lawyers aren’t cheap. Second, U.S. tax laws are not like the laws of physics – they change every year and can involve unintuitive calculations. Third – and this is directly related to the previous two – if you get it wrong and cause other people to lose money, it is likely that these people will be very upset with you and they may do bad things to you – the risk is just too high.
I’ve considered writing a basic tax-prep script in Perl, not for distribution but for my own use. However, I figure that the time spent on the initial version alone would far exceed the amount of time I’d spend doing my taxes by hand from now until retirement age. There’s really no payoff.
Regarding Everything Else…
The perception that Linux is behind the times, always playing catch-up, stems from the fact that the development of Linux as a mainstream desktop environment has been, by and large, reactive. Open Source programmers have spent the last decade or so making sure that the average user has a nice user interface, a productivity suite, a feature-rich art program, a powerful web browser, wireless networking, digital camera connectivity, and at least a couple of games – in other words, nothing new. Is this really a surprise? After all, many people have developed many programs for other operating systems for many years. Linux is young by comparison and there have been plenty of growing pains in its childhood. Moreover, if developers don’t address these basic needs first, it won’t matter what awesome software they do develop, people won’t want to use the platform.
So where is the proactivity? For a start, there are the distros. The emphasis has shifted in the last few years from providing the distro with the most to providing the distro with the least. Trends in PC recycling and ultra-small and/or ultra-cheap hardware show that small footprint is in. Live CDs, Linux on a (USB) stick, virtualization and embedded systems are also hot areas of development. All of these efforts share a common goal: proving that Linux is effective in a variety of situations, that it is not a one-size-fits-all solution. Flexibility is good. And don’t forget about non-desktop applications, such as clustering, and the use of Linux in process-intensive scientific research and graphics-rendering.
Finally, I’ll mention another hot topic, media formats. Linux always seems to lag in this department. The primary reason is that most of these formats are proprietary, and the format designers do not have an incentive to open the specifications; indeed, there is a much stronger incentive not to open them. This is a very complicated topic which I would like to explore in more detail in the future, so stay tuned.
Show Me The FUD!
Where’s the FUD in all this? Obviously, there is some truth to this perception, no? Until Open Source developers use the Linux platform to completely change the way people approach a problem or to provide a solution to a problem never solved before, the perception will not (and logically should not) change.
On the flip side, I think that the “behind-the-curve” card is grossly overplayed. Sometimes, the FUDster doesn’t realize or completely ignores the existence of viable Open Source alternatives that bridge the alleged “gaps”. This may show up in print but more often occurs in online articles when there is either no reader commentary allowed or the author is hoping for a good flame war to help increase readership.
In some cases, the “gaps” are warranted, such as when a given function is becoming obsolete or was never demanded in the first place, yet the gaps are exploited to spread fear and uncertainty. For example, a FUDster could cite a lack of virus protection as a Linux weakness, and the uninformed masses could accept it as reality regardless of the actual need for virus protection on Linux. The impact need not be direct either. The lack of tax-prep software may cause some to question the usefulness of Linux as an “everyday” desktop environment, even if they do not purchase such software themselves. And, for the sake of asking, do we really need to reinvent every wheel? Some software concepts were bad to begin with, but do we really want to perpetuate them?
Linux is behind in some areas and ahead in others. That’s life. Before you are convinced that Linux is completely useless just because it can’t replicate something (like your paper tax forms), determine why the gap exists and if that gap is really a problem. As always, be suspicious of generalizations and do your homework!
|<< Go To Part 6||Go To Part 8 >>|
Linux FUD Pattern #6: Linux is low-quality software
Every once in a while, an article or post will appear, claiming that Linux is just not good enough for everyday use. The reason? Concerns over quality. Such ‘blog fodder can range from the sensationalist author’s “Is Linux Ready for Prime Time?” teaser to the rants of a disgruntled user whose experience with Linux was subpar. Neither contains anything resembling an objective approach to quality and neither results in a useful conclusion. That’s the topic of this sixth installment of my Top 10 List of Linux FUD patterns, the accusation that Linux is low-quality software. To recognize when FUD of this kind occurs, we must first have a working knowledge of quality measurement.
What is quality? There are several dictionary meanings, but when discussing software quality, the second definition in Merriam-Webster’s online dictionary seems to be the most applicable: a degree of excellence. Other dictionaries generally concur with only minor deviations in terms. Fine, but what does that really mean? The definitions of ‘excellence’, ‘excellent’ and ‘excel’ emphasize the superiority of the excellent, its ability to surpass others in some way. Moreover, by adding the word ‘degree’, M-W subtly introduces measurement. Therefore, quality as it applies to software is really a two-part activity: measurement of one or more attributes and the comparison of these measurements for the purpose of determining the degree to which the software excels.
Just off the top of my head, I can name three comparisons commonly used in software quality assurance: benchmarking, regression testing and expectations management.
In software benchmarking, attributes of a program are compared with the same attributes exhibited by its peers. Most benchmarking is metrics-based, measured numerically, and is often related to the relative speed of the program. The time it takes for a program to start up or the number of calculations a program can perform per unit of time are common examples. I consider feature list comparisons as types of benchmarks, albeit non-quantitative ones. Competing software packages that perform roughly the same function usually share a minimum set of features. For example, all “good” word processors are expected to have a spell checker. Of course, many factors, not just the number of features, must be considered.
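To make the metrics-based side of this concrete, here is a minimal timing-benchmark sketch in Python. The two competing “implementations” are made-up stand-ins for real programs, and the harness simply measures each one several times and compares the results:

```python
import statistics
import time

def benchmark(fn, runs=5):
    """Time a callable over several runs; return (min, median) in seconds.
    The minimum filters out scheduling noise; the median shows typical
    behavior."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return min(samples), statistics.median(samples)

# Two hypothetical programs performing "the same" task with different costs.
fast = lambda: sum(range(1_000))
slow = lambda: sum(range(100_000))

fast_min, _ = benchmark(fast)
slow_min, _ = benchmark(slow)
```

Real benchmarks would launch the actual programs (e.g. via `subprocess`) and control for caching and system load, but the measure-repeatedly-and-compare pattern is the same.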
Regression testing is a comparison of a program to itself over time, usually between builds, releases or other milestones in the evolution of a product. Usually, regression testing means testing unchanged functionality to determine if a program was inadvertently broken by a change to some supposedly-unrelated function (i.e. make sure it all still works). This is an example of a binary determination (working or broken); however, degradation in speed or capacity and unacceptable trends in various controls and tolerances may be detected as well, indicating programmatic problems in the code. Metrics that describe the development process provide valuable feedback, leading to process improvements that should ultimately improve the product either directly or indirectly.
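As a toy illustration of comparing a program to itself over time, the sketch below (the function, inputs, and baseline values are all invented for the example) checks current outputs against results recorded from a prior, known-good build; any mismatch signals a regression:

```python
def discount(price, rate):
    """The 'product' function under test: apply a percentage discount."""
    return round(price * (1 - rate / 100), 2)

# Baseline outputs captured from the previous, known-good release.
BASELINE = {
    (100.0, 10): 90.0,
    (19.99, 25): 14.99,
    (5.00, 0): 5.0,
}

def run_regression(fn, baseline):
    """Return the list of inputs whose output no longer matches baseline."""
    return [args for args, expected in baseline.items()
            if fn(*args) != expected]

failures = run_regression(discount, BASELINE)
# An empty list means unchanged functionality still works as before.
```

Frameworks automate the bookkeeping, but the essence of regression testing is exactly this: a stored expectation, re-checked after every change.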
I saved the best one for last, the management of users’ expectations. Don’t let the name fool you – it may not sound like a measurement-and-comparison activity, but the management of expectations involves constant gap analysis, which inherently necessitates measurement. The quality of a product is most often measured as the extent to which the product meets the needs of its users. This means that the end product must be compared to the requirements defined by the users, often traceable via some sort of design documentation. The requirements specification may have been created by explicitly documenting the needs of a readily-accessible user base, or by extrapolating the needs of a generally inaccessible user base through market research and analysis. This type of comparison is the most important of all because user requirements can potentially render both benchmarking and regression testing unnecessary. For more discussion on this topic, pick up any number of books on quality at the bookstore or library and see what the experts say regarding the importance of meeting the needs of customers.
Ok, so measuring quality means drawing comparisons of various kinds. Now what? Suppose you want to determine if a particular software program or package is good enough to use. This can actually be quite simple. The first step is to list your needs and wants, including how you expect the software to behave and how well you expect it to perform. The distinction between needs and wants is deliberate and necessary. If software lacks a function you need, you won’t use it, but the decision is different if it lacks something you simply want and yet is acceptable in every other way. These requirements may then be weighted or ranked, and measurement criteria defined, both qualitative and quantitative. Assuming that alternative products exist, select one or two of the competing programs to evaluate; this is called a “short-list”. Design and execute your tests, observe and measure outcomes, then weigh and rank the results. Several rounds of evaluation may be required if results are inconclusive, with the addition and/or refinement of requirements on each pass. At some point in the process, you will determine if the software meets your needs or if you would be better off with one of the competing products.
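The weighting-and-ranking step described above can be sketched as a simple decision matrix. The packages, criteria, weights, and scores below are purely hypothetical; unmet needs should disqualify a candidate outright before this scoring of wants even begins:

```python
def weighted_score(scores, weights):
    """Sum of (score x weight) over the weighted 'want' criteria.
    Assumes 'needs' were already screened: a missing need disqualifies."""
    return sum(scores[criterion] * w for criterion, w in weights.items())

# Hypothetical wants, ranked by importance (higher weight = more important).
weights = {"features": 3, "speed": 2, "docs": 1}

# Hypothetical 1-5 scores from your own evaluation of a two-item short-list.
candidates = {
    "Package A": {"features": 5, "speed": 2, "docs": 5},
    "Package B": {"features": 3, "speed": 5, "docs": 2},
}

ranked = sorted(candidates,
                key=lambda name: weighted_score(candidates[name], weights),
                reverse=True)
```

Changing the weights changes the winner, which is the point: the ranking encodes *your* requirements, not someone else’s benchmark.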
When comparing several programs or packages, it may be helpful to create classifications based on multiple criteria. For example:
Low-quality software will:
- crash or not start at all,
- contain calculations that return incorrect results,
- corrupt data,
- have components that don’t work as documented or don’t do anything at all,
- have little or no documentation,
- have poorly designed GUI layout, and
- have a poor or missing CLI or API.
Medium-quality software will have none of these ills and will often:
- have a consistent look and feel,
- include useful documentation, and
- have an intuitive layout and command options and other user-friendly qualities.
High-quality software will:
- sport outstanding UI considerations,
- have accurate and friendly data validation,
- contain no material defects in functionality (especially calculations),
- include fully- and easily-configurable options,
- have a full-featured CLI, and/or API, and
- include complete documentation with examples of use.
The type of software being evaluated often establishes quality criteria. For example, one of the most important attributes of desktop productivity packages for many users is a feature-rich user interface with an intuitive layout. Some processing speed can safely be sacrificed to achieve these goals, as the program is limited to the speed at which the user can provide input anyway. Contrast this with automated server software that must process many incoming requests very quickly, but because it is configured once and allowed to run in the background (“set it and forget it”), ease-of-configuration is of lesser importance. There are always exceptions though. Some desktop users may want an extremely basic interface with only core functionality while others are willing to sacrifice server software speed on low-traffic services in exchange for easy configuration and management. Notice, these exceptions are examples of how user requirements can trump benchmarking.
Of course, if you are the author of the software and not just a user, you probably already know that testing at multiple levels is not only desirable but almost always necessary. In an internal IT shop, developers unit test their modules, the software is often tested in an integrated environment that simulates or mirrors the ‘production’ environment, and finally, users perform functional testing and formally accept a product. For commercial software, internal testing is performed, possibly followed by alpha and beta testing performed by external users.
What About Linux?
So far, we’ve discussed quality in general, and absolutely nothing specific to Linux. Obviously, Linux is software too, so all of the points made above apply. Computer users have various needs and wants, requirements and expectations, regarding the operating system. These requirements can be translated into tests of quality for Linux just as they can for any other software.
I think that the ways in which Linux differs from other platforms, primarily in philosophy but also in more tangible respects, are arguably a major reason for the perception that Linux and its applications are low-quality software. For example, everyone’s needs and wants are different, and the Linux community strives to provide as much freedom as possible to the user in satisfying those requirements. To accomplish this, multiple distributions, window managers, and the like are offered; unfortunately, this tends to confuse the uninitiated into believing that using Linux means living with chaos. To make matters worse, producers of commercial software products focus on market share and go to great lengths to be the ‘only game in town’. While competition is supposed to foster innovation, the unfortunate after-effect is a reduction in the number of choices available to fulfill users’ requirements. It hurts when a product that meets a specific need is acquired by a competitor of the vendor, subsequently discontinued, and replaced with the new vendor’s surviving product which never fulfilled that need in the first place.
In my experience, another reason commonly cited for “poor quality” of Linux and Open Source Software in general stems from faulty logic, predicated on the old adage, “you get what you pay for”. If the software in question is sponsored by a software company, then it stands to reason that the company (a) probably knows how to develop software, (b) has adequate testing resources to do so and (c) has a reputation to protect. These companies cannot afford to build bad software. A track record for producing bad Open Source software could very easily bleed over to customers’ perceptions of the company as a software producer overall, impacting sales of existing and future commercial software packages. On the other hand, many Open Source applications are home-grown solutions, not supported by companies, but maintained and promoted through grass-roots efforts. The authors are individuals motivated to write quality programs because they themselves use them, and they are kind enough to share the fruits of their labor with the rest of us. While it is true that “quality costs”, development isn’t (economically) free either; so, just because an application is available without monetary cost to you doesn’t mean that it is without value.
Finally, Linux distributors, especially the "name brands" such as Ubuntu, SUSE and Red Hat, usually do a good, professional job of testing their products. Applications that run on the Linux platform vary more widely in the level of testing applied. Check the websites of these applications to determine how thoroughly the software is tested before each 'stable' release. See if the authors employ dedicated resources to test and/or engage end users in alpha and beta testing efforts. Third-party certification, though rare, is an invaluable tool for boosting end-user confidence.
Don’t believe blanket statements about the quality of software available for any particular platform unless they are backed by real data; unsubstantiated, grossly subjective claims are the hallmark of FUD. Instead, do your own research and evaluate software for yourself. Only you can determine whether an application meets your needs. Only you define quality.
Linux FUD Pattern #5: Linux is not secure
There are some out there who would like for you to believe that Linux is unsafe. What better way to instill fear than to form doubt in your mind about a system’s abilities to protect your data?
A reason for the supposed lack of security often cited in FUD is the origin and maintenance of Linux in the “hacker” community. The term “hacker” has evolved from a term of endearment to one associated almost exclusively with cybercrime. To say that Linux was created and is supported by hackers gives the impression that the OS and its related applications are riddled with built-in security holes, backdoors for gaining system access, spyware for purposes of identity theft, hidden network tools that help intruders cover their footprints as they travel from machine to machine through cyberspace, and any other sort of malicious software for various and sundry purposes. To “hack” no longer means to “tinker” or to “fiddle with”, but to “break into” and “cause harm”. The term may conjure mental images of a scene from a horror movie, an evil man with an axe about to hack his way through the door to the house protected by the dark of night. Such is the imagery used to spawn fear.
Let’s examine Linux security by answering two questions. Do security components exist? And, can they be trusted?
The components required to make a system secure depend on many factors, because different systems are used in different ways by different people. Moreover, a weakness in a system’s security may be mitigated by strengths in other compensating controls. There are some basic options that are commonly used to secure systems, all of which are available on Linux.
Password protected login is the hallmark form of authentication. It is easy to implement, easy to use, can be highly effective, doesn’t require additional/expensive hardware and the expectations and conventions surrounding it are already present in modern culture. Sure, there are more advanced biometric devices such as palm readers and retina scanners, but the relative cost in money and effort of implementing these safeguards for the average home user and for most business desktops is prohibitively high. There are two aspects to password security: the strength of the password itself, and the authentication scheme behind it. Password strength is the responsibility of the user, not the OS. Most Linux distros either require password protection or at least have it enabled by default. Usually, the passwords are protected on the local system by shadowing and various schemes such as Kerberos can be used to protect the transmission of login information over a network.
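The shadowing mentioned above rests on salted, one-way hashing: the system stores a digest of the password, never the password itself. Real shadow files use crypt(3) schemes rather than the code below; this is only a sketch of the principle, using Python's standard PBKDF2 function:

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Derive a salted, one-way hash of a password (PBKDF2-HMAC-SHA256)."""
    salt = salt or os.urandom(16)  # a random salt defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    # Re-derive the hash and compare; the stored digest never reveals the password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == digest

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Because the function is one-way, even an attacker who steals the stored digest must guess passwords and re-hash each one, which is exactly why password strength remains the user's responsibility.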
Related to password authentication is the file system permissions granted to users once they’ve logged in. Linux and Unix use file-based permissions, denoting how the owner, members of the owner’s primary work group and the “world” of users on the system can interact with each file or directory. Privileges do not cascade as they do with other operating systems that use Access Control Lists.
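The owner/group/world model described above is encoded in nine permission bits per file. A small sketch (the helper function here is illustrative, not a standard API) shows how those bits break down:

```python
import os
import tempfile

# Create a file and restrict it to owner read/write only (mode 0600).
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o600)
mode = os.stat(path).st_mode

def rwx(mode, who):
    """Render one permission triplet (owner, group, or other) as 'rwx' flags."""
    shift = {"owner": 6, "group": 3, "other": 0}[who]
    bits = (mode >> shift) & 0o7
    return "".join(c if bits & b else "-" for c, b in (("r", 4), ("w", 2), ("x", 1)))

print(rwx(mode, "owner"), rwx(mode, "group"), rwx(mode, "other"))  # rw- --- ---
os.remove(path)
```

This is the same information `ls -l` prints as `-rw-------`: three bits each for the owner, the owner's group, and everyone else.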
Network security is a broad topic encompassing the combined abilities of the OS, applications, network devices, administrators and users to detect and/or prevent a breach attempted across a network connection. A basic way to accomplish this is to disallow certain types of messages from reaching the computer; this function is usually performed by a firewall server or program that monitors network traffic and filters communications based on predefined rules. Most Internet traffic uses the TCP or UDP protocols, each of which provides 65,535 numbered "ports". These ports are similar to radio stations or TV channels; each application that needs to communicate does so over one or more ports. A port with a service listening on it is "open", and every open port is a potential avenue of attack. Port scans are a good way to determine whether a system has open ports that shouldn't be. Firewall capabilities are built into the Linux kernel, and several good front-end packages are available for configuration, monitoring and reporting purposes.
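The essence of a port scan is just an attempted TCP connection: if the connection succeeds, something is listening. This is a bare-bones sketch of that idea (real scanners such as nmap are far more sophisticated); it starts its own listener so the example is self-contained:

```python
import socket

def port_is_open(host, port, timeout=0.5):
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0  # 0 means the connection succeeded

# Bind a listening socket on an ephemeral port to simulate an "open" port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
open_port = listener.getsockname()[1]

print(port_is_open("127.0.0.1", open_port))  # True: something is listening
listener.close()
print(port_is_open("127.0.0.1", open_port))  # False: the port is now closed
```

As with sniffing, scanning hosts you don't own may be illegal; point tools like this only at your own machines.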
All of the safeguards discussed above constitute protection around the data. What about protection of the data itself? A data file can be encrypted, thereby changing the contents to an encoded, unreadable format. The content is usually restored using a key or a password. E-mail can also be encrypted prior to transmission. GNU Privacy Guard (GPG) is a Pretty Good Privacy (PGP) compliant application that implements public key cryptography on multiple OS platforms, including Linux. Of course, constantly having to decrypt and encrypt every individual data file before and after use would be painful; instead, entire file systems can be encrypted by the system, and several cryptographic file systems exist for Linux. It is also possible to create a loopback device, a file that can be mounted as an encrypted file system, similar to the commercial product Cryptainer LE by Cypherix.
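To see what "the key restores the content" means concretely, here is a deliberately toy symmetric cipher: a keystream derived by hashing the key with a counter, XORed with the data. This is strictly an illustration of the principle; it is not secure and is nothing like GPG's real algorithms, so never use it to protect actual data:

```python
import hashlib

def keystream(key, length):
    """Toy keystream: SHA-256 of key+counter, concatenated. NOT real cryptography."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key, data):
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = b"meet me at dawn"
ciphertext = xor_cipher(b"passphrase", secret)
print(ciphertext != secret)                             # True: contents are scrambled
print(xor_cipher(b"passphrase", ciphertext) == secret)  # True: the key restores them
```

For real work, reach for GPG or an encrypted file system; the point here is only that without the key, the stored bytes are meaningless.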
So, the components do exist. Now, the question remains, can these components be trusted?
FUDsters will argue that any security software for which the source code is freely available to the public is inherently not secure. This is based on the assumption that the source code will either reveal the secret functionality that makes the security software work or expose bugs in the security software itself that can be exploited as well.
First, if someone cannot open their source because they are afraid it may reveal secret functionality, then it wasn’t properly designed from the start. The worst-possible example of this is hardcoding passwords in programs, especially if they are scripts stored in clear text. Good security schemes, such as encryption, rely directly on information the user provides, and often make use of one-way functions.
Second, Open Source software is available for public scrutiny. If you cannot read and understand the code yourself, rest assured that there are many folks out there that can and do. Why? Because many businesses do actually use Open Source software and have everything to lose if they don’t test it out first. That being said, I consider many corporate “testimonials” sponsoring one OS or another based on security or other factors to be FUD, mainly because they often appear in paid advertisements and seldom reveal the details of tests performed to lead to such conclusions. Independent certification and research performed by government or other nonprofit entities are usually the most objective and reliable.
Aside from learning the code, another way to test an application’s security strength or to see if it transmits private data is to watch (or “sniff”) the port on which it communicates using a network monitoring tool. Such data may be encrypted, but the (data) size and timing of requests made by the client software should be consistent and reasonable. This is a technical task, but a bit easier than learning how the code works. Just remember, sniffing outside of your own network may be considered illegal.
Finally, there are many Linux opponents that would jump at the chance to expose real security weaknesses in Linux and its applications. These are often vendors of competing software and have both the money and channels to make themselves heard. When such a claim appears on the Web, look for specific details about the vulnerability. If there are none, it may be FUD. Also, check the software website to see if the vulnerability has been acknowledged or refuted as well as any status on its repair. Never take such claims at face value.
Here are a few tips to help protect yourself.
Any security expert worth his salt will tell you that physical security is the most important aspect of system security. If physical access to a computer is available, then it is usually just a matter of time before the system is compromised, regardless of operating system. Obviously, the probability of such breaches skyrockets for laptop users, especially when so few (based on my own observations) choose to utilize even the most primitive of safeguards, cable locks. Also, I’ve not seen any major headlines on this so far, but Live CDs, as wonderfully useful as they can be, are enormous security threats when physical access is available. This is because most Live CDs provide superuser access to a system and all of its devices. It is best to keep computers under lock & key whenever possible.
One of my friends from university used to work in an engineering lab on campus. He had set up a Linux box on the network, with full consent of the administrators of course. But one of the permanent staff members approached him one day, asking how he managed to cloak his machine from the nightly SATAN network scans. The answer was simple: he turned the machine off before he left each day! Turning a machine off, or at least disconnecting it from the Internet when not in use, deprives a would-be attacker of the time needed to succeed with a brute-force attack.
And, as always, be careful what you download. There is always a chance that someone will write spyware or malware for Linux. Stick with applications that have large communities and good reputations if you can. Search the Internet for evidence that an app may not be secure before downloading it. To quote the Gipper, “trust, but verify”.
In this installment of my series on the Top 10 Linux FUD patterns, I address two patterns that have more to do with software packages that run on the Linux platform than with the Linux OS itself. As I stated in a previous post, every believable piece of FUD has some element of truth behind it, and these two are no exception.
Linux FUD Pattern #3: With Linux, you cannot access old files or share new files with others
Do you remember the Word Processor Wars? For those who don’t, a conflict began in the early 1990s over which word processing application was the “best”, the most feature-rich, and the one most likely to dominate the market. It was a fight to the death. Though this was largely a war over functionality, the decisive battles were often fought over file formats. Why? Because the ability to understand and use a competitor’s file format has certain advantages. First, almost any hot new function can be replicated, because a sample of data speaks volumes about the processes that created it. Second, the ability to open and use the other format eases the transition away from the other product. Proprietary file formats became the weapon of choice, and the strategy, to lock as many users into them as possible. The mentality that whoever controls the data controls the world solidified.
Except for the occasional Is-Linux-Right-For-You? article in the trades, this pattern does not manifest in print very often. It is more likely a topic hotly debated between OS zealots. Most often, I have been personally presented this tasty FUD cake by folks who have no experience with Linux or its applications, who think (through no fault of their own) that we *nix users type all of our letters and papers at the command line. I raise the fact that OpenOffice can not only open many other document formats but can also, *cough*, natively export PDF files, and suddenly the eighth-grade-level trash-speak subsides.
The file exchange problem is not a myth; indeed, it is a very important issue. Moreover, for anyone who’s been watching the OOXML vs ODF standoff, it should be clear that the Open Source community is very much in favor of a set of open documentation standards as well. Whether or not it used to be, cross-platform file sharing is just not a problem with today’s desktop environments, and it is becoming less of one.
If this is a serious concern, my advice is to either save your files in a highly-interchangeable format from the start or have an exit strategy that entails migration to another app later. HTML is one option, but only if maintaining strict page layout is of little importance. I have had better luck with the Rich Text Format (RTF); granted, this is not an open format, but it is highly portable and since it is ASCII-based mark up, I can read it with a text editor in a pinch. Also, I tend to save copies of documents in Adobe’s Portable Document Format (PDF), not just because it is portable, but because it looks more professional when sending documents to others. When I upgraded to a new machine and installed Linux for full-time use, I had to convert all of my AutoCAD Drawing (DWG) files to the Drawing Interchange Format (DXF) for use with QcaD. Between that and converting all of the Works 2.0 documents to RTF, I spent many hours executing an exit plan that could have been avoided altogether – lesson learned.
Linux FUD Pattern #4: There are no good software titles for Linux
Looking back, the title of this pattern should have been “There are no popular software titles for Linux”, but in my haste, I typed the word “good” instead of “popular”. This gives the impression that this pattern addresses the quality of Linux software, an issue to be covered later under pattern #6. My apologies.
Nonetheless, this statement – as related to the popularity of software titles – is a highly relative one. OpenOffice and Firefox are wildly popular amongst Linux users. They are bundled with nearly every major distribution and receive a lot of press. They are also available for other platforms, and though they do not dominate these market segments, they seem to be gaining popularity.
The measurement used to determine popularity is an important factor underlying this statement. Is popularity based on customer registrations? Sales? How about the rack space devoted to software at the store? All of these metrics are biased toward commercial software and against free software. Considering the number of try-before-you-buy commercial apps available, download-counting may be a tempting metric to use, but it is biased in the opposite direction and doesn’t consider anomalies such as multiple installations or the ultimate rejection of the product by users. An unbiased consumer survey may be the only way to truly determine popularity. If anyone has actually accomplished this, please share.
Another important question is, does popularity really matter? There is a link between popularity and the fear that an app will eventually lose support, but that risk can be mitigated with a good exit strategy as discussed above. That fear is the target of this FUD pattern. Also, in my opinion, computing is not a popularity contest. If a software application meets my needs and the outlook for support is favorable, then I don’t care if everyone uses it or not. Sometimes, form is more important than function and sometimes it is not, but choosing an app solely because “everyone else is using it” is rarely an acceptable strategy.
The obvious exception is high-end computer games. Computer games in general have created a special culture, and each game has a following, large or small. Games are not about functionality and meeting requirements, but about being part of the culture…shared experiences are part of the entertainment. Admittedly, there are few “big names” producing or porting popular game titles to Linux, a trend that will continue until the gaming market demands otherwise. The desire to be a member of that culture can certainly be enough to dictate which OS to use some or all of the time. Hopefully, things will change.
Finally, the statement is much too general. While it may be true in respect to high-end games, the supposed lack of software is often exaggerated to include all possible uses of the OS, creating yet more FUD.
Linux FUD Pattern #2: Linux is not “officially” supported
When you hear the phrase “official support,” what comes to mind? Informative user manuals? A well-staffed call center? But what makes it “official”? This is the second item on my Top 10 List of Linux FUD patterns: the lack of “official” Linux support. The goal of FUD based on this notion is a mixture of fear and uncertainty, to make you believe that using Linux means having no place to turn when a problem occurs.
Generally speaking, “official support” for a product is provided by the entity that owns the intellectual property for the product and/or has the right to produce and distribute it. Products are typically sold or leased, both of which are types of business transactions; this implies that the entity in question is operating as a business. A third-party provider paid to support a product may be licensed by, or otherwise affiliated with the original vendor, but only the vendor’s fixes and upgrades are “officially” supported. “Official” support connotes a certain level of authority or expertise, but also implies consequences, usually legal or fiscal, for a failure to meet service expectations. This is the model used by businesses today.
Linux, however, is not a business-supported product (per se). Linux is not “owned” by a particular entity, nor does one particular entity retain the exclusive right to update and distribute it. It is licensed under the GNU General Public License (GPL), which permits any software recipient to modify and distribute the software or derivatives thereof as long as the conditions of the GPL are not violated. This is coupled with the open source philosophy, but they aren’t exactly the same thing – an open source application may be licensed under something other than the GPL.
So then, who does “officially” support Linux? The answer is that Linux has always been a grassroots movement. Though it was originally created by one man, Linux is “officially” maintained by a community made up of individuals, groups, and yes, businesses. Different groups within the community support different parts of the system. These groups are commonly known as “maintainers” and usually include original authors or those to whom the torch of authority has been successively passed. For example, assuming the Wikipedia article on the Linux kernel is not out-of-date, Mr. Torvalds still supervises changes to the core of Linux and has delegated the maintenance of older releases to other individual maintainers. The parts maintained are typically called “projects”. Various entities, such as Ubuntu and Red Hat, bundle various system parts together as a unit and ensure that their respective distributions operate as expected, that is to say, that they operate well.
While maintainer and/or community support for a Linux distribution or a particular project may be “official”, technical assistance may not be readily available, on demand, free of charge, or for that matter, available at all. Most maintainers are polite and willing to help, but please remember that much of Linux has been contributed by developers and that support offered pro bono publico doesn’t help feed the family or pay the mortgage. This is where the rest of the community helps out, in the form of online support forums.
Paid support is available as part of the commercial offerings made by Red Hat, Novell, Linspire and others. Additionally, some of these companies offer professional services, such as consulting and training, though these services are typically meant for consumption by businesses, not home users. Any company offering fee-based technical support for Linux is free to set their own price, whatever the market will bear.
In an increasingly tech-savvy world, I think the difference between commercial and community-based support is rapidly decreasing. Consider the available courses of action that may be taken when a problem does occur with a commercial OS. Almost always, the first step is to search the Internet for a root cause, if not a full-blown resolution. This is often done as a cost-saving measure (easy fix) or so that the user/administrator can better explain the problem to tech support when a call is eventually made. Moreover, help may be actively sought in a multitude of discussion groups, mailing lists, blogs, chat rooms and other forums dedicated to supporting various operating systems. Another option is to consult with a friend or relative that knows about these sorts of things. Of course, the “official” vendor or (gasp) a consultant can be called upon, usually for a fee of course. At the discretion of the user/administrator, the problem may be eliminated by brute force: reinstalling the OS. (Actually, this last option isn’t all bad as long as no data were lost – it provides an opportunity to “clean house” and possibly upgrade to a newer release or move to a different distribution.) The order of preference for these alternatives depends on the facts and circumstances surrounding the problem, but they almost always rank from the least- to the most-expensive in terms of time, effort and cash outlay.
Hardware support (or lack thereof) often appears as diversionary FUD regarding “official” support. Hardware must be able to communicate with the computer at several levels, starting with the physical. For example, a USB device can be attached to any machine with the appropriate port, but to use the device the OS must know how to communicate with both the USB itself and the device on the other side. Obviously, this issue quickly boils down to device drivers and brings us back to a discussion of “official” software support.
Rest assured, common devices such as keyboards, mice and thumb drives almost always work using standard Linux drivers. In other words, they don’t support Linux; rather, Linux supports them. Newer device classes for which no “official” Linux drivers are provided often suffer a period of incompatibility or reduced usefulness. For example, Wi-Fi network interface cards are now going through the same sort of transition that consumer-class Ethernet cards did six or eight years ago. Many times, this is because drivers have to be derived from messages sent to and from the devices, often requiring many hours of experimentation. A general rule of thumb: hardware compatibility problems become more common as the hardware becomes more exotic. For example, I experienced new levels of frustration with the big-name vendor of a certain USB-ready programmable television remote control for which future Linux support was promised and never delivered. But, the fact is, hardware vendors have the right to choose whether to support Linux or not, a decision based on supply and demand. The need to operate specific hardware may dictate which OS is used.
The best advice I can give is to ignore the FUD and adopt a pragmatic approach to defining your support needs. Your needs are specific to you. Compile a scorecard and do some research. Questions that should be answered include the following. What is your level of expertise with computers? Have you needed professional OS support in the past? Do you expect to need it in the future? Are you comfortable doing your own support work? Based on community-supplied information? Is your hardware “officially” supported or listed in one of the various compatibility lists? Do you use exotic hardware components? Have you tried running a Linux Live CD, especially Knoppix? When buying a new PC or laptop, have other users posted their experiences with the same model? Research never hurts, but just be on the lookout for more FUD!
The #1 item on my Top 10 List of Linux FUD Patterns concerns its learning curve. This pattern is probably the most prevalent and primarily appeals to fear by attempting to convince you that Linux is too hard for the average person to use or that it is simply not user friendly. There are many variations of this pattern, from the straightforward “Linux is for geeks” assault to more mature, logical arguments, such as “if Linux can do everything the fill-in-the-blank OS can do, why bother with the hassle of switching?”.
To be honest, as with every convincing piece of FUD, I think this line of reasoning has…or should I say, had…a glimmer of truth behind it. Back in the day, when I was casually messing around with Linux as a hobby, I spent many hours on “administrative” tasks, such as installing Slackware from 30+ floppy disks on old retired hardware and trying to configure the RedHat-bundled Metro-X server for specific video cards and monitors. Looking back, these tasks were difficult enough for a seasoned PC tech like me, let alone for the general public. But today, it’s a different story, especially since Ubuntu makes it so easy.

Nonetheless, web news headlines asking “Is Linux Ready for Prime Time?” still appear frequently. What makes Linux so difficult anyway? A quick look through screenshots and how-tos for modern Linux distributions tells quite a different story, does it not? I believe its close association with Unix is the primary reason.
Unix in general has a “bad” reputation for being a command-line-driven OS. It was written in the late 1960s, and the graphical ‘X’ windowing system was not introduced until the mid 1980s. In contrast, Linux was first released by Linus Torvalds in 1991, and development of the XFree86 windowing system for PCs began about a year later. Therefore, one could argue that Linux had a graphical user interface “from the start”. Moreover, Ubuntu and others have done a great job of reducing the user’s exposure to the system console altogether. The need to log into the system on a character-based screen and manually run ‘startx’ is no more. Of course, you may forgo an X session and boot directly into a prompt if you wish, but that is not the default.
First impressions count too. Despite the availability of X, my first serious exposure to Unix was in university in the mid 1990s and took place, not on something as fancy as a Sun SPARCstation, but on an amber-on-black dumb terminal in the school’s computer lab. To me, Unix came to mean a terminal screen, often accessed via telnet over a dial-up connection with the host computer. It was not until several years later that I discovered X.
Case sensitivity is another classic example. Unix and its kin are case sensitive in practically every respect, and most visibly when saving and opening files. This can be a most obnoxious feature when working from the command line, especially for the occasional user; however, the impact is minimal in today’s point-and-click Linux world. I have heard the concern expressed more than once that having two or more different files in the same directory, each with the same name, differing only in case, would be too confusing. My usual response is in the form of a question: why would a person have so many files named essentially the same thing to begin with? Just because it can be done, doesn’t mean that it should be done.
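The point about case sensitivity is easy to demonstrate. On a case-sensitive filesystem (the Linux default), names that differ only in case are entirely distinct files; on a case-insensitive one, the second open would overwrite the first:

```python
import os
import tempfile

# Two files whose names differ only in case, created in a scratch directory.
d = tempfile.mkdtemp()
with open(os.path.join(d, "Report.txt"), "w") as f:
    f.write("capital R")
with open(os.path.join(d, "report.txt"), "w") as f:
    f.write("lowercase r")

print(sorted(os.listdir(d)))  # ['Report.txt', 'report.txt'] on Linux
```

Whether having both files around is a feature or a trap is exactly the judgment call discussed above; the system allows it, but nothing obliges you to do it.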
Other differences exist, such as installation methods for both the OS and software applications, but I think I’ve made my point: Linux is very much like Unix, but it is not the same OS. Linux was made for the x86 PC platform, though other platforms are supported as well. It was written with the end-user in mind, knowing that the everyday user will demand a slick windowing environment, web browsers with plug-in support, and the like. Contributors to Linux and its applications are everyday users too, you know.
How can these negative perceptions be overcome? The concept that Linux is very similar – but not the same as – Unix is too academic, too logical and would take far too long to adequately communicate to the masses. It just doesn’t make for good marketing.
Nothing, however, beats seeing it in action! Remember what I said about first impressions? Live CDs are very useful weapons against FUD. They allow potential users to test drive the OS, to try before “buying”. This helps prove to some that Linux has come a long way in terms of automatic hardware detection and other features that make it user friendly. It’s also much easier than going to the extent of configuring a dual-boot system. The downside is, they can be a bit slow under certain conditions. If a friend has a Linux system already installed, it may be better to try that out instead.
It is also fortunate that the academic community has shown an interest in Linux. Of course, this stems partially from the never-ending need for schools to save money, but there are also purely-educational reasons for using Linux as well. For example, Linux provides an open platform for programming classes and many math- and science-based applications have been developed. Early exposure to Linux means that kids will “grow up” with it and its “peculiarities”.
Hopefully, this treatise will help you keep an open mind the next time you read an article on how Linux could dominate the market “if only it were easier to use”, or help you form an appropriate response when someone expresses the same sort of sentiment in conversation. Always seek out the reasons used to support these opinions and remember that experience should provide more convincing evidence than the rhetoric of FUD.
The following is my Top 10 list of themes used by anti-Linux FUD campaigns. This list is based on observations made over my years of following the Linux market. The ranking loosely correlates to frequency of usage and is somewhat subjective at best. Understanding each pattern will help you recognize a nicely-prepared piece of FUD when you encounter it. Each will be covered in more depth in subsequent posts (links in the list).
- Linux has a steep learning curve.
- Linux is not “officially” supported.
- With Linux, you cannot access old files or share new files with others.
- There are no good software titles for Linux.
- Linux is not secure.
- Linux is low-quality software.
- Linux software is always behind the curve.
- Linux will void your warranty.
- Microsoft will sue you if you use Linux.
- The total cost of ownership for Linux is too high.
More to come!
My little Linux FUD blog (with a somewhat contradictory name!) has seen an influx of visitors twice within the past month for my “Adventures” series, making Linux FUD one of the Top Blogs and Fastest Growing blogs on WordPress twice (for a while), all thanks to srlinuxx for featuring the series (1, 2) at Tuxmachines.org.
I have to admit, though, that my little blog wouldn’t get as much attention if it weren’t for the fact that I concentrate on the Ubuntu distribution. The name is getting a lot of attention everywhere.
Ubuntu is getting a lot of flak from users at Digg.com, since every time the Ubuntu name is mentioned there, it makes it to the top and the front page. Most of the time, these articles are either useless, repetitive how-tos, or downright inaccurate.
What pisses people off the most is the fact that new Ubuntu Linux users tend to think that what they can do in Ubuntu can ONLY be done in Ubuntu, when in fact, it can be done in any distribution. Just do a search for “Ubuntu” at Digg.com and see what I mean. Understandably, regular Linux users of ALL distributions are tired of it and want to see some good quality news articles on Linux at Digg.com, not just Ubuntu. I’m one of them.