And can someone start writing bad stuff for Mac? I'm tired of the gloating from a friend of mine....
They do. In fact there are several spam mailer viruses that are rather rampant - but nobody ever notices because Macs Can't Get Viruses So I Shouldn't Have Any Scanners.
Apple also has an INCREDIBLY TERRIBLE track record with fixing security flaws. They'll be notified of a severe flaw, typically take about a year to respond AT ALL, and then take six months beyond that to ship a fix in the next service update - or, oftentimes, they'll roll the patch into the next version of OSX, and people on previous versions never get it. This is in stark contrast to Microsoft, who typically issues an emergency patch within hours of notification, a download-on-demand patch within a few days, and rolls it into the next Patch Tuesday, which is never more than about a month out - and in really severe cases they'll push it out automatically, immediately. And when they fix it, they fix it for every affected version that's still officially supported, and often several that aren't.
*Rant follows*
Apple, line for line, actually has MORE security flaws than Microsoft - even though Windows operating systems are a complicated clusterfuck of hacks and patches to fix the mistakes of every software developer ever to write a program for Windows. If someone wrote and distributed a piece of software that depends on a certain undocumented internal piece of the OS behaving in such and such a way, Microsoft *WILL* find out about it, and when that part of the OS has to be changed - to accommodate new features, fix a bug, or just make it better - they *WILL* rewrite it to check for that program and make it see what it's expecting, when it expects it. That's why Windows is a huge, convoluted mess of code that exists to fix other people's problems.
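To make that concrete, here's a toy sketch of how a compatibility shim works in principle. Every name in it is invented for illustration - the real Windows AppCompat machinery is vastly more elaborate - but the trick is the same: check who's calling, and if it's a known-bad program, keep lying to it the way it expects.

    /* Hypothetical compat-shim sketch in C - all names invented for
     * illustration, not real Windows APIs. */
    #include <stdio.h>
    #include <string.h>

    /* The old, buggy behavior some shipped program came to depend on. */
    static int legacy_buffer_size(void) { return 260; }
    /* The corrected behavior every well-behaved caller should get. */
    static int modern_buffer_size(void) { return 32767; }

    /* The OS entry point: before answering honestly, check whether the
     * caller is a known-bad program, and if so, lie the way it expects. */
    static int os_get_buffer_size(const char *calling_exe)
    {
        /* Table of programs known to break if given the honest answer. */
        static const char *known_bad[] = { "OLDAPP.EXE", "CRUSTY95.EXE" };
        for (size_t i = 0; i < sizeof known_bad / sizeof *known_bad; i++)
            if (strcmp(calling_exe, known_bad[i]) == 0)
                return legacy_buffer_size();  /* shimmed: old behavior */
        return modern_buffer_size();          /* everyone else: the fix */
    }

    int main(void)
    {
        printf("OLDAPP.EXE sees %d\n", os_get_buffer_size("OLDAPP.EXE"));
        printf("NEWAPP.EXE sees %d\n", os_get_buffer_size("NEWAPP.EXE"));
        return 0;
    }

Multiply that check-for-a-specific-program pattern by a few thousand and you get an idea of where the bulk comes from.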
For example, in DirectX, there's a function call that Windows makes to the video driver that says "Hey, do you support such and such feature?"
Some lazy-assed programmer, at some point in the distant past, was writing a driver for a video card that supported every single thing DirectX supported at the time. Assuming DirectX would never support anything new, he wrote that function to answer "Yes, I support that" in every case. Now, every time DirectX asks a video card that question, it has to ask about a fake feature first, to determine whether it can trust the answers it gets back. This same kind of situation plays out MILLIONS of times throughout the Windows codebase.
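Here's a toy sketch of that liar-detector probe. The names are invented and the real DirectX caps interface looks nothing like this, but the trick is the same: ask about a feature that doesn't exist, and if the driver says yes, throw out everything else it told you.

    /* Hypothetical caps-query sketch in C - invented names, not the
     * actual DirectX interface. */
    #include <stdbool.h>
    #include <stdio.h>

    enum feature {
        FEATURE_HW_TNL,
        FEATURE_CUBE_MAPS,
        FEATURE_NONEXISTENT  /* deliberately fake: no card has this */
    };

    /* The lazy driver from the story: answers "yes" to everything. */
    static bool lazy_driver_supports(enum feature f) { (void)f; return true; }

    /* Probe with the fake feature first; a "yes" means the driver lies,
     * so none of its other answers can be trusted. */
    static bool query_feature(bool (*supports)(enum feature), enum feature f)
    {
        if (supports(FEATURE_NONEXISTENT))
            return false;  /* driver flunked the honesty check */
        return supports(f);
    }

    int main(void)
    {
        printf("cube maps? %s\n",
               query_feature(lazy_driver_supports, FEATURE_CUBE_MAPS)
                   ? "yes" : "no (driver flunked the honesty probe)");
        return 0;
    }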
This policy of fixing other people's mistakes may seem counterintuitive - Apple certainly doesn't do it - but the fact of the matter is that customers are whiny little bitches: if they upgrade Windows and one of their programs stops working properly, they will blame Windows, despite the fact that 99 times out of 100 the program was doing something ungodly fucking stupid that it shouldn't have been doing in the first place.
Incidentally, the 64-bit version of Windows contains MUCH less of this crap, because it can no longer execute 16-bit code - and the 16-bit era, with its tendency toward raw C and far fewer actually documented APIs, is what fostered a lot of these practices in the first place.
Apple, by contrast, simply breaks other people's programs with updates. You know how the first step in every guide to upgrading your OSX version is "update EVERYTHING and uninstall anything known to interface with the OS at a low level"? That's why.