Every few years I take a look at Linux (I'm due another look soon), but to date I've always felt that it lacked a lot of the basic functionality I take for granted in Windows, and backed away from it.
Where Windows has succeeded may historically have been due to dubious business practices, but since the NT kernel-based versions went mainstream there has also been an element of "worse is better" about it. The harsh truth is that Windows simply stopped sucking and became good enough for most serious tasks a long time ago.
An unfortunate habit of many in Unix-land is to pick a baseline year, decide for themselves that absolutely nothing has changed since then, and carry on as if that were true. In this case it's sometime around 1998/99. So much has happened in Windows' evolution and development since then: even a Windows 2000 box is easily capable of uptimes of five years or more (in practice Windows updates mean that will never happen, but I've personally seen many such boxes hit that mark in reasonably controlled/sealed environments). The old myth that "Windows crashes every couple of days" is blatant horsesh-t.
One other unfortunate thing about Unix-land is its tendency to rip itself apart with infighting. Historically this manifested in endianness wars and editor wars; more recently in Gnome vs. KDE, distro wars, and so on (you also see it in other technologies derived from this heritage, e.g. the evolution of many OpenGL extensions). In Windows culture you tend to get one way of doing things; you may not like it, but it's consistent for everyone, and you just get on with getting stuff done.
Ultimately it's not the OS, it's what you do with it that matters.