ZeDestructor
I've got an XPS that's still running, but "running" is a strong word for it...
I think a lot of people at Google would disagree with you calling them hipsters.
That is true: if you are coding for Windows, it makes little sense to develop on OS X. Linux blows as a desktop OS though. I have it on my work machine (we are not allowed to run Windoze) and I can't even get the mother-effing Citrix Receiver to work on it, despite using a package straight from Citrix... WebEx also bitches about Java not being installed (spoiler alert: it is), and hell, even the TeamViewer client I'm running on it is packaged with Wine built in...
I would much rather run OS X or Windows with a Linux VM, simply because Linux is just too goddamn limited when it comes to normal desktop usage, and I'm not even talking about purely consumer stuff like YouTube (which actually works fine). I will take OS X over Windows any day though, because regardless of how outdated the shell is (you can just install a newer one from GitHub or MacPorts), it's still actual UNIX underneath, with all the functionality that comes with it. This also means that just about any software made for Linux will actually run on OS X; that's where projects like MacPorts and GitHub come in. Mind you, if you've ever used BSD, you'll see that OS X doesn't hide functionality as much as you think; it just goes about things in a more BSD way, which is annoying for anyone used to Linux, myself included.
My Dells are running flawlessly: I sold my Optiplex 745 to a friend, who uses it 24/7 running ESXi as a NAS and for other stuff. My Optiplex 980 also runs ESXi, but its primary purpose is running a pfSense machine rather than a NAS. Well, I mean, it will be... I'm testing some RAM atm...
Actually, Google thoroughly doesn't care: there is absolutely no code on your local machine, EVER. End of story. Plus, at the scale Google runs at, you need to test _EVERYTHING_ on clusters. Typical shitty test runs are on the 1,000-machine scale at minimum, and with all their server-side stuff running Linux, you can guess what remote environment they use. That is also why using a Chromebook for serious coding is entirely feasible at Google, of all places.
Oh, and in order to manage Macs, Google had to build most of their own management tools to deploy and update software. Turns out remote desktop isn't enough when you have 10k+ end-user machines.
As for your Citrix woes, all the blame can be placed squarely at the feet of Citrix. At work, just about all the devs run Arch Linux, because that's where you can get all the bleeding-edge software with minimal faffing around with self-compilation or constantly adding repos. I myself run Arch on my desktop and two laptops, and it works fantastically well.
As for Windows: if you do any form of embedded or hardware development, you'll find that life is just easier on Windows, because most tools are built for Windows rather than Linux. OS X has just about nothing in that field compared to Linux, let alone Windows.
Oh, and to top it all off, the OS X desktop environment/window manager just sucks compared with the stuff we have on Linux.
You're so cool, with your "real" computer-y work and all.
As someone who just went through Computer Architecture (how to design CPUs) and Extended Operating Systems (how to build operating systems, funnily enough), I feel I have the right to make that statement.
To get into real computers you need a 3270 terminal, not some fancy SSH mumbo-jumbo.
I use the keyboard from a 3270 variant... does that count?