Why is Windows so slow?

I’m a fan of Windows, specifically Windows 7. For the most part I like it better than OSX. I have 4 Macs, 3 Windows machines and 3 Linux machines that I access regularly.

But…I work on a relatively large project. Windows is literally an ORDER OF MAGNITUDE slower to check out, to update, and to compile and build than Linux. What gives? I don’t know that this is all the fault of Windows. As far as I know, some of it is the fault of the software I’m using not using Windows in the way that gets maximum speed. You can reproduce this yourself, though. Download the code.

The simplest test is this: On Windows, open a cmd prompt, cd to the src folder, and type

dir /s > c:\list.txt

Run it twice and time how long the second run takes. The reason to time only the second run is to give the OS a chance to cache data in RAM.

Now do the same thing on Linux. Check out the code, cd to src, and run

ls -R > ~/list.txt

In my tests, on 2 identical machines (both HP Z600 workstations with 12 gigs of RAM and eight 3GHz cores), it takes 40 seconds on Windows and 0.5 seconds on Linux. Note I used git to check out the files on both machines, so there are more than 350k files in those folders.
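If you don’t have a Chromium checkout handy, the warm-cache comparison can be sketched on the Linux side with a small synthetic tree (the sizes below are arbitrary stand-ins, not Chromium’s):

```shell
# Build a small synthetic directory tree and time a recursive listing
# twice; the second, warm-cache run is the one that matters.
dir=$(mktemp -d)
for i in $(seq 1 50); do
  mkdir -p "$dir/sub$i"
  for j in $(seq 1 20); do
    : > "$dir/sub$i/file$j.txt"   # create an empty file
  done
done
cd "$dir"
time ls -R > /dev/null   # first run warms the OS's directory cache
time ls -R > /dev/null   # second run measures pure tree traversal
```

Scale the two loop counts up toward Chromium’s ~350k files to make the gap measurable.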

Why is Windows in this case 80x slower than Linux? Is there some registry setting I can use to make it faster?

Similarly, compile the code. Using Visual Studio 2008, follow the instructions here. Select the chrome project and build it. Then edit one file, say src/gpu/command_buffer/client/gles2_implementation.cc. Just change a comment or something, then build again. Try the same on Linux. For me, these incremental builds take about 3 minutes on Windows, most of that time spent in the linker. On Linux it takes 20 seconds. That’s 9x faster. I can install the new ‘gold’ linker and take it down even more: 7 seconds, or 25x faster.

Come on Microsoft, step up your game! Personally I’d much rather program on Windows than Linux (yeah, I know, sacrilege to some). Visual Studio’s debugger is far more productive than gdb or cgdb (maybe there is something better on Linux that I don’t know about). Plus, our users are mostly on Windows, so I’d rather be on Windows and get the same experience they do. GPU drivers are much better on Windows as well, plus there are other apps I use (Photoshop, Maya, 3DSMax) that don’t exist on Linux.

But, I can’t stay on Windows with Linux being so much faster to build. It’s the difference between being totally productive and taking a coffee break every time I change a line and compile.

That’s not all of it either. git is EXTREMELY FAST on Linux, whereas on Windows, not so much. As far as I can tell it’s probably no slower than svn on Windows (I haven’t timed it), but one of the many reasons people switch to git is that it’s so fast on Linux. Again, it’s in the 10x to 100x range of difference between Windows and Linux.
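A repeatable way to put numbers on this (a sketch against a synthetic repo, not Chromium) is to time `git status`, since it stats every file in the work tree, which is exactly the per-file operation that is fast on Linux and slow on Windows:

```shell
# Make a throwaway repo with a couple hundred files and time a
# warm-cache `git status` over it.
dir=$(mktemp -d); cd "$dir"
git init -q
for i in $(seq 1 200); do echo "$i" > "file$i.txt"; done
git add .
git -c user.name=test -c user.email=test@example.com commit -qm 'import'
git status > /dev/null        # first run warms the cache
time git status > /dev/null   # time the warm run
```

Bump the file count up toward 350k to approximate the Chromium case.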

All I can think is that 99% of developers who use Microsoft’s tools are writing Windows-only code. As such they have no way to compare times, and so Microsoft has no incentive to make it better. Except of course they do: if they themselves are using the same tools, then their own developers are losing valuable time waiting for those tools to do their jobs.

Here’s hoping Microsoft will step it up.

PS: Of the 3 OSes, OSX is a mixed bag. git on OSX is slower than on Linux but faster than on Windows. Building on OSX, though, is SSSLLLOOOWWW: nearly 3 times slower than Windows using Xcode.

PPS: Chromium has the option to build as a set of shared libraries instead of one large executable. This helps link times significantly on Windows, but it also helps link times on Linux, so the relative speeds are still the same.
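For reference, the shared-library (“component”) build was, if I remember right, selected through a gyp variable; the exact spelling may differ in newer checkouts, so check the current build instructions:

```shell
# Assumed gyp setting for Chromium's component build; verify the
# exact variable name against the current build docs.
export GYP_DEFINES="component=shared_library"
gclient runhooks   # regenerate the build files
```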

  • Marcusgy

    Here’s another data point for what it’s worth.

    We developed on Windows and the Java project took about 1.5 minutes to clean and build in NetBeans. After switching to Linux that dropped to 15 seconds. This is on the same machine. In this case it was Windows Vista (years ago) and Ubuntu 10.04.

    Always thought it was the better filesystem and better caching by the OS.

  • Grunwald

I had this experience when porting Solaris code to Windows. But since some time constraints had to be met, I had to investigate further. After a while I had a library which timed every I/O call. On a fragmented disk I had file lookups which needed more than 10 seconds (yes! a single file). This was done just to get the file time. Since that is needed by every version-control and rebuild tool, all you can do is defragment often and early.

I suspect NTFS uses a B-tree lookup on fragmented storage, which is not going to work: a B-tree meets its timing constraints only if a node lookup is O(1).
