
SSD for programming?

Started December 29, 2015 05:05 PM
23 comments, last by Ravyne 9 years ago
Also, if I put my project on the SSD, how would it affect the longevity of the drive?

I wouldn't worry too much about that if I were you. I just treated my SSD like my old HDD, with all my projects and even the Windows swap and hibernate files on it. Here's my current report after more than 2 years:

[Screenshot: SSD health report after more than two years of use]

And remember, it doesn't completely die, it just becomes read-only.

In the end, just get both an SSD for fast start times and a massive HDD to have a lot of space, and test if the SSD speeds up your project build time. Oh, and use a tool to check the health estimate after a few months.
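For the health-estimate part, here's roughly how you could script the check with smartctl from smartmontools; the device path and the attribute names below are just examples, since vendors report wear under different names:

    # Minimal sketch: read SSD wear/health attributes via smartctl
    # (smartmontools). Device path and attribute names are examples
    # and differ between vendors and drives.
    import subprocess

    def smart_attributes(device="/dev/sda"):
        # -A prints the vendor-specific SMART attribute table
        out = subprocess.run(
            ["smartctl", "-A", device],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.splitlines():
            # Attribute names commonly related to SSD wear (vendor-specific)
            if any(key in line for key in ("Wear_Leveling", "Total_LBAs_Written",
                                           "Media_Wearout", "Percent_Lifetime")):
                print(line)

    if __name__ == "__main__":
        smart_attributes()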

I tested VS build times between SSD, SSD RAID 0, and HDD RAID 0 and it made no difference. Code builds seem to be heavily CPU bottlenecked.
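If anyone wants to reproduce the comparison, a throwaway script along these lines will do it; the solution paths and MSBuild switches are placeholders for your own setup, and run it a few times so a warm OS file cache doesn't skew the numbers:

    # Rough sketch: time a clean rebuild of the same project on different drives.
    # Solution paths and MSBuild arguments are placeholders.
    import subprocess, time

    def time_rebuild(solution):
        start = time.perf_counter()
        subprocess.run(
            ["msbuild", solution, "/m", "/t:Rebuild", "/p:Configuration=Release"],
            check=True,
        )
        return time.perf_counter() - start

    # Same project checked out once per drive (hypothetical paths)
    for path in [r"C:\ssd\MyGame.sln", r"D:\hdd\MyGame.sln"]:
        print(path, f"{time_rebuild(path):.1f}s")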

SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.


In the end, just get both an SSD for fast start times and a massive HDD to have a lot of space, and test if the SSD speeds up your project build time. Oh, and use a tool to check the health estimate after a few months.

That looks pretty neat for an SSD that has been used for two years; guess I won't really have to worry about it then. I already got an additional huge HDD for all the big files; SSDs alone for space would still be way too expensive. I was still surprised by how little they actually cost now compared to when I last looked into it around 2010. Back then I would have paid ~300 bucks for 120 GB, I believe, and now it's ~170 for 500 GB? That's pretty neat.


This is a false economy.

I pay $100 a year for unlimited cloud backups, and it's saved my ass several times (it's faster to restore a corrupt UE4 map file or uasset from backup than to faff with git, especially if you've only just corrupted it and haven't even committed yet). You really can't rely on manual copying and version control as a proper off-site backup system.

I know, I know, this is my biggest "too lazy to do it even though I know it would actually save me time and trouble in the end" case. I was incredibly lucky over the last ~5 years though, with no HDD/OS failure at all, so that just furthered my laziness. Having backups as my New Year's resolution sounds good though; might as well do it at some point :)


A real backup system doesn't need you to do anything. My laptop gets backed up every hour, with local snapshots available at any time, then offloaded to a long-term first-stage backup whenever I'm on my home network, which is in turn automatically backed up to an offsite server. It's all versioned, so I can go back and retrieve changes over time, down to a given point.
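To give a flavor of how little manual work is involved, the core idea is just a scheduled, timestamped snapshot; a bare-bones sketch is below (all paths are made up, and real tools add deduplication, scheduling, and offsite replication on top):

    # Bare-bones illustration of a versioned snapshot backup.
    # Real tools (Time Machine, CrashPlan, rsync scripts) layer
    # deduplication, scheduling, and offsite replication on this idea.
    import shutil
    from datetime import datetime
    from pathlib import Path

    def snapshot(source, backup_root):
        # Each run creates a new timestamped copy, so older versions
        # stay retrievable instead of being overwritten.
        stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
        dest = Path(backup_root) / stamp
        shutil.copytree(source, dest)
        return dest

    if __name__ == "__main__":
        # Placeholder paths; schedule hourly via cron or Task Scheduler.
        snapshot(r"C:\projects", r"\\nas\backups\projects")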

Huh, I really will be looking into this. I always thought backup systems were quite some work, having to manually run them and wait for them to finish etc... if it's really just set-up-and-go, that raises the chance of my lazy ass finally setting one up... especially with an entirely new/clean OS/filesystem.


I tested VS build times between SSD, SSD RAID 0, and HDD RAID 0 and it made no difference. Code builds seem to be heavily CPU bottlenecked.

Also quite interesting. Going by my own tests, it doesn't seem to scale that well with more cores though; my old 980X with 6 cores actually built almost as fast as my new 4790K with 4 cores (I have multicore compilation enabled). Maybe the new CPU is just that much faster per core that it evens out? Well, I'll see how it goes with precompiled headers. It's not really that bad for me (yet), but I am already using lots of templates etc., so I would take a "free" reduction in compile time any day.

A real backup system doesn't need you to do anything. My laptop gets backed up every hour, with local snapshots available at any time, then offloaded to a long-term first-stage backup whenever I'm on my home network, which is in turn automatically backed up to an offsite server. It's all versioned, so I can go back and retrieve changes over time, down to a given point.

What backup software do you use?

I've set up a NAS, but need to set up a new backup procedure for actually backing up the files to it.

What backup software do you use?


Personally I use CrashPlan; it is fully automated and works well. There are many others just as good, but CrashPlan was one of the few to offer unlimited backup sizes.

I actually have two going. I set up Apple's Time Machine on the laptop, simply because it was there and beyond easy to set up. In addition to that, I'm running some custom scripts that handle secondary backup on my Mac and back up my other machines as well, to local network storage and then to a private offsite server.

Have been thinking about adding in CrashPlan for added redundancy, as I keep hearing nice things about it.

Remember: synced drives (Dropbox, Google Drive, etc.) are NOT BACKUPS. Delete something in one place, on purpose or by accident, and it can get trashed everywhere else.

Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

What backup software do you use?


Personally I use CrashPlan; it is fully automated and works well. There are many others just as good, but CrashPlan was one of the few to offer unlimited backup sizes.


I use CrashPlan as well, and the CrashPlan app itself is actually able to do local NAS backups in tandem with cloud backups. However, it's a "packaged" backup, which is to say you don't get readable files on the NAS directly, but encrypted blocks that need to be unpackaged by the app.


Also quite interesting. Going by my own tests, it doesn't seem to scale that well with more cores though; my old 980X with 6 cores actually built almost as fast as my new 4790K with 4 cores (I have multicore compilation enabled). Maybe the new CPU is just that much faster per core that it evens out? Well, I'll see how it goes with precompiled headers. It's not really that bad for me (yet), but I am already using lots of templates etc., so I would take a "free" reduction in compile time any day.


My hex-core i7-5930K just rips through builds compared to the quad-core chips, though.

SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.

It only makes a big difference if file I/O is the bottleneck. I'm assuming you're using C++? If so, it most likely won't make a huge difference, as most of the time is actually spent compiling. For me it made a huge difference (C#, many small projects referring to each other) because very little time was spent compiling and most of it was spent copying DLLs from one folder to a related project's folder.

I actually have two going. I set up Apple's Time Machine on the laptop, simply because it was there and beyond easy to set up. In addition to that, I'm running some custom scripts that handle secondary backup on my Mac and back up my other machines as well, to local network storage and then to a private offsite server.

Have been thinking about adding in CrashPlan for added redundancy, as I keep hearing nice things about it.

Remember: synced drives (Dropbox, Google Drive, etc.) are NOT BACKUPS. Delete something in one place, on purpose or by accident, and it can get trashed everywhere else.

Add Packrat to Dropbox, however, and it becomes a pretty good backup.

I recall that compilation is best improved with more cores - it makes a huge difference - while linking is RAM-bound? (See the sketch below.)

But you didn't say which languages, and I think that's relevant, as different ones work in different ways.
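From what I remember, that's because translation units compile independently of each other, so compilation parallelizes across cores, while the link is a single pass over every object file and tends to be memory-hungry. A rough sketch of that split (compiler, flags, and paths are all placeholders):

    # Illustration of why compilation scales with cores while linking doesn't:
    # each translation unit compiles independently; the link is one serial step.
    import subprocess
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    def compile_unit(src):
        obj = src.with_suffix(".o")
        # Independent per-file jobs, so these run in parallel.
        subprocess.run(["g++", "-c", str(src), "-o", str(obj)], check=True)
        return obj

    if __name__ == "__main__":
        sources = list(Path("src").glob("*.cpp"))   # placeholder source dir
        with ProcessPoolExecutor() as pool:         # one worker per core by default
            objects = list(pool.map(compile_unit, sources))
        # The link runs once over everything and is typically RAM-bound.
        subprocess.run(["g++", *map(str, objects), "-o", "app"], check=True)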

This topic is closed to new replies.
