Another PC Build
To confirm -- the wattage recommendation on the GPU specs is indeed for the whole system, and typically a fairly high-end system in keeping with the GPU selection. I've had two 6990 GPUs (so 4 total GPUs) running in a system with a 90 W CPU, 8 GB RAM, 2 SSDs, and 4 laptop drives in RAID, all on a quality 900 W PSU. The new Mac Pro has two GPUs, each one 60% larger than your 270X, alongside upwards of a 130 W CPU and tons of memory, and it gets by on only a 500 W PSU (albeit the entire system is engineered together for optimal efficiency).
That said, there's not much to be saved in the 500-800 watt range of PSUs, and 700+ watts is a good cushion to have if you ever want to add a second GPU. Stick to better brands though -- the lesser brands often have poor efficiency, poor power distribution, or poor power conditioning. Building a nice computer with a crappy PSU is a bit like training for a marathon while relying on fast food for all your nutrition -- sure, you won't starve, but you're not building up your body for optimal longevity. Some good manufacturers are Seasonic, CoolerMaster, Corsair, and Antec.
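As a rough sanity check on sizing, here's a minimal back-of-the-envelope sketch -- the TDP figures, the 100 W overhead for drives/fans/board, and the 30% headroom factor are all illustrative assumptions, not vendor guidance:

```python
def recommend_psu(cpu_tdp_w: float, gpu_tdp_w: float,
                  overhead_w: float = 100, headroom: float = 1.3) -> int:
    """Suggest a PSU wattage: estimated peak draw plus ~30% headroom,
    rounded up to the next 50 W step."""
    peak = cpu_tdp_w + gpu_tdp_w + overhead_w
    target = peak * headroom
    return int(-(-target // 50) * 50)  # ceiling to a multiple of 50 W

# e.g. an 84 W i5-4670 plus a roughly 180 W R9 270X-class card
print(recommend_psu(84, 180))  # -> 500
```

That lands in the 500 W class for a single-GPU build, which is why the extra cost of a 650-700 W unit mostly buys future headroom rather than anything the current parts need.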
Also, be aware that you're paying a premium on that Radeon card due to the current litecoin mining craze. I know it's a stated goal to run Radeon, but it's something to be aware of. If this machine is largely for testing as you say, and will sit idle most of the time, you might even want to consider a little casual mining just to offset some of the premium you're paying.
throw table_exception("(╯°□°)╯︵ ┻━┻");
I'd probably ditch the blu-ray (can't remember the last time I used an optical disk) and spend an extra $30 on the cpu and go for an i5-4670K.
AFAIK usually the GPU power recommendation includes everything else in it, because the GPU tends to be the biggest power hog so usually one chooses the PSU based on the GPU.
With little more searching you can probably find how much the GPU itself actually needs, which will probably be around 300 W or so at the peak power use.
ah, i didn't realize that, krom's suggestion makes a lot more sense now. i was always taking the listed power supply figure at face value.
Smaller power supplies don't really save any money, so go big. Nobody's ever regretted having a power supply that is too big. But I don't know who the hell "COUGAR" is and would recommend sticking to one of the established brands. I've been really happy with Corsair and CoolerMaster supplies. (ThermalTake is supposed to be good too but mine literally exploded.) Other than that, I think you're making a mistake building a machine with less than 16 GB (2x8, not 4x4) of memory but everything else looks fine. I also wouldn't use a Vertex 2 drive but whatever.
I'll check out changing the brands; the reviews were pretty solid for this one (although only 7 reviews). In the last build I went Corsair, and I've heard of no problems from it so far. I chose 8 gigs for now since it's a relatively easy upgrade, and at the moment 8 gigs should be sufficient for what I need. I do agree about the Vertex 2 -- IIRC its top speed is for SATA2 and not SATA3, but realistically we're still talking about an SSD; it blows all HDDs out of the water anyway, so it's not a big enough deal for me to warrant a new purchase.
I did not know this; it's unfortunate that I'm buying at a time like this. Although I've never heard of litecoin -- is this a new competitor to bitcoin, or? My electricity is set at a flat rate, so some idle mining might not be a bad idea.
I'd probably ditch the blu-ray (can't remember the last time I used an optical disk) and spend an extra $30 on the cpu and go for an i5-4670K.
I was really thinking of ditching the disk drive, and maybe only picking one up if I ever did actually need one. I could probably steal the one I tossed into my friend's build to install windows, and would likely never use it again.
Now then, I have a question about the motherboard I picked. The GPU will sit in a PCIe 3.0 x16 slot, and the MB has 2 of them, but it has "(CFX x8)" in parentheses, and it also has a PCIe 2.0 slot with a "(x4)" next to it. Can anyone shed some light on what these mean? (I know, Google, but I'm at work atm, and it just dawned on me.)
The numbers inside the brackets are the speed/mode that your PCIe bus will run at, while the numbers outside the brackets are the width (number of lanes) physically available in the slot on your mobo.
So your "PCIe 3.0 x16 (CFX x8)" has 16 physical lanes and will run at x16 unless you use both of them in CrossFireX configuration in which case both will only operate at x8.
Meanwhile your "PCIe 2.0 x16 (x4)" is physically compatible with an x16 card but only runs in x4 mode.
I've updated the psu to a seasonic 650W, removed the blu-ray drive, and changed the cpu to an i5-4670.
Litecoin is one of the alternatives to Bitcoin based on the same principles, but with different parameters and a different proof-of-work function. By overall value and price-per-coin, it's the second most valuable crypto-currency. Radeon cards excel over nVidia cards because they can do a single-instruction bitwise rotate while nVidia cards cannot, and this operation makes up the bulk of the cryptography function in both bitcoin (SHA256) and litecoin (scrypt). Hence, demand for litecoin mining hardware, which has not yet moved to custom ASICs like bitcoin, creates additional demand for recent Radeon cards, driving prices up.
Mining will be a noticeable blip on your electric usage, probably around $40 worth of electricity per month assuming 500 W whole-system power consumption at an electric rate of 12 cents/kWh. I'm not sure how your flat rate works, but it's unlikely whoever is offering it will be happy eating that cost. At current difficulty and value, you'd earn about $60 per month, which isn't a lot of profit for the hassle and power consumption if you're going to sell soon (unless your electric provider really will eat the costs happily), but could pay off if your strategy is buy-and-hold over a longer term. I don't intend to sell any of mine for 4 years.
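The arithmetic behind the $40 figure, as a quick sketch (the $60/month payout is the estimate from above, not something computed here, and real difficulty changes constantly):

```python
def monthly_electric_cost(watts: float, rate_per_kwh: float,
                          hours: float = 24 * 30) -> float:
    """Electricity cost of running a constant load for a 30-day month."""
    return watts / 1000 * hours * rate_per_kwh  # kW * hours * $/kWh

cost = monthly_electric_cost(500, 0.12)  # 500 W rig at 12 cents/kWh
revenue = 60.0                           # assumed monthly mining payout
print(f"cost ${cost:.2f}/mo, net ${revenue - cost:.2f}/mo")
# -> cost $43.20/mo, net $16.80/mo
```

So at retail electric rates the margin is thin; the math only gets interesting if the power is effectively free or you're holding the coins long-term.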
Yep -- and none of that is cause for concern; it's completely typical. One of the benefits of PCIe being a multi-lane serial interface is the ability to re-route signals in this way for added flexibility. There's still plenty of speed to go around: PCIe 3.0 x8 is as fast as PCIe 2.0 x16, and neither is a serious bottleneck even for higher-end GPUs, which have enough onboard memory and processing bandwidth to hold the high-resolution textures that make up the bulk of graphics data transfers.
Most non-GPU PCIe devices are PCIe 1.0 or 2.0 at x1 (e.g. network or wifi adapters); x4 is typical of higher-end RAID adapters or PCIe-based SSDs. Even a GPU would operate happily, if non-optimally, at x4.
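To put numbers on that equivalence, here's a small sketch using the commonly cited approximate per-lane throughputs (250/500/985 MB/s for PCIe 1.0/2.0/3.0 -- the 3.0 jump is nearly 2x per lane because it moved from 8b/10b to the more efficient 128b/130b encoding):

```python
# Approximate usable one-direction bandwidth per lane, in MB/s.
PER_LANE_MBPS = {"1.0": 250, "2.0": 500, "3.0": 985}

def pcie_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction link bandwidth in GB/s."""
    return PER_LANE_MBPS[gen] * lanes / 1000

print(f"PCIe 3.0 x8 : {pcie_bandwidth('3.0', 8):.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 2.0 x16: {pcie_bandwidth('2.0', 16):.1f} GB/s")  # 8.0 GB/s
print(f"PCIe 2.0 x4 : {pcie_bandwidth('2.0', 4):.1f} GB/s")   # 2.0 GB/s
```

Which is why a GPU dropped to 3.0 x8 in CrossFireX loses essentially nothing, and even the x4 slot still moves 2 GB/s.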