GeForce??
Is it really a good card, or has it just been hyped?
I tried the SDR version from ASUS and it really sucks.
Which card should I buy at present??
I did only notice a small speed improvement in Q3.
Raptor
I _THINK_ that GeForce2 is the best one at this moment
========================
Game project(s):
www.fiend.cjb.net
========================
Well, T&L is cool, but if games use their own software T&L you won't notice any difference, or very little. T&L accelerates the vertex lighting and the object->world->camera translation/rotation.
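Just to make the above concrete, here is a hedged sketch of the per-vertex work being talked about: the object->world->camera transform (folded into a single matrix here) and simple vertex lighting, i.e. the stuff a software T&L pipeline does on the CPU and that hardware T&L moves onto the card. The struct, function names, and single-directional-light model are illustrative, not from any real engine.

```c
#include <math.h>

typedef struct { float x, y, z; } Vec3;

/* object -> world -> camera transform, folded into one
   row-major 4x4 matrix m (last row assumed 0 0 0 1) */
static Vec3 transform(const float m[16], Vec3 v) {
    Vec3 r;
    r.x = m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3];
    r.y = m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7];
    r.z = m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11];
    return r;
}

/* simple Lambert (N . L) vertex lighting -- the "L" in T&L,
   clamped so faces pointing away from the light go dark */
static float vertex_light(Vec3 normal, Vec3 light_dir) {
    float d = normal.x*light_dir.x + normal.y*light_dir.y
            + normal.z*light_dir.z;
    return d > 0.0f ? d : 0.0f;
}
```

Doing this for every vertex, every frame, is the CPU load that hardware T&L takes off your hands.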
If your games were already running at 60+ fps you probably wouldn't notice a difference. The GeForce is a good item for upping your resolution and bit depth while keeping the fps in the 200+ range (though an SDR card might keep you down around 100). Also, the NVIDIA reference drivers may be faster than ASUS's drivers. The only problem I had with the GeForce is that it runs so hot; right now SmartDoctor has underclocked the card as low as possible and it's still running at 48°C!
If you've got to have a new card, get a 64 MB GeForce2 card. The GeForce2 uses a lot less power, so it has a normal temperature of about 34°C (it barely needs a fan), and get 64 MB because a lot of games out now can use 32 MB like it's nothing. Also, the GeForce and GeForce2 both have hardware support for Windows and X Windows 2D drawing operations, plus OpenGL on both OSes (slightly faster under Linux). NVIDIA also announced a low-cost GeForce-type chipset yesterday; its performance falls between the GeForce and the GeForce2, and it requires very little power.
In my opinion you should not get a new Voodoo card. 3dfx hasn't really advanced their chipset at all in the past 2-3 years, which brings several problems: too much heat, so much power draw that it needs an external power supply, and a lot of physical space taken up inside your computer.
Diamond's Viper II looks like very good hardware, but I keep hearing that the drivers are still not very good (the Viper II has been out for a while; the drivers should be excellent by now).
I'm not too sure about anything else; I usually don't look up stuff about video cards unless I plan on buying one in the near future. Hope this helps a little.
----------
meh
Well Now,
I don't have a GeForce myself, although I am saving for one. But several friends of mine have GeForces and they all swear by them. Of course, there have been complications when used with Athlons; however, I think that's more to do with either the motherboards or power supplies that Athlons use rather than the actual processor.
STVOY
Mega Moh Mine!!
I tend to shy away from 3dfx products because the OpenGL support is absolutely horrendous.
I also hate the fact that you can't run stuff in a window; the OpenGL 3D acceleration only works in fullscreen, which SUCKS for developers: no debugging in VC++. I don't know if it can run D3D/Glide in a window.
(Don't anyone even bother trying to start a huge fight over NVIDIA vs. 3dfx... that fight has been fought; nobody won then and no one will win now.)
quote: is it really a good card or has it been hyped?
Yes, it really is good (assuming the game/app you're running accommodates hardware T&L, which D3D [optionally?] does; even without T&L it is a damn fast card). I've seen fps counts multiply by 5 or 6 (sometimes even more) in high-poly tests when hardware T&L is turned on. I just can't wait for more games to use it!
Edited by - POINT on July 1, 2000 7:38:53 PM
nVidia is ages ahead in terms of features and OpenGL ICD quality, and the GeForce is the high-end model from nVidia...
There's no better consumer card for doing OpenGL development. If you want to do cool demos, it's absolutely the best card around. 3dfx can't even touch the 2½-year-old TNT1 from nVidia when it comes to image quality and feature completeness. It's great to have a card that has all the next-gen features; you can try them out now... a really nice card for demo coding, and 300/333 MHz RAM has enough bandwidth for all the cool stuff...
Tim
--------------------------
www.gamedev.net/hosted/glvelocity
glvelocity.gamedev.net
www.glvelocity.com
With faster processors, T&L will start slowing games down. Very, very soon (like when we get 1.5 GHz Mustangs).
quote: Original post by Esap1
With faster processors, T&L will start slowing games down. Very, very soon (like when we get 1.5 GHz Mustangs).
No way in heaven or hell.
T&L is the most computationally intense operation a non-accelerated 3D game does. Besides, if it ever becomes an issue that the card can't do it fast enough, then divide and conquer: make the GPU and CPU bang out the T&L in parallel.
Having hardware acceleration is not likely to ever slow you down. The time it takes dedicated circuitry to perform such calculations is an order of magnitude less than a general processor; a 350 MHz GPU doing T&L is like a 3500 MHz CPU doing T&L, unless the CPU has an accelerated T&L engine in it. And while the GPU is doing T&L calculations, the CPU is free to do other things... You'd need a 5000 MHz machine for the card to slow you down, and by then it'll be time to buy the new GeForce10.
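The divide-and-conquer idea above can be sketched trivially: hand one slice of a vertex batch to hardware T&L and keep the rest for the CPU. The function name and the fixed ratio are illustrative assumptions; no real driver balances the load this naively.

```c
/* Illustrative sketch of splitting a T&L batch between GPU and CPU.
   gpu_share is the fraction of vertices handed to hardware T&L;
   the remainder would be transformed/lit in software on the CPU
   while the GPU chews through its slice in parallel. */
static void split_tnl_batch(int n_vertices, float gpu_share,
                            int *gpu_count, int *cpu_count) {
    *gpu_count = (int)(n_vertices * gpu_share);
    *cpu_count = n_vertices - *gpu_count;
}
```

The point is only that the work partitions cleanly per vertex, so nothing forces you to pick one processor or the other for the whole frame.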
...
I've used a lot of different cards at work, and I have a TNT and a TNT2 at home (saving those $ for a GeForce... maybe a 3 will be out when I'm ready to buy).
3dfx was an industrial and commercial graphics company once upon a time, and they still have a proprietary philosophy about their equipment. They were one of the first companies to offer a decent and relatively inexpensive 3D accelerator. nVidia competed quite well with the TNT card; before that, with the Savage and whatnot around, 3dfx had a speed edge. Then the TNT2 came out, and it was the best consumer-level card in existence when they released the GeForce. Now the GeForce2 is out! Unlike some other companies *cough Intel* which withhold new technology for higher profit margins, nVidia has been nice enough to dish out some amazing products very rapidly. It looks like they're concentrating on making the cards more power-efficient (hence cooler) right now, which is a very good thing! Laptop GeForces are around the corner.
The only question I have is: what could possibly be next!?
nVidia with the GeForce2 is the best card on the market as far as I can tell. And don't buy a shoddy ASUS video card; get a Creative Labs one... it's not _that_ much more...
hehe, I can't imagine trying to download video card drivers from ASUS; it took me two weeks to get a <100k BIOS update.
3dfx's proprietary T-buffer graphics just don't cut it. (Is the Glide API free?)
Ah, if you do not have an AGP port you may want to consider a Voodoo; that's the only reason I can think of.
- The trade-off between price and quality does not exist in Japan. Rather, the idea that high quality brings on cost reduction is widely accepted. -- Tajima & Matsubara
quote:
Of course there have been complications when used with Athlons. However, I think that's more to do with either the motherboards or power supplies that Athlons use rather than the actual processor.
This is mostly a problem with older versions of AMD's 750 motherboard chipset; the GeForce2 is supposedly more compatible with that chipset (I don't know this for a fact, it was in the nVidia press statement). The VIA KX133 chipset is better for the GeForce, but having a lower-power power supply (sub-250W) can complicate the problem. I had a 750 motherboard (stepping 4) and it was a nightmare (only under Windows, though; Linux did everything fine, what a surprise). Since I switched to the KX133 I've had no problems.
----------
meh