Yeah, that's what I'm guessing. A vid card upgrade and some more RAM and I should be set.
Mind you, every goddamned penny these days is going towards flowers, candles, decorations, tuxedos, catering, etc etc etc for the wedding in October. I probably won't upgrade my rig 'til next Winter. Owned. |
256 MB cards are pretty high end these days, though a 512 MB card will be available this year from both ATI and NVIDIA. The Unreal 3.0 engine, according to Tim Sweeney, will require a card with 1 gig of RAM on it to run at max detail, but that's a couple of years off, so 1 gig will probably be standard by then.
|
Quote:
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with... |
Why does the vid card need its own memory? Why can't it use the computer's memory? oOo: Maybe they should make video card memory upgradable.
|
Upgradable gfx cards would be a good idea these days, but I don't really know how it could be done since the memory chips are soldered to the board. Then there's the question of mixing slower RAM timings on the board and shit; it'd be more hassle than it's worth, I'd say. |
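Just to put some rough numbers on why the card keeps its own soldered memory instead of reaching into the computer's RAM: the onboard chips can be read far faster than anything coming across the slot. A quick back-of-the-envelope sketch in Python; the bus width, memory clock and slot speeds below are rounded, assumed figures typical of a current high-end card, not specs for any particular model.

Code:
# Back-of-the-envelope: local video memory vs. system RAM over the bus.
# Rounded, assumed figures for a 2004-era high-end card; exact clocks and
# bus widths vary by model, so treat these as illustrative only.

def bandwidth_gbs(bus_width_bits, effective_clock_hz):
    # Peak bandwidth in GB/s for a given bus width and effective (DDR) clock.
    return bus_width_bits / 8 * effective_clock_hz / 1e9

# Onboard GDDR3 on a 256-bit bus at ~1 GHz effective:
local_vram = bandwidth_gbs(256, 1.0e9)   # ~32 GB/s

# What the card could pull from system RAM across the slot:
agp_8x = 2.1     # GB/s, AGP 8x peak
pcie_x16 = 4.0   # GB/s per direction, first-generation PCI Express x16 peak

print(f"local VRAM : {local_vram:.1f} GB/s")
print(f"AGP 8x     : {agp_8x:.1f} GB/s")
print(f"PCIe x16   : {pcie_x16:.1f} GB/s")
print(f"onboard memory is roughly {local_vram / agp_8x:.0f}x faster than AGP 8x")

So even ignoring latency, textures sitting in system RAM would be starved for bandwidth, which is why the memory lives on the card, soldered right next to the VPU.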
Quote:
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...
Dual shotgun PCI-E x800xt 512. |
no
|
I really think that's more of a gimmick than anything else. From the article I read it boosts performance by around 40%; if both cards are only doing half the work then why not at least a 90% boost? Obviously the cards were not designed with a version of SLI in mind, so this Alienware thing seems to be more of a hack job that ends up losing performance. |
I wasn't even referring to the Alienware gimmick. I just think that, with what games can do these days, the old-school way of dual-shotgunning graphics cards should return, especially with the new PCI-E interface. Of course, with the astronomical cost of a single card, running two would bankrupt some small countries, let alone even the hardest of the hardcore gamers.
Now, with Alienware's hack job there's a 40% or so increase in performance (I'll take your word for it, seeing how I've seen no stats), and I wonder about bottlenecks in other areas. That, and the fact that it's a hack job and not something implemented from the get-go, means you're naturally not going to see the increase in performance you would expect. Frankly, I think if the major graphics developers were to look into this and implement their own solutions in next-generation lines, you would see the performance. |
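One way to see why two cards don't get anywhere near a 90-100% boost: only the GPU-limited slice of each frame speeds up, while the CPU, driver and synchronisation work stays the same no matter how many cards you add. A little Amdahl's-law-style sketch in Python; the 40% non-GPU share is a made-up, illustrative figure, not a measurement of the Alienware setup.

Code:
# Why doubling the cards doesn't double the framerate: only the GPU-bound
# portion of frame time scales. The 0.40 split below is assumed/illustrative.

def sli_speedup(non_gpu_fraction, num_gpus):
    # Amdahl's-law style speedup when only the GPU share of frame time scales.
    gpu_fraction = 1.0 - non_gpu_fraction
    return 1.0 / (non_gpu_fraction + gpu_fraction / num_gpus)

speedup = sli_speedup(non_gpu_fraction=0.40, num_gpus=2)
print(f"2 cards -> {speedup:.2f}x, i.e. about {(speedup - 1) * 100:.0f}% faster")
# -> 2 cards -> 1.43x, i.e. about 43% faster, the same ballpark as the ~40% figure above

And that's before counting whatever extra overhead a bolted-on implementation adds, which is the "hack job" point above.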
I vote to bring back multiple VPUs on a video card. |
VPU's
|
Well, the last real multi-VPU cards were the Voodoo 5 5500 and the 6000. Personally I'd hate to see NVIDIA attempt a multi-VPU board; their boards are stupidly large as it is, almost on par with the 6000, and they only have the one VPU on them. oOo:
I suppose it's a matter of time before this happens again though, or at least before a company realises SLI is still very useful for people wanting more performance... and for the amount of 3DMark freaks who live to benchmark, they fit into that category. |
The XGI Volari Duo V8 has 2 VPUs and it sucks.
|