Alliedassault » Offtopic » Half-life 2: Counterstrike...Holy shit looks l33t
(alliedassault.us/showthread.php?t=36999)

Zoner 05-27-2004 07:58 AM

Yeah, that's what I'm guessing. A vid card upgrade and some more RAM and I should be set.

Mind you, every goddamned penny these days is going towards flowers, candles, decorations, tuxedos, catering, etc etc etc for the wedding in October. I probably won't upgrade my rig 'til next Winter.

Owned.

geRV 05-27-2004 10:54 AM

256 MB cards are pretty high end these days, though a 512 MB card will be available this year from both ATI and Nvidia. The Unreal 3.0 engine, according to Tim Sweeney, will require a card with 1 GB of RAM on it to run at max details, but that's a couple of years off, so 1 GB will probably be standard by then.
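
To put some rough numbers behind that 1 GB claim, here's a quick back-of-the-envelope sketch in Python (my own illustrative figures, assuming uncompressed 8-bit RGBA textures; real engines compress, so treat the totals as ballpark):

# Rough texture-memory arithmetic (illustrative, uncompressed 8-bit RGBA).
BYTES_PER_TEXEL = 4

def texture_mb(size):
    base = size * size * BYTES_PER_TEXEL
    return base * 4 / 3 / 2**20   # +1/3 overhead for the mipmap chain

print(texture_mb(1024))  # ~5.3 MB per 1024x1024 texture
print(texture_mb(2048))  # ~21.3 MB per 2048x2048 texture
# A few dozen 2048^2 textures, plus framebuffers and geometry, overflow
# a 256 MB card quickly -- hence the push toward 512 MB and 1 GB.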

SoLiDUS 05-27-2004 03:29 PM

Quote:

Originally Posted by Gerard
256 MB cards are pretty high end these days, though a 512 MB card will be available this year from both ATI and Nvidia. The Unreal 3.0 engine, according to Tim Sweeney, will require a card with 1 GB of RAM on it to run at max details, but that's a couple of years off, so 1 GB will probably be standard by then.

Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512 MB/1 GB video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...

intrestedviewer 05-27-2004 03:51 PM

Why does the vid card need its own memory? Why can't it use the computer's memory? oOo: Maybe they should make video card memory upgradeable.

Akuma 05-27-2004 03:52 PM

Quote:

Originally Posted by intrestedviewer
Why does the vid card need its own memory? Why can't it use the computer's memory? oOo: Maybe they should make video card memory upgradeable.

They tried that a long time ago. My old S3 video card had upgradeable RAM.

geRV 05-27-2004 03:57 PM

Quote:

Originally Posted by intrestedviewer
Why does the vid card need its own memory? Why can't it use the computer's memory? oOo: Maybe they should make video card memory upgradeable.

It can use the PC's memory, but since it's not attached to the card there's lag involved every time the graphics card has to go over the bus to reach it; that, and the game will most likely be using a load of that memory anyway. It would make for choppy gameplay.
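
To put ballpark numbers on that lag (my own era-appropriate estimates, not figures from the thread): an AGP 8x bus peaks around 2.1 GB/s, while the GDDR3 soldered onto a high-end 2004 card pushes roughly 35 GB/s. A quick sketch of what that gap means per frame:

# Ballpark bandwidth comparison with AGP-era numbers (illustrative assumptions).
AGP8X_GBPS = 2.1    # AGP 8x peak, shared with everything else on the bus
GDDR3_GBPS = 35.2   # e.g. a 256-bit bus at 1100 MHz effective: 256/8 * 1.1
FRAME_MB = 64       # hypothetical per-frame texture/geometry traffic

for name, gbps in [("AGP 8x", AGP8X_GBPS), ("local VRAM", GDDR3_GBPS)]:
    ms = FRAME_MB / 1024 / gbps * 1000
    print(f"{name}: {ms:.1f} ms")
# AGP 8x: ~29.8 ms just moving the data (a ~33 fps ceiling before any
# rendering); local VRAM: ~1.8 ms for the same traffic.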

Upgradeable gfx cards would be a good idea these days; I don't really know how it could be done, though, since the memory chips are soldered to the board. Then there's the question of mixing slower RAM timings on the board and shit. It'd be more hassle than it's worth, I'd say.

Miscguy 05-27-2004 03:57 PM

Quote:

Originally Posted by SoLiDUS
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512 MB/1 GB video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...

Dual shotgun PCI Express X800 XT 512.

Judas 05-27-2004 03:59 PM

no

geRV 05-27-2004 04:00 PM

Quote:

Originally Posted by Miscguy
Dual shotgun PCI Express X800 XT 512.

I really think that's more of a gimmick than anything else. From the article I read, it boosts performance by around 40%; if both cards are each only doing half the work, then why not at least a 90% boost? Obviously the cards were not designed with a version of SLI in mind, so this Alienware thing seems to be more of a hack job that ends up losing performance.
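
That 40%-versus-90% intuition is basically Amdahl's law: only the fraction of frame work that actually splits across the two cards gets faster. A quick sketch (the 40% figure is from the article mentioned above; the work-split fraction is my assumption, chosen to match it):

# Amdahl-style estimate: two cards, only fraction p of the frame work splits.
def speedup(p, n=2):
    return 1 / ((1 - p) + p / n)

print(speedup(1.0))   # 2.0x  -- the naive ~100% boost if everything split
print(speedup(0.57))  # ~1.4x -- the reported ~40% gain falls out if only
                      # about 57% of the work actually runs in parallel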

Miscguy 05-27-2004 04:10 PM

I wasn't even referring to the Alienware gimmick. I just think that, with what games can do these days, the old-school way of dual-shotgunning graphics cards should return, especially with the new PCI Express interface. Of course, with the astronomical cost of a single card, running two would bankrupt some small countries, let alone even the hardest of the hardcore gamers.

Now, with Alienware's hack job there's a 40% or so increase in performance (I'll take your word for it, seeing how I've seen no stats), and I wonder about bottlenecks in other areas. That, and the fact that it's a hack job rather than something implemented from the get-go, means you're naturally not going to see the increase in performance you'd expect. Frankly, I think if major graphics developers were to look into this and implement their own solutions in their next-generation lines, you would see the performance.

Short Hand 05-27-2004 04:12 PM

I vote to bring back multiple VPUs on a video card.

edit

intrestedviewer 05-27-2004 04:18 PM

VPUs?

geRV 05-27-2004 04:26 PM

Well, the last real multi-VPU cards were the Voodoo5 5500 and the 6000. Personally I'd hate to see Nvidia attempt a multi-VPU board; their boards are stupidly large as they are now, almost on par with the 6000, and they only have the one VPU on them. oOo:

I suppose it's a matter of time before this happens again, though, or at least before a company realises SLI is still very useful for people wanting more performance... and with the number of 3DMark freaks who live to benchmark, they fit into that category.

Akuma 05-27-2004 05:09 PM

The XGI Volari Duo V8 has two VPUs, and it sucks.

Coleman 05-27-2004 05:28 PM

fire2:

