Alliedassault

Alliedassault (alliedassault.us/index.php)
-   Offtopic (alliedassault.us/forumdisplay.php?f=13)
-   -   Half-life 2: Counterstrike...Holy shit looks l33t (alliedassault.us/showthread.php?t=36999)

Infection_Smith@ 05-27-2004 06:51 AM

Half-life 2: Counterstrike...Holy shit looks l33t
 
http://www.gametrailers.com/gt_vault/t_ ... 4_cam.html


oOo: oOo: oOo: ed: ed: ed:

Fireal 05-27-2004 06:57 AM

Holy fuck!

The way that water splashes up is sweet, really sets a mood in the environment

KTOG 05-27-2004 06:59 AM

gerv posted this a while back, but it's still leet.

Only probs IMO are the nade animation and not enough environment effects. What's up with the guy not using a crosshair either?

Da_Bian 05-27-2004 07:12 AM

whats up with the left-handed character?

Zoner 05-27-2004 07:13 AM

I was thinking the same thing. Having never played the original CounterStrike (which had left-handed characters too, no?), I'd have a bitch of a time getting used to that.

KTOG 05-27-2004 07:18 AM

You can switch left or right handed in original CS

Zoner 05-27-2004 07:21 AM

Ah.

See? I told you I never played it. biggrin:

Swill 05-27-2004 07:25 AM

hmmm
 
Yeah zoner, "Counter-Strike" gets addicting.

You should try it... but all in all the new one looks really demanding, graphics requirements-wise.

Gonzo 05-27-2004 07:40 AM

that looks sick rock: , cant wait for that

HeadUp 05-27-2004 07:45 AM

Re: hmmm
 
Quote:

Originally Posted by Swill1496
Yeah zoner, "Counter-Strike" gets addicting.

You should try it... but all in all the new one looks really demanding, graphics requirements-wise.

which l00ks like something you'll have to worry about

Zoner 05-27-2004 07:49 AM

It's a foregone conclusion that I'm going to have to ditch the 64MB vid card soon. Too many kickass games on the horizon that would make my MX440 their bitch.

Tystnad 05-27-2004 07:51 AM

Quote:

Originally Posted by Zoner
It's a foregone conclusion that I'm going to have to ditch the 64MB vid card soon. Too many kickass games on the horizon that would make my MX440 their bitch.

I guess I should upgrade from my AMD 1 GHz CPU as well then.
And maybe change my 9200 128 MB gfx card.. biggrin:

Zoner 05-27-2004 07:53 AM

I'm assuming that HL2 and Doom3 et al won't choke up my AMD 1.73GHz processor too much.

I can't afford a total overhaul.

Infection_Smith@ 05-27-2004 07:54 AM

Quote:

Originally Posted by Zoner
It's a foregone conclusion that I'm going to have to ditch the 64MB vid card soon. Too many kickass games on the horizon that would make my MX440 their bitch.

we have another member in the MX440 club ed:
Damn, I thought I was the only poor guy with this crappy card.

Mr.Buttocks 05-27-2004 07:54 AM

Quote:

Originally Posted by Zoner
I'm assuming that HL2 and Doom3 et al won't choke up my AMD 1.73GHz processor too much.

I can't afford a total overhaul.

You should be able to run both on that CPU no problem.

Zoner 05-27-2004 07:58 AM

Yeah, that's what I'm guessing. A vid card upgrade and some more RAM and I should be set.

Mind you, every goddamned penny these days is going towards flowers, candles, decorations, tuxedos, catering, etc etc etc for the wedding in October. I probably won't upgrade my rig 'til next Winter.

Owned.

geRV 05-27-2004 10:54 AM

256 MB cards are pretty high end these days, though a 512 MB card will be available this year from both ATI and Nvidia. The Unreal 3.0 engine, according to Tim Sweeney, will require a card with 1 gig of RAM on it to run at max details, but that's a couple of years off so 1 gig will probably be standard by then.

SoLiDUS 05-27-2004 03:29 PM

Quote:

Originally Posted by Gerard
256 MB cards are pretty high end these days, though a 512 MB card will be available this year from both ATI and Nvidia. The Unreal 3.0 engine, according to Tim Sweeney, will require a card with 1 gig of RAM on it to run at max details, but that's a couple of years off so 1 gig will probably be standard by then.

Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...

intrestedviewer 05-27-2004 03:51 PM

Why does the vid card need its own memory? Why can't it use the computer memory oOo: maybe they should make video card memory upgradable.

Akuma 05-27-2004 03:52 PM

Quote:

Originally Posted by intrestedviewer
Why does the vid card need its own memory? Why can't it use the computer memory oOo: maybe they should make video card memory upgradable.

They tried that a long time ago. My old S3 video card had upgradable RAM.

geRV 05-27-2004 03:57 PM

Quote:

Originally Posted by intrestedviewer
Why does the vid card need its own memory? Why can't it use the computer memory oOo: maybe they should make video card memory upgradable.

It can use the PC memory, but since it's not attached to the card there's a lag involved in the gfx card talking to it over the bus, and most likely the game will be using a load of that memory anyway. Would make for choppy gameplay.

Upgradable gfx cards would be a good idea these days, don't really know how it could be done though since the memory chips are soldered to the board. Then there's the question of mixing slower RAM timings on the board and shit, be more hassle than it's worth I'd say.
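For intuition on the lag geRV describes: even ignoring latency, the raw bandwidth gap between on-card memory and system RAM reached over the AGP bus of the day is enormous. A minimal back-of-the-envelope sketch in Python, using ballpark bandwidth figures that are illustrative assumptions, not measured specs:

```python
# Rough comparison of moving one frame's worth of texture data from
# local VRAM versus from system RAM over a 2004-era AGP 8x bus.
# All numbers below are illustrative assumptions.

def transfer_ms(megabytes, bandwidth_gb_s):
    """Milliseconds to move `megabytes` at `bandwidth_gb_s` GB/s."""
    return megabytes / (bandwidth_gb_s * 1024) * 1000

frame_data_mb = 64  # textures + buffers touched per frame (assumption)

local_vram = transfer_ms(frame_data_mb, 20.0)  # on-card memory, ~20 GB/s
system_ram = transfer_ms(frame_data_mb, 2.1)   # AGP 8x peak, ~2.1 GB/s

print(f"local VRAM: {local_vram:.1f} ms/frame")  # → 3.1 ms/frame
print(f"system RAM: {system_ram:.1f} ms/frame")  # → 29.8 ms/frame
```

At ~30 ms just for memory traffic, the bus alone would cap the game around 30 fps before the GPU drew a single pixel, which is roughly the "choppy gameplay" the post predicts.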

Miscguy 05-27-2004 03:57 PM

Quote:

Originally Posted by SoLiDUS
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...

Dual shotgun PCIX x800xt 512.

Judas 05-27-2004 03:59 PM

no

geRV 05-27-2004 04:00 PM

Quote:

Originally Posted by Miscguy
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...

Dual shotgun PCIX x800xt 512.

I really think that's more of a gimmick than anything else; from the article I read it boosts performance by around 40%. If both cards are only doing half the work then why not at least a 90% boost? Obviously the cards were not designed with a version of SLI in mind, so this Alienware thing seems to be more of a hack job that ends up losing performance.
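The ~40%-instead-of-90% gap has a standard explanation: only the slice of frame time that is parallelizable GPU work speeds up with a second card, while CPU work, driver overhead, and inter-card synchronization do not. A quick Amdahl's-law sketch, where the 57% split is an illustrative assumption rather than a measured figure:

```python
# Why doubling GPUs doesn't double framerate: only the parallelizable
# GPU portion of frame time splits across the two cards; the serial
# remainder (CPU, driver, sync) runs at the same speed as before.

def speedup(parallel_fraction, n_gpus):
    """Amdahl's-law speedup for a partially parallelizable workload."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_gpus)

# If ~57% of frame time is GPU work that splits cleanly across two cards:
gain = (speedup(0.57, 2) - 1.0) * 100
print(f"{gain:.0f}% faster")  # → 40% faster
```

So a ~40% real-world boost doesn't necessarily mean the rig is "losing" half the second card; it can simply mean a bit under half the frame time was never GPU-bound in the first place.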

Miscguy 05-27-2004 04:10 PM

I wasn't even referring to the Alienware gimmick. I just think that with what games can do these days, the old school way of dual shotgunning graphics cards should return. Especially with the new PCIX interface. Of course with the astronomical cost of a single card, running 2 would bankrupt some small countries, let alone even the hardest of the hardcore gamers.

Now with Alienware's hack job there's a 40% or so increase in performance (I'll take your word, seeing how I have seen no stats), I wonder about bottlenecks in other areas. That, and the fact it is a hack job and not something implemented from the get go, means you're naturally not going to see the increase in performance you would expect. Frankly I think if major graphics developers were to look into this and implement their own solutions in next generation lines, you would see the performance.

Short Hand 05-27-2004 04:12 PM

I vote to bring back multiple VPU's on a video card.

edit

intrestedviewer 05-27-2004 04:18 PM

VPU's

geRV 05-27-2004 04:26 PM

Well, the last real multi-VPU cards were the Voodoo 5 5500 and the 6000. Personally I'd hate to see Nvidia attempt a multi-VPU board; their boards are stupidly large as they are now, almost on par with the 6k, and they only have the one VPU on them. oOo:

I suppose it's a matter of time before this happens again though, or at least before a company realises SLI is still very useful for people wanting more performance... and for the amount of 3DMark freaks who live to benchmark, they fit into that category.

Akuma 05-27-2004 05:09 PM

The XGI Volari Duo V8 has 2 vpu's and it sucks.

Coleman 05-27-2004 05:28 PM

fire2:

Short Hand 05-27-2004 06:39 PM

Quote:

Originally Posted by Akuma
The XGI Volari Duo V8 has 2 vpu's and it sucks.

The cores are also nowhere near the sophistication of the Nvidia and ATI cores. Plus the engineering is not up to par, the drivers are half-assed, and there's bad RAM on most. That usually = shitty, not the fact it has 2 VPU's.

Plus it didn't score too bad and the price is half decent. I've heard with the new drivers it handles ff better etc.

Akuma 05-28-2004 04:05 AM

Its image quality (IQ) is horrible even with the new drivers, and it costs more than a 9800XT with only half the performance.

This is with the new drivers. Take a look at some of the ingame screenshots.

http://www.hexus.net/content/reviews/re ... 19JRD03NTU



Powered by vBulletin® Version 3.8.12 by ScriptzBin
Copyright ©2000 - 2025, vBulletin Solutions Inc.
© 1998 - 2007 by Rudedog Productions | All trademarks used are properties of their respective owners. All rights reserved.