Have you seen Nvidia's new Titan rendering power? (It can do what you see in the video in real time.) So no, not ages away, but it still costs too much for a normal guy to buy.
The Titan gets beaten by Nvidia's newer stuff; it's not designed for rendering like this.
The Titan is for CUDA development. The GTX 980 costs about half the price and beats the Titan in rendering.
So, can either of you guys tell me what card and processor team-up will net me the most frames per second on CS:GO or CSS Zombie Mod? I just really wanna be able to play without **** getting choppy when more than four people start firing simultaneously. Y'know, audio drivers and all that junk.
Well, iron ore at first, then it's refined to reduce the carbon content and make steel. But since iron has to be refined while it's liquid, the transformation from iron to steel means steel starts out as a liquid.
The only thing is, we're still pretty far from having games like this. Rendering large game environments this way in real time is too much for current hardware; there's simply too much going on at once. Fluid physics, "semi-fluid" physics like sand, and fabric physics including hair, coupled with large environments and decent AI, would bog down nearly any consumer build out there, including the average "gaming" PC and certainly consoles.
Devs need to start throwing some of their advertising budget toward talent, and publishers need to encourage innovation, not stagnation.
When you run into a wall, you find a way around that wall, or through it, or make it look like there is no wall.
Has anyone ever tried having fluid physics in water only apply to relevant objects? No one cares if the wave always crashes into the wall with the same animation, as long as the water around my feet reacts realistically.
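Rough sketch of what I mean, in Python. Everything here (`ACTIVE_RADIUS`, `simulate_fluid`, `play_looped_wave`) is a name I made up for illustration, not from any real engine:

```python
ACTIVE_RADIUS = 5.0   # made-up cutoff, in world units

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def simulate_fluid(cell, dt):
    # Stand-in for an expensive real fluid solver step.
    cell["height"] += cell.get("velocity", 0.0) * dt

def play_looped_wave(cell, dt):
    # Stand-in for a cheap canned animation that just loops.
    cell["phase"] = (cell.get("phase", 0.0) + dt) % 2.0

def update_water(cells, dynamic_objects, dt):
    for cell in cells:
        near = any(dist2(cell["pos"], obj["pos"]) < ACTIVE_RADIUS ** 2
                   for obj in dynamic_objects)
        if near:
            simulate_fluid(cell, dt)     # real physics near the player
        else:
            play_looped_wave(cell, dt)   # same wave crash every time, far away

# Example: the cell near the player gets real physics, the far one doesn't.
cells = [{"pos": (0.0, 0.0), "height": 0.0, "velocity": 1.0},
         {"pos": (50.0, 0.0), "height": 0.0}]
update_water(cells, [{"pos": (1.0, 0.0)}], dt=1 / 60)
print(cells)
```

Basically level-of-detail for physics instead of just for geometry: you only pay for the simulation where a player could actually notice it.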
No big companies try this **** anymore. They've gotten fat off of something that started out like this, and they seem content to ride the wave out.
Part of the blame is on console and PC component manufacturers. Game developers aren't going to make a game that will only run decently on hardware 10 years from now; they have to make it with today's hardware in mind. PC part makers like Nvidia and Intel are seeing smaller and smaller performance gains these days. The best they can do right now is lower power consumption. Honestly, they're pretty close to proving Moore's Law wrong. The consoles are self-explanatory: they were 2 years outdated when they were announced and 3 years outdated at release.
Now that doesn't mean innovation can't occur. It just has to come in the form of gameplay rather than visual eye candy. The only problem is, people still buy shoddily made AAA games before they're even released. What incentive do the devs have to make a game great if they already have your money? Or they don't bother optimizing for PC and just slap high spec requirements on it, since the majority of their sales come from console players who are used to the same old **** all the time. "OOOH, gotta preorder CoD Post-Revolutionary Space Warfighter 6 so I can get the PeeWee Peashooter preorder bonus gun." Then the dev has their money, and the game ends up being the same old **** the last game was, just with more bloom and a lower resolution.
The good thing is, the indie game scene is getting more of the limelight now. But this is also a double-edged sword, because frankly, too many indie games are just half-assed pixelated cash grabs. Occasionally, you do get that great game that makes it all worth it.
Anyway, I've rambled on enough about this.
tl;dr
Hardware sucks now, AAA devs don't care, some indie games are pretty good.
Hardware always has been a limitation and always will be. I agree that consoles hold things back with dated hardware, but devs sucking Sony and Microsoft's dick by saying things like "we chose to go with a more cinematic look closer to 24fps" isn't acceptable. Restricting improved graphics on more powerful systems isn't acceptable either. And unfortunately, until either innovative games become more lucrative than AAA (something like a Minecraft-level success) or the AAA scene does so poorly that the average consumer is appalled by the games churned out, it's going to stay that way.
I'm hopeful for the indie scene; just earlier I was talking about how it's growing out of the "2D retro platformer, with a twist" gimmick, and hopefully it turns the industry back toward making good games sell.
Oh, and the mobile market (which I honestly can't even relate to gaming) is fixing for a crash. That market is headed toward the second option, with E.T. levels of content.
Yes, hardware will always be a limitation, but that's no excuse for outdated hardware. Hell, the N64 was more powerful than the average gaming PC for 6 months, before it was finally overtaken by the Voodoo GPU.
I won't even comment on the cinematic frame rate and console parity ********. It rustles my jimmies.
And yeah, I'm honestly hoping for a crash for AAA and mobile games. The Nintendo fanboy inside me also wants Nintendo to be the one to dig the market out of the crash again.
Yeah. That's how CGI movies are done anyway: you just render it, then output to a video file. The only problems you'd run into are 1) being a single person making a feature-length film, and 2) having to render an entire feature-length film on average consumer hardware (have fun waiting while your PC sits at 100% usage).
Large studios like Lucasfilm and Pixar have huge "render farms" dedicated just to rendering their CGI. For instance, Pixar's render farm is around 13,000 CPU cores and insane amounts of RAM, and each frame of Cars 2 took almost 12 hours to render. Rendering isn't like a game, where the card just ***** out an image in real time. Rendering involves much more complicated physics, like you see in the video, and lighting: it calculates the light bouncing off every single surface until the contribution zeros out, and there are millions of surfaces in each frame. **** takes forever.
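To give an idea of why, here's a toy path tracer in Python doing exactly that bounce-until-it-zeros-out loop. The scene (three spheres), the bounce cap, and the sample count are made-up illustration values; a real studio renderer is vastly more complicated:

```python
import math
import random

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):  return mul(a, 1.0 / math.sqrt(dot(a, a)))

# Each sphere: (center, radius, diffuse albedo, emitted light).
SCENE = [
    ((0.0, -100.5, -1.0), 100.0, (0.8, 0.8, 0.8), (0.0, 0.0, 0.0)),  # floor
    ((0.0, 0.0, -1.0), 0.5, (0.7, 0.3, 0.3), (0.0, 0.0, 0.0)),       # red ball
    ((0.0, 3.0, -1.0), 1.0, (0.0, 0.0, 0.0), (4.0, 4.0, 4.0)),       # light
]

def hit_sphere(center, radius, origin, direction):
    # Nearest intersection of a ray with a sphere, or None.
    oc = sub(origin, center)
    b = dot(oc, direction)
    disc = b * b - (dot(oc, oc) - radius * radius)
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def random_unit_vector():
    while True:
        v = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if 0.0 < dot(v, v) <= 1.0:
            return norm(v)

def radiance(origin, direction, depth=0):
    if depth >= 8:                        # contribution has "zeroed out"
        return (0.0, 0.0, 0.0)
    best = None
    for center, radius, albedo, emission in SCENE:
        t = hit_sphere(center, radius, origin, direction)
        if t is not None and (best is None or t < best[0]):
            best = (t, center, albedo, emission)
    if best is None:
        return (0.05, 0.05, 0.08)         # dim sky
    t, center, albedo, emission = best
    point = add(origin, mul(direction, t))
    normal = norm(sub(point, center))
    bounce = norm(add(normal, random_unit_vector()))   # diffuse bounce
    incoming = radiance(point, bounce, depth + 1)
    return (emission[0] + albedo[0] * incoming[0],
            emission[1] + albedo[1] * incoming[1],
            emission[2] + albedo[2] * incoming[2])

# One pixel needs hundreds of these rays averaged together; a film frame
# has millions of pixels, which is why offline frames take hours.
samples = [radiance((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)) for _ in range(200)]
pixel = [sum(s[i] for s in samples) / len(samples) for i in range(3)]
print("estimated color of one pixel:", pixel)
```

For scale: at 24 fps, a 100-minute film is 24 × 60 × 100 = 144,000 frames. At almost 12 hours a frame, you can see why they need thousands of cores grinding on frames in parallel instead of one consumer PC.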
I was gonna try to give you an explanation, but you deleted the comment. If you still want one:
It's a little hard to explain and I'm not an expert, but I'll try.
Movies and games are very different, because in a sense, what you see in a game is being created in front of you. Not from scratch, but a lot of it is still being done as things happen. Things are actually 3D: the lighting actually has to be computed around objects, algorithms have to be run to decide how the physics works, and so on. In a movie, none of that is happening. A movie is a recording, just a series of pictures shown so fast that things look like they're moving. Movies aren't actually "3D" in that sense; no algorithms are run to show you the movie.
Because of this, you can watch a 60 FPS video of someone playing a game even if you couldn't run that game at all.
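If it helps, here's the difference in toy Python form. The names (`display`, `decode_next_frame`, `render`) are stand-ins I made up, not a real API:

```python
import time

def display(frame):
    print(frame)                          # stand-in for putting pixels on screen

def decode_next_frame(i):
    return f"precomputed picture #{i}"    # a decoder hands back finished images

def render(world):
    # Stand-in for the expensive part: turning live 3D state into pixels.
    return f"frame computed from world state at t={world['time']:.3f}s"

# Playing a video: every frame already exists, so the per-frame work is trivial.
for i in range(3):
    display(decode_next_frame(i))
    time.sleep(1 / 60)

# Running a game: every frame has to be simulated and rendered on the spot,
# and if that work takes longer than 1/60 s, the game gets choppy.
world = {"time": 0.0}
for _ in range(3):
    world["time"] += 1 / 60               # advance physics, AI, input handling...
    display(render(world))
```

The video loop barely touches the CPU no matter what game it shows; the game loop has to do all the heavy math before each frame even exists.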
I deleted the comment because I read the one I replied to again and thought I understood my mistake, but your comment makes it clearer. Thank you for the explanation!
Animation major here. Unfortunately, Houdini isn't a game engine; it's a 3D animation toolkit, and Mantra is its rendering engine. My guess is that those first few seconds of particle demonstrations took hours, or more likely days, to render on production-level hardware.
I would guess we are still years away from running stuff like this real time.
I can't wait until clipping stops being a problem. These new engines are getting better and better, but I won't be fully erect until I play a game with absolutely no clipping
Quadro would do nothing for you. You're better off getting whatever the top "gaming" GPU is. Quadros and FirePros are optimized specifically for computing, while "gaming" cards are optimized for real time visuals.
I'm aware they're workstation GPUs. You're probably gonna need every bit of the 24 GB of memory two Quadros have to run an environment made of that stuff.
Er, no. The Titan series is just rendering GPUs rebranded for gaming, which is why they're so expensive. The Titan Z barely benchmarks above the R9 295X2 at just over double the cost, and the 980 benches better than the Titan Z at a third of the cost. The Titans are garbage for gaming unless you wanna flaunt your money by buying inferior hardware. The only thing you really gain is more VRAM, but considering I'm still getting by on 2GB, the 8GB of the 295X2 or the 4GB of the 980 should be plenty.
The Titan cards are pretty good for workstations though, so I'll give them that.