the highest fps i can get is 64, and it will not go above 64 :S i put everything down to the lowest settings i could think of and all i get is 64 :\ :/
i have a GeForce FX 5600 256MB vid card
here is a list of some things i have tried:
-graphics, video and general options as low as they can go
-full screen & windowed mode with drawdist 50
-reconfigured driver settings
-crowds, spectating
-getting latest drivers
-installing latest Direct X
and still i can't even get it to go to 65fps :S
what is the highest FPS for A-tractor??
um?
i tried turning up the monitor refresh rate but that didn't seem to help any :\ :/
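i'm guessing the game itself has some kind of built-in limiter? just to show what i mean, here's a generic python sketch of a frame cap (purely illustrative, nothing from A-tractor's actual code):

import time

FPS_CAP = 64                      # hypothetical hard limit inside the engine
MIN_FRAME_TIME = 1.0 / FPS_CAP    # ~15.6 ms per frame

def render_frame():
    time.sleep(0.002)             # pretend a frame only takes 2 ms on low settings

frames = 100
start = time.perf_counter()
for _ in range(frames):
    t0 = time.perf_counter()
    render_frame()
    spare = MIN_FRAME_TIME - (time.perf_counter() - t0)
    if spare > 0:
        time.sleep(spare)         # the limiter sleeps away whatever time you saved
print(frames / (time.perf_counter() - start))   # ~64 no matter how low the settings go

if that's what's going on, no amount of turning stuff down will get past it.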
every argument i've ever heard on this stuff always misses some factor or another. i've yet to hear something i'd deem definitive. that and most of the stuff i've ever read that seems about accurate instantly puts me to sleep with how feckin' dry it reads
one tidbit i'd heard, however, that sounded reasonable is the idea that a higher refresh could actually make things appear worse if it was "out of sync" with the fps your card was drawing (screen tearing, or some crazy thing like that).
i don't remember the exact details, it was all terribly over my head, but the general conclusion was that folks should stop high-number whoring and just fiddle around with stuff until they had a result they found pleasing, which i immediately concluded was the best advice i'd ever hear on the topic.
now, obviously if you're getting 5fps or your monitor refresh is 30.... okay, upping that a bit might help....
a few numbers:
movies look good at 24fps, but employ motion blur and afterimage.... along that theory, turning the lights off in your room and having a high contrast, brightish monitor, and upping your screen refresh would in theory make the game look smoother. i like to think "what's good enuf for film", etc, though there's been plenty of geektalk on discerning the difference between 120fps and 121fps... after a while it just sounds so coke/pepsi taste-test.
television runs at a 60Hz refresh rate. that's like bottom of the barrel for most computer monitors. dvds play at 30fps, because afterimage can't be employed as much due to televisions and computer monitors doing line-by-line refresh rather than "fullscreen" refresh as a film screen would. that and the environment is often brighter than a movie theater.
lack of motion blur in a comp game would i guess make you want to shoot for getting your monitor refresh and card in sync at somewhere between 60 to 75 Hz / fps... anything higher than that would both be overkill and result in my really wanting to see what kinda feckin' graphics card you've got
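if you want the arithmetic behind the coke/pepsi bit, here's a quick back-of-the-envelope python calc (just illustration):

for fps in (24, 30, 60, 75, 120, 121):
    print(f"{fps:>3} fps = {1000 / fps:6.2f} ms per frame")

24fps is ~41.7 ms a frame and 60fps is ~16.7, a gap you can actually see; 120 vs 121 is 8.33 vs 8.26, a 0.07 ms difference nobody is picking out by eye.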
honestly if i'm getting 20 fps though i'm not that dicked about it.
etc.
the monitor refresh will limit the maximum FPS but it won't necessarily affect it, i.e. if your monitor is at 60Hz then the game won't run above 60fps. if you're running at 30 FPS then drawing and processing the game is taking more time than it takes for the monitor to update, so getting a higher FPS would rely on getting new hardware, turning off graphics options or having lower-detail (or fewer) models on screen.
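here's a rough python mock-up of that ceiling effect (a sketch of vsync-style behaviour under assumed numbers, not how any particular driver actually does it):

import time

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def run(render_time, frames=60):
    # simulate a game loop where the frame swap waits for the next screen refresh
    start = time.perf_counter()
    next_vblank = start + REFRESH_INTERVAL
    for _ in range(frames):
        time.sleep(render_time)              # pretend the card takes this long to draw
        while next_vblank < time.perf_counter():
            next_vblank += REFRESH_INTERVAL  # skip any refreshes we already missed
        wait = next_vblank - time.perf_counter()
        if wait > 0:
            time.sleep(wait)                 # swap waits for the next refresh
        next_vblank += REFRESH_INTERVAL
    return frames / (time.perf_counter() - start)

print(round(run(0.005)))   # fast card: pinned at ~60 by the monitor
print(round(run(0.030)))   # slow card: ~30fps, and a faster monitor wouldn't change it

the first run is limited by the monitor, the second by the card, which is exactly the point above.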
All I know to add to Hedgie's collection is that cartoons and model stuff like Wallace and Gromit is done at 60fps. feckin hell, I'd get pansy off wi that :]
I thought all TV shows ran at about 60 as well, tho.
You don't want too high an FPS though, or it'll get to the stage where it's running too smooth and just looks... weird.