

1080ti Meh

1080ti low fps bacon


Balc0ra #41 Posted Dec 05 2017 - 05:44

    Sergeant

  • -Players-
  • 1325 battles
  • 136
  • Member since:
    03-10-2016

View PostLesser_Spotted_Panzer, on Dec 03 2017 - 15:56, said:

You are hitting the cap of the game engine.

Hope you didn't buy a 1080ti just for this game; a 1070ti is more than good enough.

Anything over 30 fps and you won't visually see any difference.

 

View PostVinnyI82, on Dec 03 2017 - 16:09, said:

 

Uh, no, this is completely incorrect. Many people can tell the difference, visually, in a game between 30, 60, 90, and even higher fps.

My threshold is around 90 fps: lower and I will notice, higher and I won't.

 

What the eye can see and how you perceive it are two different things. You notice it instantly if you run a game at 30 and then play the same game at 60 right after; the smoothness is what you notice. COD always ran at 60 fps even on last-gen machines, and you notice a huge difference in animation smoothness vs 30 fps shooters on console.

 

If you play a game at 60, you'd notice it instantly if it suddenly dropped to 30. But from 120 to 90? Not many would notice that, even in smoothness, tbh. Or even between 144 and 127. In the lower range, from 60 fps and down, sure. But higher up? Hardly.

Viserion_Dies #42 Posted Dec 05 2017 - 05:56

    Major

  • Players
  • 30339 battles
  • 5,111
  • [FADED] FADED
  • Member since:
    06-19-2013

View PostBalc0ra, on Dec 05 2017 - 04:44, said:

What the eye can see and how you perceive it are two different things. You notice it instantly if you run a game at 30 and then play the same game at 60 right after; the smoothness is what you notice. COD always ran at 60 fps even on last-gen machines, and you notice a huge difference in animation smoothness vs 30 fps shooters on console.

If you play a game at 60, you'd notice it instantly if it suddenly dropped to 30. But from 120 to 90? Not many would notice that, even in smoothness, tbh. Or even between 144 and 127. In the lower range, from 60 fps and down, sure. But higher up? Hardly.

Incorrect. The human eye cannot see anything above 30. Your TV runs at 29; is that not smooth? Because smoothness and fps don't mean the same thing, nor do they correlate.

Hurk #43 Posted Dec 05 2017 - 18:26

    Major

  • Players
  • 46856 battles
  • 14,348
  • [KGR] KGR
  • Member since:
    09-30-2012

View PostViserion_Dies, on Dec 04 2017 - 21:56, said:

Incorrect. The human eye cannot see anything above 30. Your TV runs at 29; is that not smooth? Because smoothness and fps don't mean the same thing, nor do they correlate.

Again, this is completely false. The only thing that ~22 fps (not 29) does is tell your brain "stop showing me still images, show me motion", and it only works because the frame rate is consistent.

 

People have been tested to be able to detect subtle changes in images at 400+ fps, and at even higher rates for large changes (such as a cell transitioning from black to white and back).

 

Your senses sample your environment THOUSANDS of times per second. Your brain just SORTS that information into something usable so that you are not overloaded with information about every speck of dust in your view.



xperiment2g #44 Posted Dec 05 2017 - 18:37

    Sergeant

  • Players
  • 27620 battles
  • 175
  • [DHO4] DHO4
  • Member since:
    06-17-2013

View PostColonelShakes, on Dec 03 2017 - 07:36, said:

 

Must be annoying, as almost every single light has a frequency of 60 Hz. What's it like living most of your life with lights blinking on and off constantly?

 

A fantastic lie, your 90... Not true, however.

 

^^^ This... there is so much misinformation out in the industry it makes me laugh.

Diomidis #45 Posted Dec 05 2017 - 18:40

    Sergeant

  • Beta Testers
  • 24240 battles
  • 134
  • [AMPED] AMPED
  • Member since:
    09-24-2010

Long story short: 

  1. The human eye (and brain) has been shown in lab tests to distinguish and identify flicker at 500 Hz, and in some cases even 1000 Hz. If you want to translate that to 1000 FPS, cool; it is not the same thing, but saying we cannot see more than 24 or 30 or whatever FPS is plainly ignorant. Some people are born with better perception/reactions, but video gamers are also TRAINED for it by spending time and effort paying attention to the details that give them the edge.
  2. Buying overkill GPUs for 60 Hz panels is silly: a 60 Hz panel cannot display more than 60 FPS, plain and simple, so you are wasting your money. Sure, a 10-20% overhead to ensure 60 FPS all-around is not a waste, but a 780Ti at 1080p was and still is overkill for most games with remotely good optimization.
  3. High refresh rates are noticeable only in dynamic scenes; in slow or static scenes you will rarely see any difference beyond 30 FPS. But this is a game with action, and like other games where fast-moving things "pull the trigger till it goes click", 30 FPS doesn't and won't cut it. Plain and simple. Most movies and your favorite soap operas will look fine at 30 FPS.

Edited by Diomidis, Dec 05 2017 - 18:40.


Hurk #46 Posted Dec 05 2017 - 19:10

    Major

  • Players
  • 46856 battles
  • 14,348
  • [KGR] KGR
  • Member since:
    09-30-2012

60 Hz isn't the limit you make it out to be. Some AA and filtering techniques use multiple frames of data to build one displayed frame, so a 60 fps display may effectively be fed by ~240-300 frames' worth of data. That's how those effects work.

While a 1080 is "overkill" for 1080p, it's not overkill if what you're after is bells and whistles and some future-proofing.

 

A rock-steady 60 fps is a better gameplay experience than a variable 90-150 fps that fluctuates.



Charlie_The_Cat #47 Posted Dec 05 2017 - 20:12

    First lieutenant

  • -Players-
  • 12656 battles
  • 937
  • Member since:
    05-06-2015

View PostAltansar, on Dec 03 2017 - 09:30, said:

So I picked up a 1080ti to replace my really old 780ti, which I use to play on a 1080p, 144 Hz monitor. I was expecting to get 144 fps with my 1080ti; however, I'm only getting 120-127 with random drops to 100. The GPU is only showing 40-50% load and my CPU only 20-22% usage. Is it normal to only get 100-127 fps, or should I be expecting more?

 

The computer is set to max power, same with the card. Nothing stands out as limiting the GPU.

 

CPU: i7-4770K 3.5 GHz

GPU: EVGA 1080ti

PSU: 1000 g2

Mobo: Z87-HD3

Windows 7 64-bit (Ultimate Edition)

 

 

Solved: the game cap is 127.

 

The WoT graphics engine's max FPS is ~120, which is why you can't go any higher at the resolution you are playing.

Uilleam72 #48 Posted Dec 05 2017 - 21:59

    Sergeant

  • -Players-
  • 12927 battles
  • 125
  • Member since:
    03-18-2016

View PostHurk, on Dec 04 2017 - 16:21, said:

Incorrect. Power supplies are designed to operate at 50-80% load; that is the point where they are most efficient at converting AC to DC. Anything below 50%, and most PSUs are generating waste heat and using more power to produce less DC power than they would at 50-80% load.

https://www.anandtec...er-supply-myths

 

 

 

Thanks, very informative. In that case I'd still prefer inefficiency over a PSU getting close to its max load. I've been working on computers for over 20 years, and I've seen many underpowered PSUs die or wear out from not having that extra wattage.

Jaspah #49 Posted Dec 08 2017 - 00:18

    Corporal

  • Players
  • 9321 battles
  • 46
  • [MAHOU] MAHOU
  • Member since:
    04-26-2011

You can uncap the frame rate by editing the engine_config.xml file buried in your World of Tanks game folder. The game engine is not limited to 120; it is just configured not to go any higher. You have to use a third-party tool to de-obfuscate the XML file before you can modify it. This video explains how to do it... but if you break your game, I'm not going to help you. I know the tweak works because I'm currently using it, and I haven't had any issues.
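
If you'd rather script the change than hand-edit it, here's a minimal sketch of the idea in Python, assuming the file has already been de-obfuscated with the third-party tool. The <maxFrameRate> tag name and the install path are illustrative guesses, not confirmed names from WoT's actual config; check your own decoded file for the real tag.

```python
# Hypothetical sketch: raise an XML frame-rate cap in a decoded engine_config.xml.
# Tag name and path are assumptions for illustration only.
import xml.etree.ElementTree as ET

CONFIG = r"C:\Games\World_of_Tanks\engine_config.xml"  # adjust to your install

tree = ET.parse(CONFIG)
root = tree.getroot()

# Find the frame-rate cap node (hypothetical name) and raise it to 144.
cap = root.find(".//maxFrameRate")
if cap is not None:
    cap.text = "144"
    tree.write(CONFIG, encoding="utf-8", xml_declaration=True)
    print("Frame cap set to", cap.text)
else:
    print("No <maxFrameRate> element found; the real tag may be named differently.")
```

Back up the original file first; a malformed engine_config.xml could stop the client from launching.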

 

I run the game with a 1080 Ti as well, on a 1440p 144 Hz G-Sync monitor. I've got everything maxed out except grass (turned off), and my FPS never drops below 130 (though it never quite maxes out at 144 either, even when looking at just the skybox).

 

Alternatively, you can wait until the HD maps come out, because it seems like Wargaming removed the 120 fps cap when I played on the Sandbox server.

 

Also, you should update to Windows 10 so you can take full advantage of that card. Windows 7 is not going to let you run newer DX12 titles, and I don't even know whether the Windows 7 drivers are as well optimized as the Windows 10 ones... at least for newer cards, anyway.


Edited by Jaspah, Dec 08 2017 - 00:19.


Jer1413 #50 Posted Dec 08 2017 - 01:21

    First lieutenant

  • Players
  • 40630 battles
  • 832
  • [RR13] RR13
  • Member since:
    02-24-2013

All I want to know is whether I should bother running at 105 fps on a 60 Hz monitor or just sync to the monitor at 60 fps.

 

The only difference I can see in gameplay is my GPU running at 95-100% rather than about 60% when synced.

 

 



sliderone #51 Posted Dec 08 2017 - 01:27

    Corporal

  • Players
  • 14572 battles
  • 45
  • [ICY] ICY
  • Member since:
    01-10-2013
Can't wait to try it out on my new ROG G703VI (should be here next week). 144 Hz!

Icon_Charlie #52 Posted Dec 08 2017 - 02:23

    First lieutenant

  • Players
  • 14184 battles
  • 698
  • [-DRB-] -DRB-
  • Member since:
    09-18-2013

View PostUilleam72, on Dec 05 2017 - 12:59, said:

 

Thanks, very informative. In that case I'd still prefer inefficiency over a PSU getting close to its max load. I've been working on computers for over 20 years, and I've seen many underpowered PSUs die or wear out from not having that extra wattage.

The information in that AnandTech article is rather old, and though it is useful in certain ways, I don't particularly use it. As a rig builder with 27 years of experience, I will say the following: the down-and-dirty formula that I use is

 

(CPU wattage + idle video card wattage) × 2 + 50.

 

Example: my Ryzen 1800X is 95 watts, and my EVGA Superclocked 1050 Ti (excellent video card) idles at 75 watts (128 under load). (95 + 75) × 2 = 340, and 340 + 50 = 390 watts, which I round up to 400. This formula is overkill, but I have had no issue getting the right PSU (note it also accounts for 1 HDD, 1 DVD drive, and memory).
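
As a minimal sketch, the same formula in code (the figures come from the example above; rounding up to the next 50-watt step is my reading of "I round up"):

```python
# Down-and-dirty PSU sizing per the formula above:
# (CPU wattage + idle GPU wattage) * 2 + 50, rounded up to the next 50 W.
import math

def recommended_psu_watts(cpu_tdp: float, gpu_idle: float) -> int:
    raw = (cpu_tdp + gpu_idle) * 2 + 50
    return math.ceil(raw / 50) * 50  # round up to a tidy PSU size

# Ryzen 1800X (95 W) + EVGA 1050 Ti idle (75 W) -> 390 W, rounded to 400 W
print(recommended_psu_watts(95, 75))  # 400
```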

 

From there you have to make your decision: a cheap PSU or a more expensive one. I prefer at least an 80 Plus rating on my PSU.

 

Cheapies are older designs rated at between 50% (mostly) and 70% efficiency. In simple terms, a 50%-efficient PSU turns half of the power it draws from the wall into heat; only the rest reaches the computer as DC power under load. So with a cheapie PSU you will want between a 600 and 800 watt unit to play it safe (at 70% to 50% efficiency). You can get away with a smaller PSU, but I won't, and I have had no PSU failures in the rigs I build. That is because I have so much extra, just in case.
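
To make the efficiency point concrete, here is a quick sketch under the standard interpretation that the wattage label is DC output and the efficiency rating governs how much AC the unit pulls from the wall:

```python
# Wall draw needed to deliver a given DC load at a given efficiency;
# the difference between the two numbers becomes heat inside the PSU.
def wall_draw_watts(dc_load: float, efficiency: float) -> float:
    return dc_load / efficiency

for eff in (0.5, 0.7, 0.9):
    print(f"400 W DC load at {eff:.0%} efficiency -> "
          f"{wall_draw_watts(400, eff):.0f} W from the wall")
# 50% -> 800 W, 70% -> ~571 W, 90% -> ~444 W
```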

 

In my case I chose a Gold-rated PSU (90% efficiency) at 650 watts. This gives me plenty of headroom to slap in a 1070 in the future.

 

Again, this is a down-and-dirty approach to choosing a proper PSU.

 

Look here for more detail on this subject.

https://en.wikipedia.org/wiki/80_Plus 

 

Secondly:

I clean out my rig every 6 months, and if I have a PSU that is 3 years or older, I open it up (with rubber gloves and a horsehair brush) and clean the insides. Yes, it voids the warranty, but who cares: at that age the warranty has probably expired anyhow. Dust is a major killer of electronics.

 

My oldest rig is 17 years old, updated through the years, and its PSU is 10 years old. Ancient, but the computer still works and has some publishing/art software I still use. It's all about taking care of your computer, buying the right parts, and spending where you need to spend.

Edited by Icon_Charlie, Dec 08 2017 - 02:26.


Jaspah #53 Posted Dec 09 2017 - 01:55

    Corporal

  • Players
  • 9321 battles
  • 46
  • [MAHOU] MAHOU
  • Member since:
    04-26-2011

View PostJer1413, on Dec 07 2017 - 19:21, said:

All I want to know is whether I should bother running at 105 fps on a 60 Hz monitor or just sync to the monitor at 60 fps.

 

The only difference I can see in gameplay is my GPU running at 95-100% rather than about 60% when synced.

 

 

 

Vsync always introduces latency. G-Sync and FreeSync introduce latency as well, but nowhere near the amount that Vsync does. In a slower game like World of Tanks you can probably turn Vsync on and not notice a difference, but in faster-paced games like CS:GO you should turn it off or get a G-Sync/FreeSync monitor.
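
As a rough sketch of the frame-time arithmetic behind that point: with Vsync on, a finished frame can sit in the buffer for up to one full refresh interval before the panel shows it. This models only the display-wait component, not render or input latency:

```python
# Worst-case extra display latency added by Vsync: one refresh period.
def refresh_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz} Hz panel: up to {refresh_interval_ms(hz):.1f} ms "
          "of added display latency with Vsync")
# 60 Hz -> 16.7 ms; 144 Hz -> 6.9 ms
```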



Asassian7 #54 Posted Dec 09 2017 - 05:51

    Major

  • Players
  • 24058 battles
  • 11,041
  • Member since:
    12-26-2011

View Postboysenbeary, on Dec 05 2017 - 06:11, said:

Some of the things said in the replies remind me of the ol' (debunked) console argument that the human eye can't see over X fps.

And then some more remind me of "people who think they know what they're talking about but actually don't".



Asassian7 #55 Posted Dec 09 2017 - 08:17

    Major

  • Players
  • 24058 battles
  • 11,041
  • Member since:
    12-26-2011

View PostColonelShakes, on Dec 04 2017 - 03:36, said:

 

Must be annoying, as almost every single light has a frequency of 60 Hz. What's it like living most of your life with lights blinking on and off constantly?

 

A fantastic lie, your 90... Not true, however.

This is completely incorrect; you're trying to compare apples to oranges.

 

Most light bulbs run off power delivered straight from the power grid. In the US I believe that is 110 V AC. Usually it is then converted from AC to DC, as most appliances run on DC voltage. This is an ANALOGUE signal; it's a constant signal. There are no "frames" or on/off cycles to create a flickering effect in a light bulb.

 

What you are describing would be done by a DIGITAL signal, and for that you would need a digital microcontroller controlling the cycling of the bulbs. Digital is the 1s and 0s, on or off. And unless you have an extremely modern house filled with LED lighting (a regular light bulb would burn out pretty fast from being flicked on and off X times a second constantly), I highly doubt you have a microcontroller controlling your lights.

 

And besides, you can see lights flickering on and off at high frequencies. I see it all the time, most often when those big tube fluorescent lights are dying. It's annoying and gives me a headache, and they are probably doing it at 100+ Hz. So your analogy is wrong anyway.



Icon_Charlie #56 Posted Dec 09 2017 - 16:58

    First lieutenant

  • Players
  • 14184 battles
  • 698
  • [-DRB-] -DRB-
  • Member since:
    09-18-2013

View PostAsassian7, on Dec 08 2017 - 23:17, said:

This is completely incorrect; you're trying to compare apples to oranges.

Most light bulbs run off power delivered straight from the power grid. In the US I believe that is 110 V AC. Usually it is then converted from AC to DC, as most appliances run on DC voltage. This is an ANALOGUE signal; it's a constant signal. There are no "frames" or on/off cycles to create a flickering effect in a light bulb.

What you are describing would be done by a DIGITAL signal, and for that you would need a digital microcontroller controlling the cycling of the bulbs. Digital is the 1s and 0s, on or off. And unless you have an extremely modern house filled with LED lighting (a regular light bulb would burn out pretty fast from being flicked on and off X times a second constantly), I highly doubt you have a microcontroller controlling your lights.

And besides, you can see lights flickering on and off at high frequencies. I see it all the time, most often when those big tube fluorescent lights are dying. It's annoying and gives me a headache, and they are probably doing it at 100+ Hz. So your analogy is wrong anyway.

US grid power to the home, depending on the state, ranges from 110 (old) to 120 volts AC; most modern US homes get about 115 VAC. There is no AC-to-DC conversion at the wall outlet; all conversion to DC is done inside the appliance. Also, many US appliances run directly on VAC, such as freezers, TVs, and washing machines, while dryers and electric stoves mainly run on 220 to 240 VAC. One reason it is normally rated around 115 VAC is that you can usually survive an electric shock if an appliance shorts out (that's why you have a ground wire), but your heart can stop at higher voltage ranges, such as 220 to 240 VAC. Old-time electricians used wooden ladders, before going to plastic, to reduce the chance of getting shocked while working overhead.



Asassian7 #57 Posted Dec 09 2017 - 23:00

    Major

  • Players
  • 24058 battles
  • 11,041
  • Member since:
    12-26-2011

View PostIcon_Charlie, on Dec 10 2017 - 03:58, said:

US grid power to the home, depending on the state, ranges from 110 (old) to 120 volts AC; most modern US homes get about 115 VAC. There is no AC-to-DC conversion at the wall outlet; all conversion to DC is done inside the appliance. Also, many US appliances run directly on VAC, such as freezers, TVs, and washing machines, while dryers and electric stoves mainly run on 220 to 240 VAC. One reason it is normally rated around 115 VAC is that you can usually survive an electric shock if an appliance shorts out (that's why you have a ground wire), but your heart can stop at higher voltage ranges, such as 220 to 240 VAC. Old-time electricians used wooden ladders, before going to plastic, to reduce the chance of getting shocked while working overhead.

I'm not from the US, so I'm not sure of US voltages and how they're set up.

 

Over here it's 240 VAC. But yeah, I'm aware not everything runs on DC. For the less knowledgeable: the big bricks you see on plugs for appliances and such (the bricks on laptop chargers, for example) convert AC to DC, and also step the voltage down to what the appliance needs, most often 12 V.



Hurk #58 Posted Dec 10 2017 - 00:47

    Major

  • Players
  • 46856 battles
  • 14,348
  • [KGR] KGR
  • Member since:
    09-30-2012

Things plugged into AC in the US do flicker if they cool fast enough for you to notice; for instance, my LED Christmas lights noticeably flicker. Also note that 60 Hz CRT monitors were considered JUNK because you could see the flicker on every screen refresh; the good ones were 72 Hz or higher.

 

Flicker is very noticeable to many people, and it's why some people get headaches under fluorescent lighting. The strobe effect is flicker as the ballasts get weak or go bad; it's very easy to see the pulsing tied to the 60 Hz AC cycle.
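
For the curious, the arithmetic behind that pulsing, assuming a simple ballast or unsmoothed supply where light output peaks on both halves of the AC cycle:

```python
# Perceived flicker runs at twice the mains frequency, because light output
# peaks on both the positive and negative half-cycle of the AC waveform.
def flicker_hz(mains_hz: float) -> float:
    return 2 * mains_hz

print(flicker_hz(60))  # 120.0 Hz flicker on North American 60 Hz mains
print(flicker_hz(50))  # 100.0 Hz flicker on 50 Hz mains
```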

 

Oh, and Vsync does not slow anything down, since what it does is prevent the video card from doing extra work. The only time Vsync causes any issues is when you cannot feed 60 fps: once you start dropping below 60 a lot, Vsync switches to 30, and people notice that. It's not Vsync causing a slowdown; it's your crappy CPU not keeping up!






