So you're always on the right 60Hz timing? Why are there so many stupid people around here? Dotabuff used to gather smart ones; now it's filled with people who are also fucking braindead.
The problem is that a 144Hz server would cost five times more than a 60Hz one. That's why they keep server tickrates at 60Hz. There are some COD servers that run at a 120Hz tickrate.
Tickrate won't matter in a top-down RTS as much as it does in other genres like FPS. Trust me, I've compared Counter-Strike on a 64-tick server and a 128-tick server; the difference is glaringly obvious. Now try a third-party host running 120-tick Dota servers and try to tell the difference from matchmaking servers: you won't feel much.
Put all that crap aside. I'm asking why Puppey is currently playing at TI on a 144Hz monitor if the server is 60Hz. I'm not comparing CS and Dota or anything; I'm just asking a technical question, not one about how humans perceive it.
All of it is crap made by the PC master race crowd. I mean, the universe itself is locked at 23.976 frames per second. All of this is bullshit.
'Cause even if the game only gives you positioning info at 60Hz, the client on your side runs animations and such, and a higher refresh rate makes those appear smoother. That's my guess anyway. Whether it makes you play any better is super questionable, though; it just looks nicer.
It's not like you're gonna gain 2k MMR when you upgrade your monitor from 60 to 144Hz.
It's so simple: the server sends information in packets 60 times a second, but your GPU renders the game at 144 FPS. So you see positional changes in steps of 1/60 of a second, drawn at 144 FPS.
It's not that there are exceptions; in general, everything that happens in the game has to be sent to the server and back.
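To picture it (a toy sketch with made-up names and numbers, not actual Dota code; Source-style engines do something like this via entity interpolation):

```python
# The server reports positions 60x/s, but the client renders 144 frames/s
# by interpolating BETWEEN the last two reports it received.
TICK_DT = 1 / 60     # seconds between server snapshots
FRAME_DT = 1 / 144   # seconds between rendered frames

def lerp(a, b, t):
    """Linear interpolation between two snapshot values."""
    return a + (b - a) * t

prev_snap = {"time": 0.0,     "hero_x": 100.0}  # the last two positions
next_snap = {"time": TICK_DT, "hero_x": 106.0}  # the server actually sent

for frame in range(4):
    render_time = frame * FRAME_DT
    t = min((render_time - prev_snap["time"]) / TICK_DT, 1.0)
    x = lerp(prev_snap["hero_x"], next_snap["hero_x"], t)
    print(f"frame {frame} at {render_time * 1000:5.2f} ms -> hero drawn at x = {x:.2f}")
```

The positions drawn at frames 1 and 2 were never sent by the server; the client invents them, which is exactly why 144 FPS looks smoother even on a 60Hz server.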
So my question, once again, is: why do things appear smoother at 144 FPS (I tested this myself; I own a 1440p 144Hz monitor) if the server only sends info about everything happening in the game at 60Hz?
Also, after a year of playing at 144Hz, I tried playing at 60Hz and it felt like the game was lagging. I was like, wtf is this, it lags so much it's almost unplayable.
I have literally no clue about the technical side of this topic (an absolute zero in this stuff), so don't blame me if my question is stupid.
if the server sends you packets at 60Hz and your device displays them at 144Hz, the two are slightly desynchronized and your image gets a bit delayed, because the intervals between input and output don't coincide: 144/60 = 2.4, not an integer, so there's a fractional offset that shifts with each cycle. At 60fps they'd be perfectly synchronized, since (i guess, i might be wrong here) the starting point is the same. As a result, I assume that at 144Hz we get a drifting fraction-of-a-frame delay on average, while at 60Hz we get none, so 60 seems more efficient.
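If you run those numbers exactly (plain arithmetic, assuming both clocks start at zero and never drift, which real hardware won't guarantee):

```python
from fractions import Fraction
import math

def delay_to_next_frame(tick_hz, frame_hz, ticks=5):
    """How long after each server tick does the next rendered frame appear?
    Exact rational arithmetic to avoid floating-point noise."""
    tick_dt, frame_dt = Fraction(1, tick_hz), Fraction(1, frame_hz)
    for n in range(ticks):
        tick_time = n * tick_dt
        next_frame = math.ceil(tick_time / frame_dt) * frame_dt
        print(f"tick {n}: next frame in {float(next_frame - tick_time) * 1000:.2f} ms")

delay_to_next_frame(60, 60)    # perfectly aligned: always 0.00 ms
delay_to_next_frame(60, 144)   # drifts: 0.00, 4.17, 1.39, 5.56, 2.78 ms, ...
```

Under the perfect-alignment assumption the 60fps case does look better, but real clocks never stay phase-locked; with a random offset the average wait is half a frame, i.e. about 8.3 ms at 60fps vs. about 3.5 ms at 144fps, and the drift never exceeds a single frame interval.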
The server's tickrate and your in-game FPS (synchronized with your screen) have nothing to do with each other.
So if you're running a 144Hz screen at a constant 140+ FPS, you shouldn't be bothered about the server's tickrate.
It's simply not the same thing.
And to answer your question:
Things appear smoother because the tickrate isn't tied to your in-game FPS and your screen.
We don't know the code, so it's probably impossible to say, but it's likely the client smooths some things over, which makes it appear less laggy at higher FPS. Maybe it doesn't just say "hey, the hero is at position (x, y) right now"; maybe it sends the last position (x, y) plus the current position (x2, y2) and the client animates between them, or maybe it says the hero is at (x, y), moving in this direction at this speed, and you get course corrections later. It's obviously doing some kind of buffering client-side.
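Both of those guesses describe real techniques, by the way: interpolation (draw between the last two known positions) and extrapolation, a.k.a. dead reckoning (project forward, correct later). A toy version of each (hypothetical, nothing from Valve's actual code):

```python
FRAME_DT = 1 / 144

def interpolate(x0, x1, alpha):
    """Draw between the two LAST KNOWN server positions. Adds a little
    latency (you're rendering the recent past) but never guesses wrong."""
    return x0 + (x1 - x0) * alpha

def extrapolate(x, velocity, dt):
    """Dead reckoning: project the last known position forward along its
    reported velocity. No added latency, but it needs exactly the 'course
    corrections later' mentioned above when the next update disagrees."""
    return x + velocity * dt

# The last two snapshots put the hero at x=100, then x=106, moving 360 units/s:
print(interpolate(100.0, 106.0, 0.5))       # 103.0 -> halfway between ticks
print(extrapolate(106.0, 360.0, FRAME_DT))  # 108.5 -> a guess past the last tick
```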
Oh god, stop it.
Just because tickrate is measured in Hz doesn't mean it's the SAME SHIT as the fucking Hz of your computer screen.
Don't be fucking dense. What triggers me the most is that EVERY question similar to this one is already answered on the Internet.
You're just too fucking lazy to Google it.
People are just too braindead to recognize the difference between overall FPS/Hz and netcode, which is a completely different thing from your IN FUCKING GAME fps.
Stupid animals
There's no reason to insult people for asking an honest question. They're curious about the topic and want to find the answer, so let them. It's not like everyone has tons of experience in this field, and anyone dealing with an unfamiliar field will ask questions that seem stupid to the experts.
Anyway, to simplify it a bit: the server sends information every 60th of a second. This information generally takes the form of commands to your client, such as "reduce Puck's HP by 100" or "Sven casts Storm Hammer on Lina". Your client then carries out the commands and keeps doing so until it has a reason not to. In the case of Sven casting his stun, the server tells your client that he's casting it and then doesn't need to say anything else. Your client starts the animation and plays it until it either ends or the server sends a command to interrupt it (if Sven himself gets stunned, for example).
So your client can play the whole Sven stun animation without the server saying anything. How smooth that animation appears is then determined by your monitor's refresh rate and your game's FPS. At low refresh rates the animation looks jumpy and laggy; at higher refresh rates it appears smooth and fluid. Both take the same amount of time to cast the stun.
Now, about the difference between tick rate and frame rate: any command the server sends is visually applied on the next frame. If Sven gets stunned, the frame after your client receives the command will show the cast animation stop and be replaced by the stunned animation. If your frame rate is very low (under 20 FPS), the delay between receiving the command and the next frame is enough to cut into your reaction time. At higher FPS this delay is negligible.
Having a 144Hz monitor probably means incoming commands get displayed slightly sooner than on a 60Hz one, but the difference is so small it has no real effect. The main difference is that the game appears to flow better because your monitor displays animations more fluidly.
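The Sven example in code form (made-up cast time, purely to show that the frame count and the tick count are independent):

```python
import math

CAST_TIME = 0.3  # seconds; a hypothetical cast point, not Sven's real numbers

def animation_frames(frame_rate):
    """One 'start casting' command from the server, then the client draws
    the whole animation on its own, one step per rendered frame."""
    return math.ceil(CAST_TIME * frame_rate)

# Same 0.3 s of wall-clock time either way; the 144 fps client just slices
# it into more in-between images, which is why it looks smoother:
print(animation_frames(60))    # 18 frames
print(animation_frames(144))   # 44 frames (43.2 rounded up)
```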
yes there is a reason. You come to the Dotabuff forum, where mostly egotistical jerks or uncontained monkeys hang out, to ask such an easy question.
Google the freaking thing first. Do you really think they'll answer you correctly? Most of them are here for memes, not for this kind of shit.
oh, and to add:
rich kids buying 144Hz screens and i7/Ryzen 7 rigs or whatever to achieve something they have no clue about..
yeah
that's what's worth insulting
The only thing that can be linked to your framerate in terms of sending information is your CLIENT.
So the CLIENT can send information at the "rate" of your framerate, and that's not connected to the TICKRATE. The SERVER, on the other hand, CANNOT send information faster than the tickrate itself. So if the tickrate is 60Hz, the server has to send info in increments delivered every 1/60th of a second.
What this means is that the SERVER will get your information faster/earlier (which can change the outcome of a game in a very small number of cases), and the next package of information the SERVER sends to a CLIENT will report an event sooner than it would if your FRAMERATE were lower (60 vs 144 FPS).
So you basically get information faster if your framerate is higher, but that's capped by the TICKRATE the SERVER operates on. A 60Hz TICKRATE and 144 FPS on your game/CLIENT interleave unevenly, so each tick picks up the 144 FPS stream of information only in batches; it can't keep up with every individual frame.
Sorry if I confused you.
@Twitch.tv/Vertoxity
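A rough sketch of the handoff Vertoxity is describing (hypothetical structure, nothing from the actual engine): the client samples your input every frame, but the server only consumes what has been queued on each 60Hz tick.

```python
TICK_DT = 1 / 60     # server consumes queued inputs at this interval
FRAME_DT = 1 / 144   # client samples input once per rendered frame

pending_inputs = []  # commands queued client-side, waiting for a tick

def client_frame(n):
    """Runs 144x/s: poll the mouse/keyboard and queue the command."""
    pending_inputs.append(("move_command", n * FRAME_DT))

def server_tick(n):
    """Runs 60x/s: everything queued since the last tick gets processed
    together, no matter how many frames produced it."""
    cutoff = n * TICK_DT
    batch = [cmd for cmd in pending_inputs if cmd[1] <= cutoff]
    del pending_inputs[:len(batch)]
    print(f"tick {n}: processed {len(batch)} queued input(s)")

for f in range(5):   # ~5 frames fit inside 2 ticks at 144 fps
    client_frame(f)
server_tick(1)       # -> 3 inputs (the frames before t = 1/60 s)
server_tick(2)       # -> 2 inputs (the rest)
```

A higher frame rate gets your click into the queue a few milliseconds earlier, but it still leaves on the same tick boundary; that's the cap.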
I don't think you're right (about expensive screens and CPUs); there are people who simply enjoy gaming and graphical detail. Of course there are kids who get that kind of hardware with zero insight into what it is, but I don't think the majority of hardware of that quality is bought for kids.
@Scy, yes, there are slight delays between sending and receiving data when your frame rate doesn't match the tick rate, but the delay is so small it probably won't make a difference in almost any scenario. A 144Hz monitor draws 2.4 frames for every server tick, while a 60Hz monitor draws one frame per tick. When you receive a package from the server, the game shows it on the next frame. Assuming you can react instantly (which you can't), your command is registered on the next frame and sent on the next available server tick.
At 144Hz: you receive the server tick, it's displayed on the next frame, your command is input on the frame after that, and it's sent on the next server tick. Total time: 2 server ticks.
At 60Hz: you receive the server tick, it's displayed on the next frame, another server tick passes, your command is input on the next frame, and it's sent on the next server tick. Total time: 3 server ticks.
So on a 60Hz monitor you technically have a delay of 1/60th of a second compared to someone on 144Hz. Which, compared with human reaction times and ping delays, is not noticeable at all.
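Putting numbers on that, under the same generous assumptions (zero ping, instant reaction, clocks aligned at t = 0):

```python
import math

TICK_DT = 1 / 60   # server tick interval

def response_sent_at(frame_hz):
    """An event arrives on the tick at t = 0. Assuming a zero-ping, instant
    (superhuman) reaction, when does the reply leave on a server tick?"""
    frame_dt = 1 / frame_hz
    shown_at = frame_dt                     # event drawn on the next frame
    input_at = shown_at + frame_dt          # your click sampled a frame later
    return math.ceil(input_at / TICK_DT) * TICK_DT  # picked up by the next tick

print(f"144 fps: reply leaves at {response_sent_at(144) * 1000:.1f} ms")  # 16.7 ms
print(f" 60 fps: reply leaves at {response_sent_at(60) * 1000:.1f} ms")   # 33.3 ms
# Exactly one tick (1/60 s) apart -- real, but buried under ping and
# ~200 ms of human reaction time.
```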
And how would you explain the fact that, after getting used to 144 FPS, the game is almost unplayable for me at 60Hz? The game is just much smoother and more fluid than at 60Hz. It's not just "noticeable"; it's a night-and-day difference (for me personally).
It visually looks smoother, and it visually looks like it reacts faster to your inputs, but as far as the server is concerned, it shouldn't make a noticeable difference to how quickly your commands are registered.
Tickrate = how many times per second data is exchanged with the server, kind of thing
Fps = u know la
Higher fps simply means smoother graphics and inputs, which gives you consistency to some extent.
Hell, the difference between 60 fps and 100 fps can be felt even on 60Hz monitors because inputs are sampled faster.
Also, they're playing while promoting their sponsors' products; why would BenQ/ASUS/whatever-monitor-brand want to showcase their 60Hz monitor instead of the 144Hz one?
i really don't think it matters either way for this game in almost all circumstances. The only exception (and only if the tickrate were higher than 60) is blink/Phase Shift dodges and Fire Remnant, but if you care that much, just get the 144Hz to be sure; it will be very valuable for FPS games anyway.
well, there's your answer: it mattered to them because they were used to it. But go buy the 144Hz.
You'll notice they're talking about cast animations and mouse movements; these are independent of server tickrate.
I've owned a 144Hz monitor for a few years now, sir (and a system/PC that can actually render 144+ FPS).
The game still renders everything at your fps no matter the tickrate. Your position might only update to the server on ticks, but ambient effects, movement animations, cursor movement, particle effects, etc. all run at your fps rate because they happen independently of your connection to the server.
aka when your internet goes out and the game freezes, the turtles still crawl around, SF's fiery arcana still flickers, the water still ripples, etc.
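That split is easy to picture (made-up structure, not actual engine internals): server-authoritative state only changes when a snapshot arrives, while client-only effects advance from local state on every rendered frame.

```python
class World:
    def __init__(self):
        self.hero_x = 100.0       # server-authoritative: heroes, HP, cooldowns
        self.ripple_phase = 0.0   # client-only ambient effect

    def render_frame(self, dt, snapshot=None):
        if snapshot is not None:             # a server update arrived this frame
            self.hero_x = snapshot["hero_x"]
        self.ripple_phase += dt              # animates even with no snapshot
        print(f"hero_x = {self.hero_x:.0f}, ripple_phase = {self.ripple_phase:.4f}")

w = World()
w.render_frame(1 / 144, snapshot={"hero_x": 106.0})
w.render_frame(1 / 144)   # "internet goes out": the hero freezes in place...
w.render_frame(1 / 144)   # ...but the water keeps rippling every frame
```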
So I've read on Reddit that the tickrate of Dota 2 servers is 60Hz. If that's true, there would be no point in playing the game on 144/240Hz monitors, since the game itself only updates 60 times per second.
Still, many people, myself included, and even the pros at LANs and such, play at refresh rates higher than 60. I can confirm the night-and-day difference between 60Hz and 144Hz myself. But what's happening? How can I see more than 60 FPS if the server itself doesn't send that information?