Sat Sep 01, 2012 2:07 am
In a nutshell, most modern games use client-side hit detection: you see an enemy and shoot, your PC decides on the spot whether the shot registered as a hit, and it sends that result to the game server, which relays it to the other players, all within milliseconds.
Here's the best description I could find of the differences between server side and client side hit detection.
The simple answer is that yes, they both have issues... it's just that the server-side model has all of the client-side model's issues, plus a few more.
When you have a multiplayer game, every shot made has three relevant points of view: the shooter's, the server's, and the target's. Because of the delay in data broadcast, these three points of view never quite agree on when anything happens. It takes time for an event to propagate from one computer to the others, which causes discrepancies in what each one sees. No networking model can remove this delay, so instead each model has to decide which data gets presented with maximum fidelity, and to which PoV.
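To make that concrete, here's a toy timeline sketch (the delays are made-up illustrative numbers, not measurements from any game) showing when a single event becomes visible to each PoV:

```python
# Toy illustration of the three points of view: an event fires at t=0 on
# the shooter's machine and reaches the other PoVs only after transit delays.

def arrival_times(event_time, shooter_to_server_s, server_to_target_s):
    """When each PoV first learns about the event, in seconds."""
    server_time = event_time + shooter_to_server_s
    target_time = server_time + server_to_target_s
    return {"shooter": event_time, "server": server_time, "target": target_time}

# Assumed 50 ms one-way transit on each hop; each PoV sees a different moment.
print(arrival_times(0.0, 0.05, 0.05))
```

No matter which model a game picks, these three timestamps can never be made equal; the model only chooses whose view counts.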
For shooting, there are generally two methods: server-side non-compensated on the one hand, and client-side (or server-side latency-compensated) on the other.
The server-side, non-compensated method treats the server as the most important PoV. It's important to note that, because it's a PoV that no player ever sees. When the shooter makes a shot, he just sends a "firing" packet to the server, giving the location fired from and the direction the shot travels. There is a delay between when the shot is made and when the server processes it, which means the shooter has to shoot where his target will end up at a specific moment in the future, accounting for both his latency and the server tick rate (For RO2, that's his latency plus 0-50ms).

For a shot on a sprinting target when the shooter has 100ms ping (Significantly below average in RO2), that means the shot needs to be made about 0.5-0.75 meters ahead of the intended impact point, assuming the target doesn't change direction or speed in that time. If, when the server processes the shot, it intersects with a target, then it passes this information on to the target. At this point the server (Which has authority on the matter) has decided, say, that the target is dead, but the target does not know this until the data gets to his client, which takes a delay equal to their latency. Once that data gets there, the target dies, even if he's moved well past the point where he was shot. This is how we get the "dying behind cover" illusion. In exchange for all these disadvantages, it's harder to cheat the system.
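The 0.5-0.75 m figure can be reproduced with a quick back-of-the-envelope calculation. The ~5 m/s sprint speed here is my assumption, not a number taken from the game:

```python
# Back-of-the-envelope check of the lead figure quoted above.
# Assumption (mine, not from the game): a sprinting soldier moves ~5 m/s.

def required_lead_m(target_speed_mps, delay_s, tick_delay_s):
    """Meters ahead of the target the shooter must aim under the
    server-side, non-compensated model."""
    return target_speed_mps * (delay_s + tick_delay_s)

sprint_speed = 5.0   # m/s, assumed sprint speed
ping = 0.100         # the 100 ms ping from the example above
for tick in (0.000, 0.050):   # RO2 adds 0-50 ms of server tick delay
    print(round(required_lead_m(sprint_speed, ping, tick), 2))
# prints 0.5 then 0.75, the 0.5-0.75 m range quoted in the post
```

Double the ping and the required lead doubles too, which is why higher-latency players feel this model so acutely.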
On a client-side model, the data-flow is identical, but the authority is different. When the shooter makes a shot, his client now processes the shot to see if it hits something, right then. He doesn't have to lead for latency any more. His client sends in the info that he has fired (So other computers can see him make the shot), as well as saying if anything has been hit by his bullet. Once hits reach the server, the server typically will have to decide if the shot is invalid (Such as the target not being anywhere near where the hit was reported), and if valid, decides how much damage the hit will do, before passing this information on to the target. The server has authority over someone being alive or dead, so at this point the target is now dead, but again, there is a delay between when the shot is made and when the target hears about it.
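A minimal sketch of that flow (purely illustrative, nothing like RO2's actual code): the client ray-tests against its own view of the world, and the server only rejects reports that are implausible:

```python
# Client-side hit detection with server-side sanity checking (2D toy version).
# All names and numbers here are illustrative assumptions.

def client_fire(shooter_pos, aim_dir, targets):
    """Client-side: ray-test against the world as THIS client sees it.
    aim_dir is assumed to be a unit vector; targets are (id, pos, radius)."""
    for target_id, pos, radius in targets:
        to_target = (pos[0] - shooter_pos[0], pos[1] - shooter_pos[1])
        along = to_target[0] * aim_dir[0] + to_target[1] * aim_dir[1]
        if along <= 0:
            continue  # target is behind the shooter
        closest = (shooter_pos[0] + aim_dir[0] * along,
                   shooter_pos[1] + aim_dir[1] * along)
        miss = ((pos[0] - closest[0]) ** 2 + (pos[1] - closest[1]) ** 2) ** 0.5
        if miss <= radius:
            return {"hit": target_id, "at": closest}  # report sent to server
    return {"hit": None}

def server_validate(report, server_targets, tolerance=2.0):
    """Server-side: reject hits too far from where the server saw the target."""
    if report["hit"] is None:
        return False
    pos = server_targets[report["hit"]]
    dx, dy = report["at"][0] - pos[0], report["at"][1] - pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance
```

The `tolerance` is the key design knob: too tight and legitimate hits get dropped because of ordinary desync, too loose and the check stops catching cheaters.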
Then there's server-side latency compensation. The basic mechanics are like the server-side model, except each shot has a time stamp, and the server keeps a history of events, and will check against that history to see if a shot hits, rather than the current situation. In effect, it emulates client-side hit-detection, but has the advantage of the server-side's resistance to cheating. On the downside, it takes more processing power on the server's side (Which unfortunately makes it effectively out of the question for RO2, which is already pushing servers pretty hard).
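The history-rewind idea can be sketched in a few lines. This is a toy version; real implementations interpolate between snapshots and rewind full hitboxes, not just a single position:

```python
# Toy server-side lag compensation: keep a short history of target
# positions and rewind to the shot's timestamp before testing the hit.
import bisect

class LagCompensator:
    def __init__(self):
        self.history = {}  # target_id -> list of (timestamp, position)

    def record(self, target_id, timestamp, position):
        """Called each server tick to snapshot a target's position."""
        self.history.setdefault(target_id, []).append((timestamp, position))

    def position_at(self, target_id, timestamp):
        """Return the recorded position at or just before timestamp,
        i.e. where the target was when the shooter actually fired."""
        snaps = self.history.get(target_id, [])
        idx = bisect.bisect_right([t for t, _ in snaps], timestamp) - 1
        return snaps[idx][1] if idx >= 0 else None
```

The extra cost the post mentions is visible even in this sketch: the server must store and search per-target history every tick, on top of its normal simulation work.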
In the end, the client-side (Or latency-compensated) method preserves the shooting experience, which is kind of a big deal for an FPS, while the server-side method makes the shooting much less reliable for questionable gains. The best way to see the difference is to try playing on an RO2 server with the Antilag mutator running (Which will hopefully be more common once TWI pins down the bug in their system).
It's also telling to note which games use each model. Quake and Unreal use server-side. Source games use server-side with latency compensation. Almost every other game, including most games built on the Quake and Unreal engines, uses client-side hit-detection. Client-side has effectively been the go-to choice since internet gaming began, particularly back when getting "only" 250ms latency was a really good connection. I could play Rogue Spear with 250-300ms latency more reliably than I can play vanilla RO2 on the exceptionally rare occasions that I find a server with <100ms latency. The Antilag mutator makes a tremendous improvement there.
So get your head around this one and see how it applies to your in-game situation.
AND QUIT BITCHING AND LET OTHERS ENJOY THE GAME !!!
even when we are BIATCHING about the gameplay !!!
myself fuggin included :p
For you wiki nuts, here's this
Also, from the original thread on the RO2 forum, here's this statement :
Even with perfect infrastructure, if you're not playing on a LAN the ballistics model is squandered.
The game was designed around pings you will never get if you're not on a LAN. So from the very beginning, you're not playing the game the way it was meant to be played.
You're missing half of your shots not because you miscalculated the lead, but because of a random ping the moment you pulled the trigger.