It’s been another crazy week of re-architecting the way multiplayer match-making works to handle the number of users.
Today we are pleased to announce the new update:
Here’s what’s new for this week:
Now, that said, what’s NOT in this update?
And lastly, let’s talk about 5 on 5.
There have been other peer-to-peer games, but the connectivity complexity is C = D * N*(N-1), where N is the number of players and D is the raw data being exchanged. While Demigod supports 5-on-5 multiplayer in custom games, LAN, and single player, I have to tell you, unless you know who you are playing with, you're asking for trouble trying this with strangers. Even if you manage to find 10 random people to play with online and they all manage to connect to each other, once you get in game, because the game is synced, you're likely to have a non-ideal experience.
Let's work through the formula, assuming D = 10.
With a 3-on-3 game, which, from what I've seen in Supreme Commander, is the typical largest game people get into, C = 300. On Battle.net with Starcraft, the max number of players is 8, so C = 560. But when you go to 10 players, C = 900, roughly 60% more complexity than an 8-player game.
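To make those numbers concrete, here's a quick sketch of the formula in Python (the function name and the D = 10 default are mine for illustration, not anything from the game code):

```python
def connection_complexity(n_players, data_factor=10):
    """C = D * N * (N - 1): each of N peers exchanges data with the other N - 1."""
    return data_factor * n_players * (n_players - 1)

# The three cases from the post: 3v3, the 8-player Battle.net cap, and 5v5.
for n in (6, 8, 10):
    print(f"{n} players -> C = {connection_complexity(n)}")
```

Running it reproduces the 300 / 560 / 900 figures above, and shows why each extra player costs more than the last.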
Our work these last 2 weeks has been re-architecting the system so that we can bring down that D multiplier. That's the only part we can control.
When trying to connect players together, each player, based on their internet connection and PC performance, has their own threshold for how high C can get before things fall apart.
In the beta, we were able to bring our variable D down by adding more servers. And most beta testers had high-end connections and PCs, so they had relatively high thresholds before variable C caused problems.
But clearly, in release, that threshold turned out to be a lot lower. The first inkling that our D variable was way too high came when the pirates pounding away that first week managed to overwhelm our servers. To put things in perspective, Impulse typically gets around 300,000 users each day and doesn't even break a sweat. But a mere 140,000 connections that first week brought Demigod's online experience to its knees.
We were able to get a reprieve that first week by shuffling users off to a new set of servers, away from the warez users. But the game has continued to sell (it looks like it's going to break the 100k milestone shortly), so that reprieve can only be temporary.
So the thing we continue to do is try to lower that D variable so that more and more people with marginal systems can get in. There will likely be lots more updates as we find new ways to squeeze it. In the meantime, your best bet is not to go nuts with the number of players unless you know them.
We will probably have another small update tomorrow as we continue to refine this.
I think, and could very likely be wrong, the distinction is in the P2P method. Most FPS games seem to be clients connecting to one dedicated server, or to a host playing the game. Either way, that system has its problems too: things like host lag, the need to forward ports to be a host, or even being aware of the difference. We take it for granted because it's the norm and, well, tried and true. But whoever made the call (SD, GPG, Barack Obama) to implement P2P instead was being ambitious enough to try to do it a better way.
And, well, here we are. Best-laid plans, you know? It sucks even more when the ambitious ones fall apart. But seriously, can we get those proxies? Just put them out. Just for me. I'm somebody special!
Oops. Double post. Don't I look silly?
I'd like to see frogboy's response to that. Not being rude or anything, just interested to hear it.
DotA = 10 players. It's sad you can't match that properly; DotA's not exactly NASA tech.
I'm pretty sure DotA is a client/server connection again, not P2P as DG is.
client-server is O(n)
p2p is O(n^2)
the advantage of p2p is that it basically crowd-sources network infrastructure, so you can get it operating very cheaply compared to building a big enough server infrastructure to host 200,000 simultaneous users. the issue with crowd-sourcing anything (and p2p is no exception) is one of compatibility: everybody has slightly different stuff. it's incredibly difficult to build a robust enough system to get everybody talking to everybody else. this is the root cause of all the difficulties Stardock has had getting the multiplayer working.
ultimately they've paid the price for trying to do this on the cheap with p2p instead of the old-fashioned, but expensive, client-server model that has been proven to work so well.
Did they not have sufficient models on which to base their netcode?
Also, just to let you know, the reason that GPG at least has been giving for using a P2P model rather than a client-server model is the melee-heaviness of DG. Shooters can tolerate the slower reaction time of a client-server model better than a melee game can. You can evaluate the believability of this statement better than I can.
I'm just confused, I suppose, over why this game is so radically different from other games that have not had nearly the same connection issues, and whether these differences in implementation and coding were necessary.
Also, is O the number of connections and n the number of nodes in the network? Are those scaling relations for networks that you've posted at the top of your post? I know next to nothing about computer networking, but I know a little about network theory.
That's actually "Big Oh" notation; it's a kind of shorthand computer scientists use to describe the efficiency of an algorithm as it grows. Basically, it tells you how fast the resources needed grow compared to the growth of the inputs / items to be processed / what have you. O(log n) is best, O(n^2) is to be avoided, and O(2^n) is to be considered a plague upon men, never to be seen if you can possibly avoid it.
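A quick made-up illustration of how those growth classes diverge (the `growth` helper is mine; it uses powers of two so the log is exact):

```python
def growth(n):
    """Return (log2 n, n, n squared) for a power-of-two n."""
    return (n.bit_length() - 1, n, n * n)

for n in (8, 64, 512):
    log_n, linear, quadratic = growth(n)
    print(f"n={n:3d}  O(log n) ~ {log_n:2d}  O(n) = {linear:3d}  O(n^2) = {quadratic}")
```

Going from 8 to 512 items, the log term barely moves (3 to 9) while the quadratic term explodes (64 to 262,144), which is why O(n^2) networking falls apart as player counts climb.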
It's big-O notation, meaning that the growth of complexity is roughly proportional to n^2. See here:
http://en.wikipedia.org/wiki/Big_O_notation
I know this is going to be a picky post, but I can't help it.
Your complexity is a little off. It's actually C = D * N*(N-1)/2, since each pair of players only needs to establish one connection between them.
Also, my understanding from your previous post of how Battle.net works with Warcraft 3 leads me to believe it's a linear increase in complexity. Each person only needs to connect to the host... so a 4v4 game would be C = D*(N-1), which means C = 70.
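Putting both corrections together in a quick sketch (function names are mine, D = 10 is the assumed value from the original post):

```python
def p2p_complexity(n, d=10):
    """Corrected P2P formula: one link per unordered pair, C = D * N*(N-1)/2."""
    return d * n * (n - 1) // 2

def hosted_complexity(n, d=10):
    """Battle.net-style hosted game: each player connects only to the host."""
    return d * (n - 1)

print(p2p_complexity(10))    # 5v5 peer-to-peer with the /2 correction
print(hosted_complexity(8))  # the 4v4 hosted case, the C = 70 above
```

So even with the pair-counting correction, a 5v5 P2P game (C = 450) is still more than six times the complexity of an 8-player hosted one (C = 70).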
Sorry, double-replying because this caught my eye.
I'm not aware of any publicly accessible implementation of high-volume peer-to-peer gaming (I'm also not a gaming-industry guy, so I'm prepared to be wrong), so they probably don't have any models to base their code on. Knowing that something has been done, somewhere, by someone, isn't enough when you're trying to replicate it in your own code; for something this complicated, even if you know in general terms what they did and have skills in the broad area of networking, there are always a million (*) little things that the people who wrote it ran into, fixed, and then probably forgot about, all of which you'll have to stumble over and learn on your own.
In the case of Demigod it's particularly hard because there seem to be a lot of problems that only manifest at scale, or worse, only manifest on particular combinations of network configurations (meaning machines, network topology, ISPs, etc.). It sounds a lot to me like they're testing stuff in their "lab", finding a way to reproduce the symptoms customers are reporting, writing a fix for that, verifying in the lab that it works... then discovering that while they did fix a problem, there were more problems that cause exactly the same symptoms, plus their fix broke stuff for people in some other situation they didn't consider.

I think Brad's posts are too optimistic (I've thought this since the release), but they are pretty much exactly the thought process you go through when troubleshooting something big and complicated like this: "Hunting the bug... found the bug... fixed it!... test... no, not fixed." Normally you want to test well enough to catch the non-fix before you release it (because customers don't like it when you release fix after fix that doesn't solve their problem), but I'm honestly not sure how I'd go about testing this sort of networking code -- you'd have to replicate a large chunk of the Internet to do a proper job of it.
(EDIT) Just to explain myself better, say I'm writing a spreadsheet program (not that anyone does that anymore) and I wrote some bugs in it. One bug is that when you try to subtract zero from anything the program crashes. Another bug is that when you calculate the maximum value of two cells and one doesn't have a value, the program crashes. We get a bunch of calls to support saying that the program crashed and they hate us all, our software is worthless, they want their money back, our mothers are hamsters, etc. Naturally, everyone who calls just says "I was using the program like usual and it crashed!" So we try running the program a bunch of different ways and, after trying various things, we find out that calculating the maximum value of two cells, one of which doesn't have a value, crashes. We fix that and verify that hooray! the bug is fixed. We put out the patch and... it turns out everyone was trying to subtract zero from something. FAIL!
Now, the example I gave is pretty silly (a bug like that wouldn't have been released, customers would have given us enough info to track both bugs down, etc). But the harder it is to find the cause of a bug in the first place, the more likely you'll end up fixing some problems but not all of them.
That's my view of the technical end of things. I'm not a businessperson and I don't know their business, so I don't know why they chose to write their own stuff instead of buying someone else's. (And it sounds like they did try to buy code at least to an extent, but it turned out the code they bought sucked.)
(*) number pulled out of thin air, your order of magnitude may vary.
I'm sure Brad or any of the talented staff at Stardock can give you a real answer regarding the availability of off-the-shelf large-scale p2p code for game networking. I suspect the answer you would get would be something like this: "No, not really. It's all very proprietary and top secret, and to be perfectly honest nobody's ever got it working great for this kind of thing."
a lot of other p2p systems wouldn't have been good fits because they don't have to consistently maintain uplinks with latencies of just a couple hundred milliseconds (or less). your BitTorrent stuff can go at whatever pace, it doesn't matter. If it lags out for 10 minutes it just makes it take longer for your file to download, it doesn't crash your game or interfere with your user experience.
so they chose something that allowed them to release a product without committing to a large-scale server infrastructure to host games. they decided not to build their own Battle.net. If you had given them 6 years and 200 million dollars they would have built their own Battle.net clone. yeah, they didn't have that kind of time or money. given the time and money they DID have they put out the best product they thought they could. it's not quite good enough. they're well aware of this. they're very busy improving it.
just for perspective, go back in time (in your memory, if you're old enough) and try to remember what Battle.net was like back in the days of Warcraft 2 and Diablo 1. then think about how many gigantic overhauls it's received, fueled by revenues from smash hit games.
So yea, they did use some off-the-shelf stuff. The NAT facilitator from RakNet, for example. Being off the shelf, it wasn't exactly up to the task. They've spent a lot of the last 3 weeks making adjustments for unexpected failings at scale due to aspects of the RakNet code that they didn't originally know about. I'm rather amazed at how quickly they've been working. It's extraordinary.
I think if more people had a good knowledge of how time-consuming and error-prone code development is, they'd all be in awe of Stardock right now. The pace they're working at is basically unheard of. That's a lot of overtime pay. It's also an amazingly dedicated team who deserve more credit than a lot of people are giving them.
Ok, thanks... I suppose that puts it into a little bit of perspective. I do research in a lab, so I know how difficult reproducing someone else's result can be.
That said, I can't get over the fact that the majority of games at least have workable multiplayer. How many games released by a decent developer have this many issues?
My pings are now around 2 seconds. I'm just going to return this game if I can't play it within a week.
lots of fancy numbers, lots of math
still not working
I know of at least one more p2p game - COH (and, as I hear, SupCom too, but I haven't played it). And it too is having/had some similar problems, like the "could not connect to player..." (still a problem after almost 3 years), stats not recording (they fixed it recently), etc.
COH, however, is in more than 90% of cases a 1v1 or 2v2 game (or the 4vAI compstomps), while DG, like DotA, is typically 5v5.
So, AFAIK, if/when Stardock/GPG make this game playable 5v5 without requiring such relatively big upload speeds, they will be pioneers in the multiplayer gaming industry, since no one has achieved this yet.
This, of course, doesn't mean a lot to people who can't play the game as intended.
Stardock has had 1 month.
According to SD, Battle.net is only p2p insofar as a "peer" acts as the server or "host" for the game. It's really client-server and has only N-1 connections.
Also... I remember playing WC2 on KALI. Are you sure BNet was around then?
They re-released several versions of Warcraft II that were Bnet-enabled - I actually think they were officially labelled as Battle.net editions.
There is no excuse whatsoever. If a technology is available, they can simply ask or pay to use it instead of making everyone suffer. Saying they had only one month while Battle.net has been out for 10 years is nonsense.
It is THEIR fault they wanted to come up with their own thing that's not working.
No one cares if it's P2P or client/server or a mix of both; people just want to click connect and play.
They could have gone with the classic working approach THEN attempted to improve the lag and ping with their technology.
I'm one of the few who never had any problem connecting and still don't. However, I'm an IT tech and student, so I've got the knowledge to fix anything that might come up on my end. The common folk don't even know they can open ports on their router.
Which was the point: they were trying to avoid having to open ports, and it failed.
I admire the work you guys do, the replies you post on the forum, and the patches you make available so often.
However, it is frustrating to be unable to play 2vs2 or 3vs3 games (easily) when 10 or 11 years ago I was playing them easily on Battle.net with StarCraft I without touching a router, a port, or anything else... /cry
Oh well... since yesterday's patch I have been unable to play a game, but I only tried 5 different games and then got bored. Most of the time I am unable to connect to 1 or 2 players.
Keep it up, guys!!! The game is damn fun....
I was able to play any online game after the beta patch. Before the beta patch I would connect to no players at all, after it... almost 100%. The patch after that, still worked great, now the May 14th one comes and I can always connect to the host but never to ANYONE else.
When I host a game I can connect to EVERY player but no one can connect to me.
This does not change if I am in the DMZ of my router's firewall, with Windows firewall on/off at both public and domain levels (Windows 7), if I plug straight into my modem (Qwest ADSL2), or if I forward a million ports, mod game files, .dlls, etc...
I have re-installed Demigod over 20 times and re-formatted my computer to 3 different OSes (XP 32-bit, Vista 64-bit, and Windows 7 RC1 [7100]) several times now. Through all the rebuilds nothing allowed me to connect to other players, and then after my 4th OS rebuild I got the beta patch and it worked. So, since I hate Vista, I re-installed Windows 7 and LEFT MY FIREWALLS ON with my system and router in default configs, and was still able to play fine. Then the May 14th patch came out and everything broke again.
The only thing that ever let me play online was the beta patch, the one after it and that's it. No amount of personal user config affects my ability to play games online at all. It's all Demigod software based at this point.
On a side note, VPN tools and apps such as Hamachi and GameRanger work 100% of the time for me with every and any patch. I play with other people who have purchased their copies of the game to support Stardock and Gas Powered Games as a company. However, I would like the ability to play online again.
Please let me know if you need any more information such as dxdiags or what-not.
Thank you,
Shantarr Dal'rae
When will proxies go live? (Like fer real?)
What is the complexity if 5 people play against 5 AI?
lulwhut? What a useless bump QQ