Hi all, I'm learning C# and decided this would be a good project to do it with. From analyzing the code so far, ServUO uses a fairly complex serialization system to save and load the world state.

At first I was thinking of using an object-oriented database like db4o:
https://en.wikipedia.org/wiki/Db4o#One-line-of-code_database

The only problem is it's no longer maintained, but I might try it anyway if it's faster. What I'll probably do instead is write code to store everything in a SQL database so people can manage their own DBs; it's also more newbie-friendly (plus I'm already a SQL DBA).

I was also thinking about something like Cassandra because it's more scalable, although I doubt any one shard is going to be that huge. Another thing I want to achieve is no waiting for the world to save, so the world is just continually persistent.

I've written some simple classes to save the world to a database, which doesn't seem too hard; it's just World.Items, World.Mobiles, etc. I haven't compiled and tested it yet, though, because I'm noticing everything has its own Serialize and Deserialize methods. Every monster and NPC has them coded in, and I'm not sure if I can redirect the flow of code to avoid them, maybe by changing the base serialization classes, but at that point I might as well rewrite a large chunk of ServUO.

I'm watching Alan's (aka itwouldbewise) videos on YouTube, and I saw some of his Twitch streams; they're really helpful. But maybe some other devs around here know some things and could give me an idea of how much work would be needed to just wrap in a SQL driver and get the server saving to that. I figure without the server having to serialize everything, it should save a lot faster. If I eventually want a persistent world state, I could use the delta ticks and maybe some compression, and just load the deltas into a memory queue and send them off to the DB (it looks like the code already does some of this).

If I ever get this done, I'll upload it to Git and submit a pull request, or fork it, or write a converter from the serialized saves to the SQL DB. Thanks for any feedback.
 
A couple of things that might help give you some ideas:
Look at the MyRunUO folder (if it's still called that in ServUO) in Scripts -> Engines. It contains a lot of character-export code for publishing character info to a website. I don't think any documentation still exists (if any ever did) on how to use MyRunUO, but at least the code is there to give you some ideas.

The other thing you should look at is the automated donation store (I'm not sure if it ever got posted here, but this is the RunUO link: http://www.runuo.com/community/threads/runuo-2-0-fully-automated-donation-store-system.98542/).
That package allows players who donate to receive their items in-game without needing an Admin to get involved.

Between these two things, you should be well on your way to getting world info saved to a DB. Your first save would likely take quite some time (if you've got an existing playerbase, etc.), but after that, if the only info sent to the DB is what has changed, it wouldn't be too bad.
 
Great suggestions from @tass23 above. I also seem to remember a partially finished DB saving method being released, I think on RunUO. It may have been part of a full set of server code posted when a shard closed, or maybe as a separate project. In either case, I remember seeing the code at some point if my slowly failing memory still serves me. If anyone can confirm that, or knows where it is, it might be helpful as well.
 
I found the code in MyRunUO, and it looks like it's just doing manual queries against a SQL DB. I believe with LINQ you can model a DB so you can store C# objects directly, which will make things easier than typing out a SQL query for everything that gets saved. I have a few ideas for implementing the system and can probably reuse the MyRunUO code as well. I'll have to study LINQ a bit more and see whether changes to the DB schema can be made easily; if not, I'll need some workaround, because later updates to the code that require new save data won't work otherwise.

I'm thinking worst case, just have a column for serialized data: anything extra that people want to add can go into that column without having to change the DB schema. I know that when other enterprise applications do a major upgrade, they just create layers of views in the database on top of data from older versions of the app. Right now in ServUO, the versions are hard-coded into case statements when stuff gets deserialized. Anyway, if I get something working, I'll upload it to Git so other people can play around with it.
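The "layers of views" idea could look something like this sketch. Table and column names here are hypothetical, not ServUO's actual schema; the point is that a later version adds a nullable column and a view papers over the difference for old rows:

```sql
-- Hypothetical v1 table with a catch-all column for serialized data.
CREATE TABLE items (
    serial INT PRIMARY KEY,
    type   VARCHAR(255) NOT NULL,
    data   BLOB                    -- extra serialized state, schema-free
);

-- A later update needs a 'hue' column; add it as nullable so existing
-- rows still load, then expose a view that supplies a default.
ALTER TABLE items ADD COLUMN hue INT NULL;

CREATE VIEW items_v2 AS
SELECT serial, type, data, COALESCE(hue, 0) AS hue
FROM items;
```

This mirrors what the thread describes for enterprise upgrades: old data stays untouched, and each app version reads through its own view.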
 
Converting from the flat-file binary system may be as simple as saving things as binary blobs in the database.

Serialization is a fundamental practice of RunUO; the Serialize/Deserialize methods of objects don't only save/load data, they sometimes provide other functionality at run-time, like verifying the data.

However you do database saves, you will either have to pass a custom GenericWriter or GenericReader to the Serialize/Deserialize methods; this can work to your advantage.

Instead of passing a FileStream-based BinaryWriter implementation of GenericWriter to Serialize(GenericWriter writer), you may pass a MemoryStream-based BinaryWriter and then use its buffer in a database query.

In theory, you'd only need one table with three columns to store your data while maintaining compatibility:

'table_entities'
'serial' => INT(11)
'type' => VARCHAR(255)
'data' => BLOB

Where 'serial' is Item.Serial/Mobile.Serial, 'type' is the fully-qualified Type name of the entity and 'data' is the buffer of the custom GenericWriter you pass to Item.Serialize/Mobile.Serialize.

Deserialization would involve reconstructing the 'type' with the 'serial' as a constructor argument, then calling Deserialize with a custom GenericReader that has the 'data' buffered.
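A minimal sketch of that serialize-to-buffer round trip. `ToyItem` is a stand-in, not ServUO's actual Item class; in ServUO you would pass a MemoryStream-backed GenericWriter to Item.Serialize and use the stream's buffer as the 'data' blob:

```csharp
using System;
using System.IO;

// Stand-in for a ServUO entity with versioned Serialize/Deserialize.
class ToyItem
{
    public int Serial;
    public string Name = "";

    public void Serialize(BinaryWriter writer)
    {
        writer.Write(0);        // version, as ServUO serializers do
        writer.Write(Serial);
        writer.Write(Name);
    }

    public void Deserialize(BinaryReader reader)
    {
        int version = reader.ReadInt32();
        Serial = reader.ReadInt32();
        Name = reader.ReadString();
    }
}

class Program
{
    static void Main()
    {
        var item = new ToyItem { Serial = 0x40000001, Name = "a kasa" };

        // Serialize to memory instead of a file; the buffer becomes the blob.
        byte[] blob;
        using (var ms = new MemoryStream())
        using (var writer = new BinaryWriter(ms))
        {
            item.Serialize(writer);
            writer.Flush();
            blob = ms.ToArray();   // => value for the 'data' BLOB column
        }

        // Deserialization: reconstruct the type, then replay the blob.
        var loaded = new ToyItem();
        using (var reader = new BinaryReader(new MemoryStream(blob)))
            loaded.Deserialize(reader);

        Console.WriteLine(loaded.Name); // prints "a kasa"
    }
}
```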

Keeping it in sync is the main issue: you won't be able to do the whole "world saves every N minutes" approach and have it dump everything all at once, as that will destroy your shard with millions of database requests.
This is usually the point at which people give up implementing this feature.

IF I were to write this feature for my own shard, I would leverage the pre-existing Item.DeltaQueue and Mobile.DeltaQueue processing to push updates to the database on a need-only basis. (Entities use their respective DeltaQueues to process critical updates for the object, such as name, hue, map and location changes).

Whenever an entity is queued for a delta update, I would use a MemoryStream-based GenericWriter implementation to call Serialize on the entity at that point, then use the writer's stream as the 'blob' for my database update query.
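A self-contained sketch of that need-only flushing idea. `DeltaStore`, `MarkDirty`, and the Dictionary standing in for the database are all hypothetical; ServUO's real DeltaQueue plumbing and an actual UPSERT batch would replace them:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Entity with a serialize-to-blob helper, standing in for Item/Mobile.
class Entity
{
    public int Serial;
    public int Hue;

    public byte[] ToBlob()
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(0);       // version
            w.Write(Hue);
            w.Flush();
            return ms.ToArray();
        }
    }
}

class DeltaStore
{
    private readonly HashSet<Entity> _dirty = new HashSet<Entity>();
    public readonly Dictionary<int, byte[]> Rows = new Dictionary<int, byte[]>();

    // Would be called from the same place ProcessDelta runs.
    public void MarkDirty(Entity e) => _dirty.Add(e);

    // Flush only what changed; in real code this is a batched UPSERT.
    public int Flush()
    {
        int flushed = 0;
        foreach (var e in _dirty)
        {
            Rows[e.Serial] = e.ToBlob();
            flushed++;
        }
        _dirty.Clear();
        return flushed;
    }
}

class Program
{
    static void Main()
    {
        var store = new DeltaStore();
        var a = new Entity { Serial = 1, Hue = 33 };
        var b = new Entity { Serial = 2, Hue = 0 };

        store.MarkDirty(a);                            // only 'a' changed this tick
        Console.WriteLine(store.Flush());              // prints 1
        Console.WriteLine(store.Rows.ContainsKey(2));  // prints False
    }
}
```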


The reason the server pauses at the time of World Save is because of consistency.
Although the chances are low, it would still be possible to get things like item parent-tree conflicts.

For example's sake only;

Player puts an item in their pack.
World Save starts.
World Save saves data for the item (saving its parent as the player's pack).
Player takes item out of their pack and places it on the floor.
World Save saves data for the player's pack.
World Save ends.
Shard reboots.
World Load starts.
World Load puts the item back in the player's pack.
World Load loads data for the player's pack; the pack now has twice the weight of the item.


This isn't going to be an easy task, there are a lot of reasons why it hasn't been done mainstream with RunUO;
If I didn't cover them above, completing the feature and finding out yourself surely will :p
The .NET drivers used to communicate with *SQL services are often inefficient.
The ODBC driver for MySQL has some serious issues with performance over time with transactions for example.


Good luck!
 
OSI does this (saves to a DB). Back in the day, though, their saves were only done roughly once a day, right before the servers rebooted, and there were many times I lost items because of it. I don't know if this has changed, but whether you're saving to a DB and/or just doing built-in saves, once a day is probably not good practice for RunUO/ServUO.

I have noticed that saving more often causes issues for people trying to play (I think the default was every 5 minutes), but those world saves were generally very short (depending on item count, etc.). If you spaced your saves out (say, every 15-30 minutes), the saves might take a tad longer but offered more time of "unobstructed" play between world-save freezes. There are a lot of things that can cut down save times overall, but the one most people seem reluctant to do is FREEZE their maps when they add custom structures. That alone can cut your save times dramatically.

I think if someone were to pick this up as a project for release, a DB save would be good perhaps twice a day (every 12 hours), with world saves every 30-45 minutes. But you wouldn't want a world save and a DB save happening at the same time, as that would be a lot of I/O for most hosted servers (especially if you're on a VPS).
 

Yeah, I was thinking this would be the simplest way of implementing it, but I feel like the whole serialization system can be bypassed, although it might take restructuring a lot of code; at least I'll try to bypass serialization for most of it and keep part of the data serialized. One advantage of having the data in structured SQL tables is that you can optimize it: doing specific queries on the fly would greatly reduce RAM usage, because LINQ will only query items as it runs into them (unless you specify otherwise), so you could have garbage all over the world but none of it would get loaded into RAM until someone ran by it. Another benefit of structured data is that you can do cool analysis of the world external to the server application; you could have a badass web interface for admins and GMs. One part of my vision is to have huge servers with tons of NPCs having virtual wars. I think it's doable.


I agree, saving the whole world every N minutes is just inefficient, and I was thinking about using the delta queue to stream updates to the database from a memory buffer.


Yeah, I remember back in the good ole UO days with the server lines and such, and I think people used them to dupe items. Consistency bugs like duping are a huge no-no, so the world is going to need some consistency logic. One thing I was thinking about: if the delta-queue buffer stream to the DB gets too big, just stop buffering until the buffer is flushed, and set a checkpoint to send to the DB. That would make for a short period where nothing is being saved, so if the server crashed while making the checkpoint, it would lose some data; but the current save method with the serialization is pretty quick, and if RAM isn't too tight, I could just create a separate buffer that waits for the checkpoint to finish. If the server crashes, it has a checkpoint to fall back on if there is corruption somewhere since the last checkpoint. I know optimizing this will probably require some painful attention to detail, but it's the best way to learn.
 

Yeah, the I/O might be an issue for VPS servers. I would need a buffer that keeps track of the deltas and sends them over to the DB every so often. I'm trying to avoid whole-world saves because they seem inefficient to me. But one step at a time.
 
If you wanted to do a web interface using the data, it would be better to serve it as JSON from an API in real time ( http://play.uofreedom.com/shard ).

I wouldn't worry about the delta queues being too large for queries;
If you use a range of entries from the queue, you can send batch queries.
I found that with MySQL, you can create a batch of up to 10,000 queries in a single transaction, which will typically complete in under 1 second against a localhost MySQL server.
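One way to batch like that, sketched as pure string-building. Real code would bind the `@type.../@data...` parameters on a command and wrap the statements in a transaction; the table name comes from the schema suggested earlier, and everything else here is illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    // Group rows into multi-row INSERT statements so one transaction
    // carries thousands of rows instead of one round-trip per row.
    static List<string> BuildBatches(IReadOnlyList<int> serials, int batchSize)
    {
        var statements = new List<string>();
        for (int i = 0; i < serials.Count; i += batchSize)
        {
            var chunk = serials.Skip(i).Take(batchSize)
                .Select(s => $"({s}, @type{s}, @data{s})");
            statements.Add(
                "INSERT INTO table_entities (serial, type, data) VALUES " +
                string.Join(",", chunk) +
                " ON DUPLICATE KEY UPDATE type=VALUES(type), data=VALUES(data);");
        }
        return statements;
    }

    static void Main()
    {
        var serials = Enumerable.Range(1, 25).ToList();
        var batches = BuildBatches(serials, 10);
        Console.WriteLine(batches.Count);  // prints 3 (10 + 10 + 5 rows)
    }
}
```

MySQL's `ON DUPLICATE KEY UPDATE` makes each batch an upsert, so the same statement shape works for both first saves and delta updates.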

As for not loading trash into the world until people run past it, think about the process that would involve:
Whenever a character moves, you need to know what to load, so you have to store some data somewhere and check it; if you don't store that data in memory, you have to query the database. That doesn't sound nearly as optimal as just having the object sitting in the world doing nothing. The other caveat to lazy-loading objects is that other systems that rely on those objects would break down.
 
Yeah, that makes sense. 10k queries or 10k rows? I guess it's all relative to what's being done on the DB. As far as lazy loading goes, it's built into the underlying LINQ framework. What I would do is make an abstract item class that has some basic information about items, such as their location, position, and type. That way the world can load the items at start, but if something comes up that requires more information, like armor stats, it dynamically queries the related armor table when the inherited armor class gets used. There are upsides and downsides to lazy loading, of course; I'll have to implement it and do some testing, but so far I'm still learning the structure of ServUO and C#.

I've gone down the rabbit hole and started digging from World.Load() and World.Save(); I went through the serialize code, pulled all the variables from it, and started designing the DB tables. I'm not sure if clothing should have its own table, because it seems identical to armor except for some clothing attributes. It's a lot of work, but I'm bored between jobs right now, so it keeps my mind sharp lol.
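The abstract-item idea above could be sketched with `Lazy<T>`, so the heavy details are only queried on first access. The loader delegate stands in for a real LINQ query by serial; `ItemStub` and `ArmorDetails` are hypothetical names, not ServUO types:

```csharp
using System;

// Light "header" row: always loaded at world start.
class ItemStub
{
    public int Serial;
    public string Type = "";
    private readonly Lazy<ArmorDetails> _details;

    public ItemStub(int serial, string type, Func<int, ArmorDetails> loader)
    {
        Serial = serial;
        Type = type;
        // The loader (e.g. a DB query by serial) runs only on first access.
        _details = new Lazy<ArmorDetails>(() => loader(serial));
    }

    public ArmorDetails Details => _details.Value;
}

// Heavy per-type data living in its own table.
class ArmorDetails
{
    public int PhysicalResist;
}

class Program
{
    static void Main()
    {
        int queries = 0;
        var stub = new ItemStub(0x4001, "BaseArmor", serial =>
        {
            queries++;                       // pretend this hits the DB
            return new ArmorDetails { PhysicalResist = 10 };
        });

        Console.WriteLine(queries);                     // prints 0: nothing loaded yet
        Console.WriteLine(stub.Details.PhysicalResist); // prints 10
        Console.WriteLine(queries);                     // prints 1: loaded once
        _ = stub.Details;                               // cached; no second query
        Console.WriteLine(queries);                     // prints 1
    }
}
```

This also shows the caveat raised above: anything touching `Details` implicitly pays a query cost the first time, which other systems may not expect.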
 

Attachments: db.JPG (165.1 KB)
You could automatically generate your tables by using reflection to take a snapshot of all the readable/writable properties, then inferring their database storage types to determine the columns that are needed.
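A rough sketch of that reflection approach. The type-to-column mapping, table naming, and `ToyMobile` are assumptions for illustration, not ServUO code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

// Stand-in for a serializable game type.
class ToyMobile
{
    public int Serial { get; set; }
    public string Name { get; set; } = "";
    public bool Female { get; set; }
}

class Program
{
    // Map CLR property types to SQL column types (extend as needed).
    static readonly Dictionary<Type, string> SqlTypes = new Dictionary<Type, string>
    {
        { typeof(int), "INT" },
        { typeof(string), "VARCHAR(255)" },
        { typeof(bool), "TINYINT(1)" },
    };

    // Build a CREATE TABLE statement from public readable+writable properties.
    static string CreateTableFor(Type t)
    {
        var cols = t.GetProperties(BindingFlags.Public | BindingFlags.Instance)
            .Where(p => p.CanRead && p.CanWrite && SqlTypes.ContainsKey(p.PropertyType))
            .Select(p => $"{p.Name.ToLowerInvariant()} {SqlTypes[p.PropertyType]}");
        return $"CREATE TABLE {t.Name.ToLowerInvariant()} ({string.Join(", ", cols)});";
    }

    static void Main()
    {
        Console.WriteLine(CreateTableFor(typeof(ToyMobile)));
        // e.g.: CREATE TABLE toymobile (serial INT, name VARCHAR(255), female TINYINT(1));
    }
}
```

Properties whose types aren't in the map (lists, nested objects) would fall through to the catch-all serialized 'data' column discussed elsewhere in the thread.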

You will likely need a table for each type of mobile and item that exists; objects that inherit BaseClothing will sometimes have additional serialization requirements.

I'm not sure how you will get around things like special code being executed in a Deserialize method;
as I mentioned, the method isn't just used to load the data state, it's used for all sorts of things, like starting timers, de-fragmenting lists, etc.

I'm really interested to see what you come up with :D
 

Yep, I use Deserialize to remove errant properties on load for the Humility Hunt for the Humility Virtue, though it could be done elsewhere too.
 
Well, after much debugging, hitting my head, and cursing Microsoft, I managed to get the world to write the mobiles to the DB. Well, most properties of the mobiles; there are some things I still need to code for, because it doesn't like integers being null. Saving is really slow, though, but I can probably do some more optimizing and change the SQL behind the LINQ insert/update code. I basically had to write my own version of the Serialize method, and I did end up serializing things like item lists. The next step after getting all the mobile data saved is getting items, guilds, and custom data to save, then finally loading everything back in lol. One step at a time I guess.
 

Attachments: save.JPG (167.2 KB)
I'm glad you are making progress on this! Have you given any thought to retaining backward-compatibility to be able to load world save data for existing saves? That will be crucial for existing shards.
 

Yes, I'm leaving all the existing code in for the most part, just adding some if statements to switch to my code when it's turned on. I'll be sure to add a command like SaveToSQL that works with any save/load method; it would convert the current world to the SQL DB, and you could load from the SQL DB after that.
 
Just wanted to post a quick update on my progress. It's been slow, but I'm learning C# as I go along. So far I've managed to get the mobiles saved with a bulk insert, which is much faster than what LINQ was doing: it used to take 6 seconds to save a few thousand mobiles, now it takes 0.4 seconds. Not as fast as I want, but that leads to my next task: saving to a buffer in memory first, then streaming the data over to the DB. That should lead to very fast world saves. I'm also thinking of never overwriting saves lol; that way we can do some detailed stats. If it takes up too much space, I can set a job to delete the oldest saves based on a size or time limit.
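For SQL Server, the bulk-insert speedup usually comes from staging rows in a DataTable and handing the whole batch to `SqlBulkCopy`. A sketch: the runnable part below only builds the staging table (there's no live server here), and the commented section shows the bulk-copy call; table and column names follow the schema suggested earlier in the thread:

```csharp
using System;
using System.Data;
// using System.Data.SqlClient; // SqlBulkCopy lives here (needs a SQL Server)

class Program
{
    static void Main()
    {
        // Stage rows in memory first, then hand the whole batch to the server.
        var table = new DataTable("mobiles");
        table.Columns.Add("serial", typeof(int));
        table.Columns.Add("type", typeof(string));
        table.Columns.Add("data", typeof(byte[]));

        for (int i = 0; i < 5000; i++)
            table.Rows.Add(i, "Server.Mobiles.Orc", new byte[] { 0 });

        Console.WriteLine(table.Rows.Count); // prints 5000

        // With a live connection (not runnable here), the bulk load would be:
        // using (var bulk = new SqlBulkCopy(connectionString))
        // {
        //     bulk.DestinationTableName = "mobiles";
        //     bulk.BatchSize = 5000;
        //     bulk.WriteToServer(table);
        // }
    }
}
```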
 
The way saves currently work is by writing all the data to a memory buffer, then dumping it all to a file in one operation.
The bottle-neck comes from the CPU when it comes to world saves, actual serialization of all the items and mobiles takes milliseconds, but dumping that memory to file is the expensive part.
With a database, it's technically no different: the database software/controller still has to take all that data and write it to disc. Database software is, for all intents and purposes, an API that manages a data structure stored on disc. With this definition in mind, it's not a stretch to view RunUO as being its own database software.
The benefit to using an outside database is that you can dump all of your data to the database API and have that API worry about buffering and writing, while your application continues uninterrupted, which I believe is what you're aiming for. I'm not sure that approach is much different from having world saves run in a background thread, but that's neither here nor there at this point.

So far, I'm impressed and pretty interested to see how it goes :D
 


Well, let me explain what I mean, or what I'm trying to do, with the buffer decreasing world-save times. From what I understand, the reason the world pauses during saves is to prevent inconsistencies between multiple threads and a big, ever-changing world. So the world pauses, saves, and lets everyone go on their merry way. Really, though, we don't need everyone to wait for the world to flush to disk before resuming. We can pause and have everything copied (which it does anyway now), then resume the world and deal with the buffered copy as we send it to disk. That should make for pretty much instant world saves, fast enough to not even drop any packets.

I also have some ideas for other tweaks: when sending everything to the DB, I can load up three threads and connections for mobiles, items, and custom data. This should take better advantage of multi-core systems, and since the data sets are independent of each other, there are no worries about locks. Yes, RunUO is a database system in itself; the whole serialization and persistence code is a pretty large chunk of the overall codebase. Moving it over to SQL will just make it that much easier for shard owners to manage their server, backups, etc.
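A sketch of the pause-copy-resume idea: freeze just long enough to snapshot, then flush on background tasks, one per independent data set. The three lists and `WriteToDb` are stand-ins for the real mobile/item/custom persistence:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static int _written;

    // Pretend DB write; in reality a batched insert per category/connection.
    static void WriteToDb(string category, IReadOnlyList<byte[]> blobs)
    {
        foreach (var _ in blobs)
            Interlocked.Increment(ref _written);
    }

    static void Main()
    {
        // World state; normally World.Mobiles, World.Items, custom engines.
        var mobiles = Enumerable.Repeat(new byte[] { 1 }, 100).ToList();
        var items   = Enumerable.Repeat(new byte[] { 2 }, 200).ToList();
        var custom  = Enumerable.Repeat(new byte[] { 3 }, 50).ToList();

        // --- "World pause" starts here: take cheap in-memory snapshots. ---
        var mobSnap    = mobiles.ToList();
        var itemSnap   = items.ToList();
        var customSnap = custom.ToList();
        // --- Resume the world immediately; players keep playing. ---

        // Flush each independent data set on its own task/connection.
        Task.WaitAll(
            Task.Run(() => WriteToDb("mobiles", mobSnap)),
            Task.Run(() => WriteToDb("items", itemSnap)),
            Task.Run(() => WriteToDb("custom", customSnap)));

        Console.WriteLine(_written); // prints 350
    }
}
```

The snapshot keeps the consistency guarantee (nothing mutates mid-save) while the expensive I/O happens off the main game loop.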
 
Our current persistence strategies do support background asynchronous writing of the save file. However there is a complex system of interlocking flags that enable this feature, which will never all be true without core edits. I think the primary risk with enabling this feature is the fact that it hasn't been tested in lord knows how long.
 
Well, just a short update. I have managed to get mobiles to save to the DB and then load into the world when the server starts. As my C# skills improve, I'm able to code faster. I also have items writing to the DB now, but I'm stuck on getting them to load. I'm too tired to keep debugging right now and will do more tomorrow. What's happening is I'm getting stuck loading the types into the constructors. It seems like it saves and loads the list of types, but I'm getting items saving a type-reference index value of 7000, and when I check the list of types saved there are only about 100, so I'm not sure where it's getting that value; it's causing out-of-bounds errors, but I'll debug it tomorrow.
 
I can assure you there are way more than 100 types :)
lol, well I only had about 100 item types loaded at the time.

I managed to get the type index fixed by saving the item types to their own table in the DB. Now I just realized how fucked I am, lol. When items get serialized/deserialized, each calls its own Serialize method and then passes down the chain. For example, Serialize on a Kasa goes to the Kasa's own Serialize in Scripts (under hats), which writes what it needs to save, then passes to the base hat's Serialize, which writes its own stuff, then to BaseClothing, which writes its own properties, then to Item's Serialize, which writes the rest. I might implement some of the base Serialize classes in the DB, but I don't really want to go to every item and write another serialize function, and it's not a good idea to give each item type its own table in the database anyway. So I'm thinking of doing what someone mentioned earlier in the thread: creating a generic writer, letting the item do its serialize stuff into a memory stream, and writing that to an extra "data" column in the DB, so the extra serialized data lives in there. It does kind of defeat the purpose of having a DB, but it's a shortcut in the meantime. I might be able to create a view in the DB with its own deserialize procedure anyway, so it might work out later.
 
You're doing a great job, even if for any reason you don't finish it (I think you will though), you will have learned so much by the end.

If you do decide to store all the data as raw binary in a single column, you could always try your hand at writing a WinForms app to edit save files.
You can do this by referencing ServUO.exe and Scripts.CS.dll in your WinForms project, which gives you access to the RunUO API and the ability to create mobiles and items and call Serialize/Deserialize whenever you want. However, you will first have to manually configure some things that would normally happen when ServUO.exe starts (such as assigning World.Items/World.Mobiles a new Dictionary), or you will have issues. It would be nice to have a SaveEditor.exe capable of editing the physical files and/or your database tables :D
 
Thanks! I've managed to get the world to load and save mobiles/items from the DB finally, woot! So a large part of the code is done.

As far as serialization goes, I've added the data column to both items and mobiles, but the base Mobile/Item classes aren't serialized; they write and pull directly from the DB. The rest is still serialized, because the code for that is over in Scripts and I don't want to modify it just yet, though I will later on. It will take a lot of work, because I'd have to code for all the base classes and create tables for each of them; I might find a lazy way to do it with reflection. I'm also looking into using JSON for the data that needs to be serialized, so it's easy for web and other apps to hook in and read the data, as well as being more human-readable.
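A quick sketch of the JSON idea using System.Text.Json (available on modern .NET; a ServUO build on an older framework would need Json.NET instead). The `ExtraData` shape is hypothetical, just a bag of "extra" state that would otherwise go into the binary data column:

```csharp
using System;
using System.Text.Json;

// Hypothetical extra per-item state; as JSON it stays readable by web apps.
class ExtraData
{
    public int Version { get; set; }
    public int Hue { get; set; }
    public string CrafterName { get; set; } = "";
}

class Program
{
    static void Main()
    {
        var extra = new ExtraData { Version = 1, Hue = 1153, CrafterName = "Bob" };

        // Serialize to a JSON string destined for a TEXT/JSON column.
        string json = JsonSerializer.Serialize(extra);
        Console.WriteLine(json);
        // {"Version":1,"Hue":1153,"CrafterName":"Bob"}

        // Round-trip back into an object on load.
        var back = JsonSerializer.Deserialize<ExtraData>(json);
        Console.WriteLine(back.Hue); // prints 1153
    }
}
```

The trade-off versus binary blobs is size and speed for readability: web tools and SQL JSON functions can query this column directly.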

Right now I'm going to do some load testing, because reading/writing to the DB is ~10x slower than to a flat file, although I've yet to really start optimizing. I'm running an XmlSpawner pass now; it takes forever, but once it finishes I'm going to save the world to the DB, load it back, and do some benchmarking.

As far as creating a save editor it should be pretty easy to create an app or even a web app to do it. One step at a time lol.
 
Just some updates: I have added some threading and buffering to the code to increase performance, but it's still slower than the regular save/load. With a fully populated world, save times are around 2-3 seconds. Still way too slow to turn off world-save pauses, unfortunately, so I'll have to come up with some other ideas to eliminate them. World loads take around 6-7 seconds even with multi-threading at 100% CPU. If you look at the attachment, you can see around 3 seconds spent loading data from the db and then 3 seconds spent just processing it. I might look more into lazy loading, so things just get queried and loaded on the fly. I'll mess around with performance more later. The next thing I'm going to work on is saving/loading guild data. I have designed the tables already, so the next part is to write the code.
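A minimal sketch of the buffer-then-flush idea (the class and names are hypothetical): the world pause only covers copying rows into a buffer, and the actual database writes happen on a background task while the game resumes:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical buffered saver: snapshot fast, flush slow, off-thread.
public class BufferedSaver
{
    // Stands in for the database; real code would issue INSERT/UPDATE here.
    public List<string> FlushedRows = new List<string>();

    public Task Save(IEnumerable<string> liveRows)
    {
        // Fast part, done under the world-save pause: copy references only.
        var buffer = new List<string>(liveRows);

        // Slow part, done after the pause: write the buffer out.
        return Task.Run(() =>
        {
            foreach (var row in buffer)
                FlushedRows.Add(row);
        });
    }
}
```

In real code the flush target would need synchronization (or one dedicated writer thread), since a second save could start while the previous flush is still running.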

At some point I might setup a public test shard for people to log on and play around. Then I can setup a simple website with stats or functions for people to edit data via the site.
 

Attachments

  • sqlload.jpg (415.8 KB)
I have added guild data save/load functionality, but I need to test it more to make sure it works. I have also optimized the save so the buffers are created in ~50 ms, which unfortunately is still not fast enough to remove save-state pauses, but is still pretty damn fast. I'll have to look into C# further to see if I can do some sort of differential save to speed it up. I could try multi-threading, but I'm afraid it would just slow things down. Next to add is custom data, which is pretty straightforward, and then the account info; after that I think most if not all data will be saving/loading to the SQL db.

EDIT: Well, 50 ms isn't really too slow to remove save pauses; it would just add 50 ms of lag during saves, which isn't too bad. But I'll leave the pause in there for now.
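The differential idea could be sketched like this, with a hypothetical dirty-set: entities mark themselves dirty on change, and a save only pushes those serials instead of rewriting the whole world:

```csharp
using System.Collections.Generic;

// Hypothetical dirty tracker for differential saves: only serials touched
// since the last save need a SQL UPDATE.
public class DirtyTracker
{
    private readonly HashSet<int> _dirty = new HashSet<int>();

    // Called from property setters / delta notifications on change.
    public void MarkDirty(int serial) => _dirty.Add(serial);

    // Hand the batch to the saver and clear the set, so the next save
    // starts from a clean slate.
    public List<int> TakeDirty()
    {
        var batch = new List<int>(_dirty);
        _dirty.Clear();
        return batch;
    }
}
```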
 
This is a very interesting project. I was wondering if you were familiar with NoSQL / Bigtable types of databases?

If not, you may want to have a quick look:

https://en.wikipedia.org/wiki/NoSQL
http://nosql-database.org/

NoSQL databases sacrifice relational consistency for speed. Considering that the bulk of the game data is a set of properties associated with a unique serial ID, it might be preferable to go with NoSQL.

Don't know if this is useful but I thought I'd share this info in case.
 
When I initially started this project I was looking at NoSQL solutions like object-oriented databases and others like Cassandra, but I ended up deciding against them for several reasons. Mainly because I was a SQL DBA and that's what I'm familiar with, but also because it would make things like reporting and data analysis with external applications more difficult. It's possible to export a NoSQL db to a relational one, but that seemed redundant to me. The UO database isn't all that big either (though I'd like to hear from some larger shard owners how much space their Saves directory takes); a fully populated world is only around 300-400 MB, and it only takes about 2 seconds to query the data. An object-oriented database would probably be as fast as the current serialization method, but with buffering and some optimizations the SQL db works pretty well so far. There are still more techniques that can be implemented to query and update data on the fly and pretty much remove load times altogether. If some shards get large enough to reach 100 GB+ of data, then it might be worth looking into. It would be interesting to have something where every action gets recorded to a db, to the point where you could replay all actions on the server for a given time period, although it would take a lot of work and I wasn't that ambitious lol.
 
Makes sense, if you have more complex plans for the shard data. Lots of interesting potential for a web-based companion app to go along with the game.

Indeed, if you could save all the actions and changes to the db, then you'd be able to do away with saves altogether. As for load time, it's a lesser issue, as it only happens when the server boots up.
 
OK, so I finished coding the custom savedata save/load. I have set up a test server at 52.3.247.37 if anyone wants to log in and test it out. If it crashes, I can check the logs and try to fix bugs. Eventually I'll get around to making a SQL script to create the db, plus some console commands to save/load to the SQL db so people can easily convert their data over. Eventually I'll also go through the serialization methods in the Scripts folder, add the base methods to the db, and code them for storing/loading to SQL; right now that extra data is just being serialized and appended to a column at the end of the mob/item/savedata. I'll try to add a little sample website soon so you can query the db and see all the data however you want. I know there's some cool stuff I'll probably try out for data visualization at https://d3js.org/ . Once I get some bugs out and make it easier to deploy, I'll write up a guide on how to set up a shard using a SQL db.

I still need to design something for schema changes/updates using views in the db, and also a better backup/archive strategy. One thing at a time lol.
 
I added dynamic database creation, so if a db doesn't exist it is auto-created with the appropriate tables and column mappings. It should be fairly easy to set up a new shard: all you need is a local SQL database server, make sure ServUO has permissions to it, and off you go. I'll implement the console commands shortly so people can convert their regular world save to the SQL db. The world-save pauses are pretty fast and not even noticeable, because most of the work is done in the background, which is nice. I'll also have to add something to a configuration file in the future so ServUO knows whether to load from a db, plus something for setting the connection string if you want to use a remote db with authentication.
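A sketch of the auto-creation step, assuming MSSQL and hypothetical table/column names: each CREATE TABLE is wrapped in an existence guard, so running it against an already-bootstrapped database is a no-op:

```csharp
// Hypothetical schema bootstrapper: emits guarded DDL that the startup
// code would execute once per table via SqlCommand.
public static class Schema
{
    public static string CreateItemsTable(string table) =>
        "IF OBJECT_ID(N'" + table + "', N'U') IS NULL " +
        "CREATE TABLE " + table + " (" +
        "Serial INT PRIMARY KEY, " +   // item serial as the row key
        "TypeId INT NOT NULL, " +      // index into the item-types table
        "Data VARBINARY(MAX) NULL)";   // leftover serialized blob
}
```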

I have done some testing and there is an issue with item save/load. Something isn't being saved or initialized properly, the smart cleanup system deletes certain items, and eventually you run into a problem where some water elemental somewhere doesn't have a backpack, which causes a null reference error. So I'll have to fix that and do more testing before it's considered stable.
 
Sorry for the lack of updates; I've been lazy, debugging this code is hard, and I gave up for a bit lol. Anyway, I've been researching how other MMOs do persistence, and it seems relational databases aren't the best fit because games tend to have a lot of hierarchical data (although plenty still use SQL, like EVE Online, though they keep everything in RAM). I started noticing this when loading the world from the db took twice as long as saving it, and it's because of all the nesting that goes on. Say I have a bag within a bag within a chest in my bank that contains bone armor. That item ends up being related to several different things, and the server has to process all of that when loading the object. One way of handling this that I've been reading about is including a column that lists the hierarchy path for the object. So the armor would have a path column like world->playerid->bankboxid->bagid->etc., which makes things easier: if the player moves the bag containing the armor to another player, all that needs to change is the playerid.
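A tiny sketch of that materialized-path idea (hypothetical helper, with paths as slash-separated serials): containment checks become substring tests, and re-parenting a container is a single prefix swap, which in SQL would be one UPDATE with a LIKE on the old prefix:

```csharp
// Hypothetical materialized-path helpers. A path like
// "world/4021/7733/8100" reads world -> player -> bankbox -> bag.
public static class HierarchyPath
{
    // Is the given container serial anywhere in this object's path?
    public static bool IsInside(string path, string containerId) =>
        ("/" + path + "/").Contains("/" + containerId + "/");

    // Moving a container: every descendant shares the old prefix, so one
    // prefix swap relocates the whole subtree without touching children
    // individually.
    public static string Reparent(string path, string oldPrefix, string newPrefix) =>
        path.StartsWith(oldPrefix)
            ? newPrefix + path.Substring(oldPrefix.Length)
            : path;
}
```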

I was looking this up because the error I'm getting has to do with the Map property not being set on some objects, causing null reference errors because the game doesn't know where the object is, and this approach would make sense for fixing the bug. It would also help with implementing a normal db model where the server can lazy-load everything to keep RAM usage down; the next step after that would be some sort of transaction system for writes, to get a continually persistent world. Right now the server just loads everything into RAM and dumps everything to the db when saving.
 
I fixed the null reference item bug, well, sort of: the code just removes the null items. I don't know why they're being loaded as null yet, but I've done some testing and very few end up being removed, and I believe it's a normal part of the cleanup process. The server has been stable for me so far.

I was doing some research on more database designs and ran into something that would work well for all the extended attributes from the serialization that goes on in the Scripts folder. Designing a new table for, say, every different type of creature is really inefficient, but I could implement an EAV-type table for extended mobile attributes and another one for items.
https://en.wikipedia.org/wiki/Entity–attribute–value_model
They use this for medical databases, where any number of patients could have any number of conditions and the research is always changing. It would make it easy to add new features to the game without having to redesign the whole db schema every time. It would also work with the hierarchy pathing from my previous post, and it would make searches pretty easy: you could run a search on a character and it would return everything related to that character.
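A minimal sketch of what the EAV idea looks like on the code side (hypothetical types): rows are narrow (serial, level, attribute, value) tuples, and loading an entity is just filtering its rows and pivoting them back into a property bag:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical EAV row: one attribute of one entity at one serialization level.
public class EavRow
{
    public int Serial;
    public string Level;     // e.g. "BaseBook" or the concrete item type
    public string Attribute; // e.g. "Charges"
    public string Value;     // stored as text, parsed by the loader

    public EavRow(int serial, string level, string attribute, string value)
    {
        Serial = serial; Level = level; Attribute = attribute; Value = value;
    }
}

public static class Eav
{
    // Reassemble one entity's attributes from its rows, keyed "Level.Attribute".
    public static Dictionary<string, string> Load(IEnumerable<EavRow> rows, int serial) =>
        rows.Where(r => r.Serial == serial)
            .ToDictionary(r => r.Level + "." + r.Attribute, r => r.Value);
}
```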

My next steps right now are to make the code more user friendly: include stuff in the config files, add methods to easily convert existing worlds into a SQL database, and build some sample web applications to modify and query the db. I might be able to pique the interest of some shards to start using my code so they can test it for me, find bugs, and give feedback.
I added the SQL configurations in the autosave.cfg file. To port your shard over, just install SQL Express on your server, make sure you have a backup of your Saves folder (or just copy your Saves folder over to wherever you put the new ServUO), and run it. Once the world loads, type save in the console and see if it saves to the SQL database. Once it does, go to the Config folder, edit autosave.cfg, and set SQLLoadEnabled=true; then you can run ServUO again and it will now load from the SQL save you just made. I don't recommend using this on a full-time shard just yet, because it has limited error checking and handling, so if something is wrong it will just crash.

Next I want to mess with Voxpire's art client API and use it with the database to make a simple site that dynamically shows anyone's paperdoll on the shard. Having something that modifies the db would be a bit more complex, because the server will just overwrite any changes if it's already running. I'll have to implement some sort of transaction system to make database edits useful; otherwise you'd need to shut the server down, make the edits, then start it back up.
 
For the latter part, you could at worst make a new table for "update_those_craps" with only a column containing the serial number of the entity, so the server would check it once in a while and update according to what you added, something along those lines.
What are your thoughts on that?
 

Yeah, that's kind of the idea behind EAV tables. It's basically a table that contains either a mobile serial or item serial, plus a column for an attribute (well, I'll probably have two columns for it) and a column for the attribute's value. Say, for example, I have a gargish documents book, "ChronicleOfTheGargoyleQueen1", that I want to write to the db. I'll put a row in a table called ItemEAVs: column 1 will be the item serial; column 2 will be the serialization level, because there is also a BaseBook serialization level, so for this row I'll put "ChronicleOfTheGargoyleQueen1"; column 3 will be the attribute, "Charges"; and column 4 will be the value, for example "10". I might add another column for version number if people want to version their save data, but that seems wasteful to me since most items are at version 0, and they can add another row if they want versions. Also, if certain values are null or contain no useful info, I can create some sort of default constructor to fill in default values so there isn't a bunch of useless rows in the db to query. So ultimately this item will look like the following in the EAV table:

Serial   Level                          Attribute      Value
115324   ChronicleOfTheGargoyleQueen1   Charges        10
115324   BaseBook                       Content        This is the greatest book of all time blah blah
115324   BaseBook                       Version        4
115324   BaseBook                       m_SecureLevel  1
115324   BaseBook                       flags          4f
115324   BaseBook                       m_Author       23454


I could add another column for the hierarchy path, making it super easy to query all information related to something. For example, if I wanted to look up your character, I could query your mobile serial and get back everything related to that mobile in one query, instead of having to write a new query for each item you own, etc.

Right now I'm sort of stuck, though, because my knowledge of C# is still pretty newbie-level. I'd pretty much have to add my own serialization class for the database. That's also a ton of coding work, because I'd have to go through all the serialize code and add similar database-write code. If you do a simple search in the Scripts folder, you can see the words "serialize" and "deserialize" littered throughout pretty much every file.
 

Attachments

  • eav.PNG (15.4 KB)
Thought I would make a post since it's been a year lol. I'm just getting back into this because I want to work on my C# skills some more. Anyway, I wanted to test my code on a Linux server; I can get it to compile and run in Mono, but getting it to connect to a MSSQL db from there is tricky. Plus, if I wanted to get away from Windows completely, I would need to use a db that runs on Linux, like MySQL or PostgreSQL. That requires third-party add-ons, which happen to come with Mono assemblies, called DbLinq and Npgsql, so I'll need to add this to my list of things to work on to get everything running on Linux. I've rented a dedicated server running my latest build, which is actually old; I need to update it to the latest commits. 192.151.158.220 is the IP of the server if anyone wants to log in and see the system in action, with my 0-second world saves thanks to multi-threading.

I've also been thinking a lot about how to get rid of the rest of the serialized data and have it all in a human-readable format so it can be queried easily. I want to be as lazy as possible, though, and thought about using reflection to get the property names and values and then just dumping all that info to a table in the db. There are some problems: some objects end up referencing themselves, which sends the reflection recursion into an infinite loop. Another problem is that it produces way too much information that doesn't need to be saved, and reflection is notoriously slow. I might implement it anyway, just to test it out for science, but realistically I can't go about changing the serialization of every item and entity in the Scripts folder.
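A sketch of that reflection dump with a guard for the self-reference problem (all names hypothetical): a visited set stops the recursion the second time an object is seen, so cycles can't loop forever:

```csharp
using System.Collections.Generic;
using System.Reflection;

public static class ReflectionDump
{
    // Walk public instance properties into flat "path -> value" rows,
    // suitable for dumping into a name/value table in the db.
    public static void Dump(object obj, Dictionary<string, string> rows,
                            HashSet<object> visited, string prefix = "")
    {
        if (obj == null || !visited.Add(obj))
            return; // null, or already walked: break the cycle here

        foreach (var prop in obj.GetType().GetProperties(
                     BindingFlags.Public | BindingFlags.Instance))
        {
            if (prop.GetIndexParameters().Length > 0)
                continue; // skip indexers

            object value = prop.GetValue(obj);
            string key = prefix + prop.Name;

            if (value == null || value.GetType().IsPrimitive || value is string)
                rows[key] = value == null ? "null" : value.ToString();
            else
                Dump(value, rows, visited, key + "."); // recurse into nested objects
        }
    }
}

// Demo type with a self-reference (Next can point back at the same node).
public class Node
{
    public string Name { get; set; }
    public Node Next { get; set; }
}
```

This is only a sketch; a real version would also need to handle collections, structs like DateTime, and non-property state, which is part of why the reflection route produces so much noise.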

I was also looking into some other frameworks and "micro-ORMs"; NHibernate and NFluent seem really interesting, as you can map objects directly to the database, so I might just throw everything out and restart using that lol. I could just create maps for entities and items, hand them to NHibernate to figure out, and not worry about it. Although I suspect it won't be that easy.
 