Breaking ties between data and its origin

Jul 5, 2014 at 2:44 PM
Edited Jul 6, 2014 at 10:44 AM

I apologize in advance for the wall of text, but please don't skip ahead.

We've always had public classes that are littered with all sorts of attributes from the Newtonsoft.Json and System.Runtime.Serialization namespaces. These attributes are obviously needed in order to parse API data, but they are completely useless beyond that:
  • [DataContract]
  • [JsonProperty]
  • [EnumMember]
  • [JsonConverter]
  • ...
Also, the class layouts closely match those of the JSON documents. Not because we have to, but because writing custom converters is difficult and not worth it.
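To illustrate (this is a hypothetical model, not the actual library code), a typical class ends up looking something like this, with the property shape mirroring the JSON document one-to-one:

```csharp
using System.Runtime.Serialization;
using Newtonsoft.Json;
using Newtonsoft.Json.Converters;

// Hypothetical /v1 model; names are illustrative only.
[DataContract]
public class ItemDetails
{
    [DataMember(Name = "item_id")]
    public int ItemId { get; set; }

    [JsonProperty("rarity")]
    [JsonConverter(typeof(StringEnumConverter))]
    public ItemRarity Rarity { get; set; }
}

public enum ItemRarity
{
    [EnumMember(Value = "Basic")]
    Basic,

    [EnumMember(Value = "Fine")]
    Fine
}
```

None of these attributes mean anything to a consumer of the library; they exist purely to satisfy the deserializer.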

This tight coupling between the data and its source (i.e. /v1) has been annoying me more and more ever since I first started thinking about a database solution. I've had massive headaches trying to make things work with EF6 as-is. For the most part they did work, but nothing worked quite the way I wanted.

My wish now is to make API implementation details invisible outside of the library.
What I'd like to do is rip out all parsing logic completely and move it to separate namespaces. This involves getting rid of the attributes that I talked about before.

How? To accomplish this, I'd first bring back the original Models namespaces. Those classes (let's call them API classes) are one-to-one mappings between the JSON values and their .NET types, which is perfect. Then, the service implementations would be updated to use these API classes as an intermediary layer. Finally, the API classes would be mapped to more suitable, publicly visible classes.

Pros:
  1. Maximum freedom in class design, no longer limited to the source JSON schema
  2. Classes can be re-used for both /v1 and /v2
Cons:
  1. Code is more difficult to maintain: a single API change may require updates in 2 files
  2. Mapping one object to another causes some amount of overhead
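As a sketch (all names hypothetical), the intermediary layer would look something like this: an internal API class that mirrors the JSON, mapped onto a public class whose shape we choose freely:

```csharp
// Internal: one-to-one with the /v1 JSON document.
internal class TrinketDataContract
{
    public string name { get; set; }
    public int level { get; set; }
}

// Public: designed for consumers, independent of the JSON schema.
public class Trinket
{
    public string Name { get; set; }
    public int Level { get; set; }
}

internal static class TrinketConverter
{
    // The mapping step that causes the overhead mentioned under "Cons".
    public static Trinket Convert(TrinketDataContract value)
    {
        return new Trinket { Name = value.name, Level = value.level };
    }
}
```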
Thoughts? Note that these changes are internal. High-level access through ServiceManager remains pretty much the same as it is now.


I'm going to think about this some more and not do anything until /v2 is released.
Jul 8, 2014 at 12:39 PM
I thought about it too. The whole API (/v1 at least) is inconsistently designed: sometimes the ID is outside the object, sometimes it is inside, other endpoints return a collection where one is not needed, etc. (you get my drift). Most of these problems are really minor, but they cause headaches for the end user. I think it is our job to solve those problems for the end user.

In my eyes your suggestions sound solid, and I have no problem using them. However, we should first draw up a concept so we don't run into problems while developing it.

Additionally, I looked into LINQ support (I am a great fan of the whole LINQ concept, with its deferred evaluation and extension methods). If we go ahead with these internal changes, I'd like to add LINQ support as well (much in the style LinqToTwitter does it).

I'll look at some of the tutorials I found on the net and see how we can implement it later on.
Jul 8, 2014 at 2:18 PM
Do you have access to a premium version of Visual Studio? It has some very nice modeling tools that I'd like to use once I figure out how. I love diagrams.

More information:
Jul 8, 2014 at 5:18 PM
Edited Jul 8, 2014 at 5:20 PM
Yep, I have VS2013 Ultimate, courtesy of my university. :) Will look into the link you sent me.
Jul 9, 2014 at 3:38 PM
Edited Jul 9, 2014 at 3:39 PM
Playing around with UML diagrams, I'm wondering if it's possible (or even useful) to link code to class diagrams.

Consider this class diagram for the "trinket" model as represented by the API. These diagrams are definitely pretty to look at, but it doesn't seem like I can edit the diagram and have it automatically update the code (or vice versa). But I'll keep looking.

Jul 9, 2014 at 3:46 PM
I was reading the LINQ Implementation Tutorial on MSDN and we really should look into it a bit more.
Jul 9, 2014 at 3:49 PM
MSDN has a tutorial for that? Where?
Jul 10, 2014 at 1:29 PM
We seriously should look into this stuff. I was always a fan of LINQ, but mostly from the end-user perspective, seeing how much easier and more readable it made my code. Now that I've looked behind the scenes, I realize how much it benefits the programmer, too!

One example:
Since the whole LINQ work is done in a provider and the rest is mostly generic, we can write a provider for each caching method we use. The whole query system is abstracted, so we can inject the code we need. Since the provider also handles the transformation from source data to target data, we also solve the problem we face when mapping the API to our C# objects.

If I understood the MSDN tutorial and blog correctly, we wouldn't need a second class, since the provider handles the transformation.
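For reference, the skeleton from the MSDN walkthrough looks roughly like this. Everything here is a sketch with hypothetical names; the interesting work would all happen inside Execute:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

// Sketch only: hypothetical names, not actual GW2.NET code.
internal class ApiQueryable<T> : IQueryable<T>
{
    public ApiQueryable(IQueryProvider provider, Expression expression)
    {
        Provider = provider;
        Expression = expression;
    }

    public Type ElementType { get { return typeof(T); } }
    public Expression Expression { get; private set; }
    public IQueryProvider Provider { get; private set; }

    public IEnumerator<T> GetEnumerator()
    {
        return Provider.Execute<IEnumerable<T>>(Expression).GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}

internal class ApiQueryProvider : IQueryProvider
{
    public IQueryable<TElement> CreateQuery<TElement>(Expression expression)
    {
        return new ApiQueryable<TElement>(this, expression);
    }

    public IQueryable CreateQuery(Expression expression)
    {
        throw new NotImplementedException();
    }

    public TResult Execute<TResult>(Expression expression)
    {
        // This is where the provider would:
        // 1. translate the expression tree into an API request,
        // 2. fetch from the chosen backend (live API, database, cache),
        // 3. deserialize and map the raw response onto TResult.
        throw new NotImplementedException();
    }

    public object Execute(Expression expression)
    {
        return Execute<object>(expression);
    }
}
```

Each backend (live API, database, cache) would get its own Execute implementation, which is where the injection idea comes from.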
Jul 10, 2014 at 2:27 PM
Edited Jul 10, 2014 at 3:02 PM
I can sort of envision how it would work on a very high level, but I've never programmed with expression trees before. It looks a bit overwhelming.

We do still need a second set of classes, purely for JSON conversion. Otherwise we're back at the beginning. I suppose you could use LINQ to JSON instead, but then you're giving up a lot of the functionality in the Json.NET framework, most importantly error logging.
Jul 11, 2014 at 6:51 PM
Edited Jul 11, 2014 at 8:20 PM
Here's a layer diagram that clearly shows the dependencies between various layers in our existing architecture. There are more than a few arrows that shouldn't be there.

layer diagram

In contrast, here is a layer diagram of what I'm working towards.

layer diagram
Jul 14, 2014 at 12:48 PM
Looks good.

I'm still trying out the LINQ provider tutorial, will report how it goes.
Jul 23, 2014 at 2:06 PM
Any news? I'm about ready to upload the improved service implementation. Just need some time to fix regressions and to clean up a few other things, like making sure that infrastructure classes are marked internal.
Jul 23, 2014 at 8:00 PM
Upload it! The LINQ provider seems to be a bit more tricky than I thought. I guess it will not be implemented anytime soon, but will rather be a silent development alongside the main branch.
Jul 24, 2014 at 12:04 AM
Okay, it's done. The last couple of changesets don't seem like much, but they are. I cleaned up so many things that probably shouldn't have been there in the first place. Sometimes it's hard to find the balance between frameworking all the things or just getting the job done. But I think we have a good balance now.
Jul 26, 2014 at 2:39 PM
Just saw commit 37215... What in the name of everything that is holy, that is one big commit.
Jul 26, 2014 at 2:51 PM
Yup. :)

Most of the work is in getting rid of custom JSON converters. They were a bitch to maintain, and they aren't compatible with other serialization engines. Now we only use standard FCL attributes (DataContract and DataMember), which are supported by other serializers. One of my ideas is to use the DataContractJsonSerializer by default so that we can move the Json.NET stuff to a separate DLL, similar to what we're already doing with the RestSharp stuff.
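After the cleanup, a model only carries FCL attributes from System.Runtime.Serialization, which both DataContractJsonSerializer and Json.NET understand. A hypothetical example (names are illustrative):

```csharp
using System.Runtime.Serialization;

// No Newtonsoft types anywhere; any DataContract-aware serializer can read this.
[DataContract]
public class MapNameDataContract
{
    [DataMember(Name = "id")]
    public string Id { get; set; }

    [DataMember(Name = "name")]
    public string Name { get; set; }
}
```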
Aug 3, 2014 at 8:25 PM
Edited Aug 3, 2014 at 11:20 PM
We got ourselves a little situation here.

I put everything in place to move out the Json.NET stuff and use DataContractJsonSerializer by default. But here's the thing: it can't deal with dictionaries such as the ones that are returned by the /v1 APIs.

This means that it cannot parse the following endpoints, either in part or in whole:
  • /v1/event_details.json
  • /v1/continents.json
  • /v1/maps.json
  • /v1/map_floor.json
  • /v1/colors.json
  • /v1/files.json
I think that's a pretty impressive list of features that depend entirely on the people behind Json.NET...

Luckily, /v2 has no such problems. But we don't know when that will be released.

What's your opinion on this? My idea was to provide Json.NET support as an optional component because of its enhanced performance. I didn't realize that we were so dependent on it.


Apparently, this problem has been addressed in .NET 4.5 with the addition of the DataContractJsonSerializer.UseSimpleDictionaryFormat property.
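For the record, here is a sketch of what that would look like on .NET 4.5 (the JSON sample and method name are made up for illustration):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Json;
using System.Text;

internal static class DictionaryExample
{
    // Requires .NET 4.5: UseSimpleDictionaryFormat makes the serializer read
    // an object like {"1":"Lion's Arch"} as a Dictionary<string, string>
    // instead of the default [{"Key":...,"Value":...}] array format.
    internal static Dictionary<string, string> ReadMapNames(string json)
    {
        var settings = new DataContractJsonSerializerSettings
        {
            UseSimpleDictionaryFormat = true
        };
        var serializer = new DataContractJsonSerializer(
            typeof(Dictionary<string, string>), settings);
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
        {
            return (Dictionary<string, string>)serializer.ReadObject(stream);
        }
    }
}
```

On 4.0 neither the settings class nor the property exists, which is exactly the problem.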

So I guess we can...
  • keep the Json.NET dependency, throw our hands up and say fuck it
  • upgrade to 4.5 for an improved built-in serializer
  • keep using 4.0 and roll our own JSON parser (no way I'm doing that)
  • do something I haven't thought of.
Aug 4, 2014 at 7:52 PM
I think we should stay with JSON.NET. Microsoft is using it too in its Web API 2 releases. I also don't see a benefit of moving it out into a separate library. In my opinion, JSON.NET is still the fastest and most reliable way.
Aug 12, 2014 at 6:29 PM
Ruhrpottpatriot wrote:
I also don't see a benefit of moving it out into a separate library.
Only a tiny part of the library (2 files, to be precise) actually requires a reference to the Json.NET stuff. That's why I wanted to move it to a smaller library. And I did. There is now a GW2.NET.Newtonsoft library that contains the only two files that require Json.NET.

It is still the default serializer, but now you're free to delete the Json.NET stuff and replace it with any other serializer. All you have to do is make a class that implements ISerializer, and then another class that implements ISerializerFactory for creating instances of that serializer. Wiring it all together is the only tricky part (multiple levels of dependency injection).
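Roughly, the extension points look like this (a sketch; the actual signatures in the library may differ, and the adapter class is hypothetical):

```csharp
using System.IO;
using System.Runtime.Serialization.Json;

public interface ISerializer<T>
{
    T Deserialize(Stream stream);
}

public interface ISerializerFactory
{
    ISerializer<T> GetSerializer<T>();
}

// Example replacement built on the FCL serializer instead of Json.NET.
public class DataContractJsonSerializerAdapter<T> : ISerializer<T>
{
    private readonly DataContractJsonSerializer serializer =
        new DataContractJsonSerializer(typeof(T));

    public T Deserialize(Stream stream)
    {
        return (T)serializer.ReadObject(stream);
    }
}
```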
Aug 14, 2014 at 7:32 PM
That I can live with. We will still deploy the JSON.NET library with the main package and not in a separate library like GW2.NET R#. JSON.NET is actively developed and fast enough, so not many users would want to change it, but we give them the possibility.