This project is read-only.

RestSharp usage and Newtonsoft.Json dependency

May 22, 2013 at 9:28 PM
Since the project already references RestSharp, we can pretty easily remove the Newtonsoft.Json dependency and just let RestSharp deserialize the objects. I just checked in changeset 27949 that shows how to do this in EventData.cs.

So do we really need Newtonsoft.Json?
May 22, 2013 at 11:48 PM
Initially I used HttpWebRequest to get the data from the ArenaNet servers and needed Newtonsoft.Json for de-serialisation. Now we use Rest# to fetch the data, which removes the dependency on Newtonsoft.Json.

However, the models I use are closer to the "official" .NET coding style (e.g. EventID instead of Event_Id). I'd like to stay as close to the "official" way as possible. That said, I used Rest# to get the data and then de-serialised it with Newtonsoft.Json into the custom types.
If this is also possible with Rest#, I see no problem in removing Newtonsoft.Json. If it is not possible, we'd have three options:
1. Move away from the .NET-style models and name everything exactly like ANet did (not my favourite).
2. Use the types in APITypes.cs as intermediates before converting them to the old types (I'd rather not do that, as it greatly increases the maintenance required).
3. Stick with Newtonsoft.Json to convert to the models.

I'm not that experienced with Rest#, so I don't know if de-serialisation into models with different property names is possible. If it is not, I'd probably stick with option 3.

However, I will monitor the progress and might change my mind later. For now I will stick to the Rest# de-serialisation and compare it to the way I did it before (and will continue to do so for the moment in the Map and World calls).

I'm going to refactor some of the calls and the caching out of the current class into a new one and try to generalize it so we can use it for multiple types.
May 23, 2013 at 12:07 AM
I dug through their documentation a bit and found this:

When searching data for a matching element, the default JSON deserializer will look for elements in the following order:
1. Exact name match
2. Camel-cased version of property name
3. Lower-cased version of property name
4. Property name with underscores added (e.g. ProductId → Product_Id)
5. Lower-cased version of "underscored" name (e.g. Product_Id → product_id)

Since most of the API names have underscores, I think that means we'll have to rely on an exact match.

Hmm, well maybe not. I just did some testing, and RestSharp did successfully take world_id and stick it into an int named WorldId. So I guess we can use .NET-style names after all, and RestSharp will find them.

However, those properties have to be publicly settable so RestSharp can write to them. So either we make everything on, say, the GwEvent class public and let RestSharp deserialize directly into those objects (leaving open the possibility that the caller will change stuff), or we use an in-between class like my APIEvent class where everything is public, and then copy the results into a GwEvent class where the caller can't change anything.
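For illustration, the "in-between class" approach might look like the sketch below. The member names (EventId, WorldId, State) are guesses for illustration, not the project's actual members, and GwEvent is modelled here as a class for simplicity:

```csharp
// Mutable DTO: everything public so the deserializer can write to it.
public class ApiEvent
{
    public string EventId { get; set; }
    public int WorldId { get; set; }
    public string State { get; set; }
}

// Immutable model: values copied in once; callers can't change anything.
public class GwEvent
{
    public GwEvent(ApiEvent raw)
    {
        this.EventId = raw.EventId;
        this.WorldId = raw.WorldId;
        this.State = raw.State;
    }

    public string EventId { get; private set; }
    public int WorldId { get; private set; }
    public string State { get; private set; }
}
```

The copy step is the maintenance cost being discussed: every new field has to be added in both classes and in the constructor.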
May 23, 2013 at 12:32 AM
The problem is, the models are not classes but structs (Difference), so making setters public is a no-go. That's why I went with the "constructor detour".
We could change them to classes and make everything public, but I'm not too fond of the idea.
May 23, 2013 at 12:44 AM
Is there a particular reason we're using structs instead of classes? I read over that article but I'm still not clear why structs are better for the models. I don't really care either way. Just curious. :)
May 23, 2013 at 2:17 AM
I think best practice nowadays is to use classes and just treat them as if they were immutable... too many hidden pitfalls performance-wise when structs get involved, and just when you think you won't need functionality that only exists for a class, you'll find out you do.

JSON.NET is, I believe, regarded as the gold standard in .NET JSON deserialization performance, so if you're planning to do something like watch the event feed, you're going to need everything you can get. But nothing stops you from putting a light wrapper around deserialization that uses the default deserializer and can be replaced at run-time.

If you still want it, my code's up for offer, but it doesn't really look anything like yours, so I don't know if you'd want it at this point. The stupid GW2 forum won't let you PM within 3 days of registering your account :P
May 23, 2013 at 1:47 PM
Edited May 23, 2013 at 3:44 PM
I think the main reason why I used structs was the fact that they are value types not reference types. I googled again and found this excellent explanation:
In .NET, there are two categories of types, reference types and value types.
Structs are value types and classes are reference types.
The general difference is that a reference type lives on the heap, and a value type lives inline, that is, wherever your variable or field is defined.
A variable containing a value type contains the entire value type value. For a struct, that means that the variable contains the entire struct, with all its fields.
A variable containing a reference type contains a pointer, or a reference to somewhere else in memory where the actual value resides.
This has one benefit, to begin with:
  • value types always contain a value
  • reference types can contain a null-reference, meaning that they don't refer to anything at all at the moment
Internally, reference types are implemented as pointers, and knowing that, and knowing how variable assignment works, there are other behavioral patterns:
  • copying the contents of a value type variable into another variable, copies the entire contents into the new variable, making the two distinct. In other words, after the copy, changes to one won't affect the other
  • copying the contents of a reference type variable into another variable, copies the reference, which means you now have two references to the same somewhere else storage of the actual data. In other words, after the copy, changing the data in one reference will appear to affect the other as well, but only because you're really just looking at the same data both places
When you declare variables or fields, here's how the two types differ:
  • variable: value type lives on the stack, reference type lives on the stack as a pointer to somewhere in heap memory where the actual memory lives
  • class/struct-field: value type lives inside the class, reference type lives inside the class as a pointer to somewhere in heap memory where the actual memory lives.
Meaning I primarily used them because the original instance is immutable and every change will make a copy of that instance.
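The copy-semantics difference described above can be demonstrated in a few lines (the type names here are illustrative, not from the project):

```csharp
public struct PointStruct { public int X; }
public class PointClass { public int X; }

public static class CopySemanticsDemo
{
    public static int[] Run()
    {
        var s1 = new PointStruct { X = 1 };
        var s2 = s1;       // value type: the whole struct is copied
        s2.X = 99;         // s1 is unaffected; s1.X is still 1

        var c1 = new PointClass { X = 1 };
        var c2 = c1;       // reference type: only the reference is copied
        c2.X = 99;         // c1 and c2 point at the same object; c1.X is now 99

        return new[] { s1.X, c1.X }; // { 1, 99 }
    }
}
```

This is exactly why mutating a struct obtained from a property or a collection often silently edits a throwaway copy, one of the hidden pitfalls mentioned earlier.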

As for Json.NET: I believe we can wrap Rest#'s default de-serializer so that it uses the Json.NET deserializer instead (at least this is possible for serialisation). This would give us the advantages of Json.NET without having to reference it anywhere except through Rest#.
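A wrapper along those lines might look like the sketch below. The IDeserializer interface shape (RestSharp.Deserializers) and the AddHandler registration are assumptions based on RestSharp versions of that era and should be checked against the package the project actually references:

```csharp
using Newtonsoft.Json;
using RestSharp;
using RestSharp.Deserializers;

// Plugs Json.NET in as RestSharp's JSON handler.
public class JsonNetDeserializer : IDeserializer
{
    // Required by the assumed IDeserializer interface; unused here.
    public string RootElement { get; set; }
    public string Namespace { get; set; }
    public string DateFormat { get; set; }

    public T Deserialize<T>(IRestResponse response)
    {
        return JsonConvert.DeserializeObject<T>(response.Content);
    }
}

// Hypothetical usage: register once, then Execute<T> goes through Json.NET.
// var client = new RestClient("https://api.guildwars2.com");
// client.AddHandler("application/json", new JsonNetDeserializer());
```

With this in place, swapping the deserializer back (or to something else) at run-time is a one-line change at the registration site.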

Refim: I added you as a coder, so you have write access to the solution. But could you upload your ideas via patch? (I know that Team Foundation Server does not support patches, so you either have to build your patch with SVN or use this:

Bilong: I have one favour to ask of you: could you install StyleCop and configure it as described here, so that if I update a code file I don't run into tons of warnings ;)
May 23, 2013 at 5:30 PM
Ugh, StyleCop sure is picky. Checked in some changes to make it happy.
May 23, 2013 at 5:52 PM
Edited May 23, 2013 at 5:53 PM
Yeah, I know, but this is the best way to ensure consistency. And you get used to it really fast, especially if you have ReSharper installed, which makes everything much easier ;)

On another note: I got the deserialisation with Json.NET inside the CallAPI method working.
As code review on CodePlex is not working, here is what I did:

I annotated the World model with Json.NET's JsonProperty attribute. Here is what it looks like now:

        /// <summary>
        /// Gets the name of the world.
        /// </summary>
        [JsonProperty("name")] // New
        public string Name { get; private set; }
This enabled me to change the CallApi method to the following:
public static T CallApiNew<T>(string apiMethod, List<KeyValuePair<string, object>> arguments) where T : new()
{
    var client = new RestClient("");

    var restRequest = new RestRequest(apiMethod, Method.GET)
    {
        RequestFormat = DataFormat.Json
    };

    if (arguments != null)
    {
        foreach (var keyValuePair in arguments)
        {
            restRequest.AddParameter(keyValuePair.Key, keyValuePair.Value);
        }
    }

    // Get the response string; for now separated, will be merged later.
    var requestResponse = client.Execute(restRequest).Content;

    // Deserialize into the type specified.
    return JsonConvert.DeserializeObject<T>(requestResponse);
}
This works fine. There is no need for an intermediate class or anything else. The only thing I had to do was make the id field writeable again and give the Id property a private set accessor. The project builds and all tests run fine. That writeable id field is the only thing that currently bugs me; everything else is fine. All worlds are returned with their proper Id and name. For now I see this as the best solution to our problem. Opinions?
May 23, 2013 at 6:18 PM
I also uploaded the changes into the source control depot. Please review them.
May 23, 2013 at 7:14 PM
Looks good to me. With similar changes to GwEvent, we can throw away APIClasses.cs.
May 23, 2013 at 7:17 PM
OK, will do the conversion asap.

Any idea how to solve the problem with the "not-read-only-anymore" id field for GetHashCode()? Should we leave it that way? I mean, they are value types, so the value inside them is not changing, but I could be wrong.

Any opinions on this?
May 23, 2013 at 8:23 PM
I read through this blog:

And this thread:

It sounds like it just needs to return an integer that never changes, but you also don't want to give every object the same integer, or performance of Dictionary lookups and other hash table operations will be no better than a linked list.

But because our objects are structs, if the caller does change the id, wouldn't that just result in a new struct anyway? So technically the original object never changed. For our purposes here, I think we can just return

But if we wanted to do something that would also work for a class, one solution would be to just have the constructor generate a random int and stick it in a private readonly hashint variable, and return that int's hashcode in response to GetHashCode(). This way we have a hash code that never changes, even if the caller changes the id for some silly reason.
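A minimal sketch of that idea (the class and member names here are hypothetical):

```csharp
using System;

public class GwEventHashDemo
{
    private static readonly Random Rng = new Random();

    // Fixed at construction; never changes afterwards.
    private readonly int hashCode;

    public GwEventHashDemo(int id)
    {
        this.Id = id;
        this.hashCode = Rng.Next();
    }

    public int Id { get; set; }

    public override int GetHashCode()
    {
        return this.hashCode; // stable even if Id is mutated later
    }
}
```

One caveat worth noting: if Equals() compares by Id, two objects that compare equal would get different hash codes, which violates the Equals/GetHashCode contract; this scheme only works cleanly with reference equality.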

Another option is to just say screw it, return, and expect the caller to not be crazy. :)
May 23, 2013 at 9:46 PM
Edited May 23, 2013 at 10:17 PM
Will read the two sites later and comment on this.

I just uploaded the changes to the repository. As stated in the commit comment I'm not 100% fond of the current solution as I still have to deserialize into a dictionary, but this is a minor annoyance and I'm certain I'll find a solution for that too.

/edit: To clarify: some API methods return a key and then their type (e.g. events.json), while some only return an array or a list of types (e.g. event_names.json or recipe_detail.json). De-serialising the plain array/list responses is not a problem, since we can specify the type directly, but the keyed responses require us to use a dictionary.

I'll see what I can do about that.
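The two response shapes could be handled with Json.NET roughly as below. This is a sketch: EventName and its fields, the method names, and the "events" key are illustrative, not the project's actual model:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

public class EventName
{
    [JsonProperty("id")] public string Id { get; set; }
    [JsonProperty("name")] public string Name { get; set; }
}

public static class ApiShapes
{
    // event_names.json style: a bare JSON array -> List<T> directly.
    public static List<EventName> FromArray(string json)
    {
        return JsonConvert.DeserializeObject<List<EventName>>(json);
    }

    // events.json style: {"events": [...]} -> deserialize the wrapper
    // object into a dictionary, then pull the list out by key.
    public static List<EventName> FromKeyedObject(string json, string key)
    {
        var wrapper = JsonConvert.DeserializeObject<Dictionary<string, List<EventName>>>(json);
        return wrapper[key];
    }
}
```

An alternative to the dictionary detour would be a small wrapper class per keyed response with a single annotated list property, at the cost of one extra type per endpoint.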
May 23, 2013 at 11:28 PM
Since mine is so different, I'll just drop it off here so you can look at it and do whatever ya want with it (NuGet Restore to get the packages)


Strongly-typed models, replaceable (de)serializer (sadly lowest-common-denominator means no deserializing Enums).

The Examples project is a kind of demo of what it can pull down so far, and what the next layer on top would probably look like (storage object with events to subscribe to). If you want to max out your bandwidth lower the delay and get rid of the Where restricting it to a single server :P

TODO: Handle tasks correctly, maybe make task-based, and add the async/await stuff. Pretty much anything that would be required to actually make it viable to run at scale on a server.
May 24, 2013 at 9:13 AM
When I get back from university I'll look into it and see what I can do. You are still welcome to work with us on the project (maybe implementing the WvW part of the API would be something for you?).