
Mikael Koskinen

Technical Lead at Adafy Oy

Jyväskylä, FI

Joined Dec 2011

About

Mikael is the founder of Adafy Oy, a Finnish software startup providing development services for Windows Phone, Windows 8, and Windows Azure. Adafy is available for hire: www.adafy.com.

Stats

Reputation: 151
Pageviews: 228.4K
Articles: 3
Comments: 0
Articles
Event Aggregator for ASP.NET Core 3 Razor Components/Blazor
We take a look at this new, lightweight event aggregator and how to implement it in our ASP.NET Core projects using Razor Components.
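To give a feel for the pattern the article covers, here is a minimal publish/subscribe event aggregator sketched in plain C#. This is an illustration of the general technique only; the class and method names below are assumptions for this sketch, not the API from the article itself.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal event aggregator sketch: subscribers register a handler per event
// type, and Publish fans a message out to every handler for that type.
// All names here are illustrative, not the article's actual API.
public class EventAggregator
{
    private readonly Dictionary<Type, List<Delegate>> handlers =
        new Dictionary<Type, List<Delegate>>();

    public void Subscribe<TEvent>(Action<TEvent> handler)
    {
        if (!handlers.TryGetValue(typeof(TEvent), out var list))
        {
            list = new List<Delegate>();
            handlers[typeof(TEvent)] = list;
        }
        list.Add(handler);
    }

    public void Publish<TEvent>(TEvent message)
    {
        if (handlers.TryGetValue(typeof(TEvent), out var list))
        {
            foreach (var handler in list.Cast<Action<TEvent>>())
            {
                handler(message);
            }
        }
    }
}
```

In a Blazor/Razor Components context, a subscribing component would typically call StateHasChanged from its handler so the UI re-renders when an event arrives.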
February 20, 2019
· 12,428 Views · 1 Like
MongoDB Aggregation Framework Examples in C#
MongoDB version 2.2 was released in late August, and its biggest change was the addition of the Aggregation Framework. Previously, aggregations required map/reduce, which doesn't perform well in MongoDB, mainly because of its single-threaded, JavaScript-based execution. The Aggregation Framework steps away from JavaScript and is implemented in C++, with the aim of accelerating the performance of analytics and reporting by up to 80 percent compared to MapReduce. The aim of this post is to show examples of running the MongoDB Aggregation Framework with the official MongoDB C# driver.

Aggregation Framework and LINQ

Even though the current version of the MongoDB C# driver (1.6) supports LINQ, the support doesn't extend to the Aggregation Framework. It's highly probable that LINQ support will be added later on, and there are already some hints of this in the driver's source code. But at this point, executing aggregations requires the use of BsonDocument objects.

Aggregation Framework and GUIDs

If you use GUIDs in your documents, the Aggregation Framework doesn't work. This is because GUIDs are stored in binary format by default, and aggregations won't work against documents that contain binary data. The solution is to store the GUIDs as strings. You can make the C# driver perform this conversion automatically by configuring the class mapping.
Given that your C# class has an Id property defined as a GUID, the following code tells the driver to serialize the GUID as a string (Example is the C# class mapped to the examples collection):

BsonClassMap.RegisterClassMap<Example>(cm =>
{
    cm.AutoMap();
    cm.GetMemberMap(c => c.Id)
      .SetRepresentation(BsonType.String);
});

The example data

These examples use the following documents:

> db.examples.find()
{ "_id" : "1", "User" : "Tom", "Country" : "Finland", "Count" : 1 }
{ "_id" : "2", "User" : "Tom", "Country" : "Finland", "Count" : 3 }
{ "_id" : "3", "User" : "Tom", "Country" : "Finland", "Count" : 2 }
{ "_id" : "4", "User" : "Mary", "Country" : "Sweden", "Count" : 1 }
{ "_id" : "5", "User" : "Mary", "Country" : "Sweden", "Count" : 7 }

Example 1: Basic usage

This example shows how the Aggregation Framework can be executed through C#. We're not going to run any calculations on the data; we're just going to filter it by the User field. To run aggregations, you can use either the MongoDatabase.RunCommand method or the MongoCollection.Aggregate helper. We're going to use the latter:

var coll = localDb.GetCollection("examples");
...
coll.Aggregate(pipeline);

The hardest part of working with the Aggregation Framework through C# is building the pipeline. The pipeline is a similar concept to piping in PowerShell: each operation in the pipeline modifies the data, and the operations can, for example, filter, group, and project it. In C#, the pipeline is a collection of BsonDocument objects, each document representing one operation.

In our first example we need only one operation: $match. This operator filters the documents. The following BsonDocument is a pipeline operation that filters out all documents whose User field isn't set to "Tom".
var match = new BsonDocument
{
    {
        "$match",
        new BsonDocument
        {
            { "User", "Tom" }
        }
    }
};

To execute this operation, we add it to an array and pass the array to the MongoCollection.Aggregate method:

var pipeline = new[] { match };
var result = coll.Aggregate(pipeline);

The MongoCollection.Aggregate method returns an AggregateResult object. Its ResultDocuments property (IEnumerable<BsonDocument>) contains the documents that are the output of the aggregation. To check how many results there were, we can get the count:

var result = coll.Aggregate(pipeline);
Console.WriteLine(result.ResultDocuments.Count());

The result documents are BsonDocument objects. If you have a C# class that represents the documents, you can deserialize the results:

var matchingExamples = result.ResultDocuments
    .Select(BsonSerializer.Deserialize<Example>)
    .ToList();

foreach (var example in matchingExamples)
{
    var message = string.Format("{0} - {1}", example.User, example.Count);
    Console.WriteLine(message);
}

Another alternative is to use C#'s dynamic type.
The following extension method uses Json.NET to convert a BsonDocument into a dynamic:

public static class MongoExtensions
{
    public static dynamic ToDynamic(this BsonDocument doc)
    {
        var json = doc.ToJson();
        dynamic obj = JToken.Parse(json);
        return obj;
    }
}

Here's a way to convert all the result documents into dynamic objects:

var matchingExamples = result.ResultDocuments
    .Select(x => x.ToDynamic())
    .ToList();

Example 2: Multiple filters and comparison operators

This example filters the data with the following criteria:

User: Tom
Count: >= 2

var match = new BsonDocument
{
    {
        "$match",
        new BsonDocument
        {
            { "User", "Tom" },
            { "Count", new BsonDocument { { "$gte", 2 } } }
        }
    }
};

The execution of this operation is identical to the first example:

var pipeline = new[] { match };
var result = coll.Aggregate(pipeline);
var matchingExamples = result.ResultDocuments
    .Select(x => x.ToDynamic())
    .ToList();

And the results are as expected:

foreach (var example in matchingExamples)
{
    var message = string.Format("{0} - {1}", example.User, example.Count);
    Console.WriteLine(message);
}

Example 3: Multiple operations

In our first two examples the pipeline was as simple as possible: it contained only one operation. This example filters the data with exactly the same criteria as the second example, but this time using two $match operations:

User: Tom
Count: >= 2

var match = new BsonDocument
{
    {
        "$match",
        new BsonDocument
        {
            { "User", "Tom" }
        }
    }
};

var match2 = new BsonDocument
{
    {
        "$match",
        new BsonDocument
        {
            { "Count", new BsonDocument { { "$gte", 2 } } }
        }
    }
};

var pipeline = new[] { match, match2 };

The output stays the same: the first operation, match, takes all the documents from the examples collection and removes every document that doesn't match the criterion User = Tom. The output of this operation (3 documents) then moves to the second operation, match2, in the pipeline. This operation sees only those 3 documents, not the original collection.
The operation filters these documents by its own criterion and moves the result (2 documents) forward. This is where our pipeline ends, and this is also our result.

Example 4: Group and sum

Thus far we've used the Aggregation Framework only to filter the data. The true strength of the framework is its ability to run calculations on the documents. This example shows how to calculate how many documents there are in the collection, grouped by user. This is done using the $group operator:

var group = new BsonDocument
{
    {
        "$group",
        new BsonDocument
        {
            { "_id", new BsonDocument { { "MyUser", "$User" } } },
            { "Count", new BsonDocument { { "$sum", 1 } } }
        }
    }
};

The grouping key (in our case the User field) is defined with _id. The example above states that the grouping key has one field ("MyUser"), and the value for that field comes from the document's User field ($User). The other fields in the $group operation are aggregate functions. This example defines the field "Count" and adds 1 to it for every document that matches the grouping key (_id).

var pipeline = new[] { group };
var result = coll.Aggregate(pipeline);
var matchingExamples = result.ResultDocuments
    .Select(x => x.ToDynamic())
    .ToList();

foreach (var example in matchingExamples)
{
    var message = string.Format("{0} - {1}", example._id.MyUser, example.Count);
    Console.WriteLine(message);
}

Note the format of the output: the user's name is accessed through the _id.MyUser property.

Example 5: Group and sum by field

This example is similar to example 4, but instead of counting the documents, we calculate the sum of the Count fields, grouped by user:

var group = new BsonDocument
{
    {
        "$group",
        new BsonDocument
        {
            { "_id", new BsonDocument { { "MyUser", "$User" } } },
            { "Count", new BsonDocument { { "$sum", "$Count" } } }
        }
    }
};

The only change is that instead of adding 1, we add the value from the Count field ("$Count").
Example 6: Projections

This example shows how the $project operator can be used to change the format of the output. The grouping in example 5 works well, but to access the user's name we currently have to go through the _id.MyUser property. Let's change this so that the user's name is available directly through a UserName property:

var group = new BsonDocument
{
    {
        "$group",
        new BsonDocument
        {
            { "_id", new BsonDocument { { "MyUser", "$User" } } },
            { "Count", new BsonDocument { { "$sum", "$Count" } } }
        }
    }
};

var project = new BsonDocument
{
    {
        "$project",
        new BsonDocument
        {
            { "_id", 0 },
            { "UserName", "$_id.MyUser" },
            { "Count", 1 }
        }
    }
};

var pipeline = new[] { group, project };

The code removes the _id property from the output and adds the UserName property, whose value is taken from the _id.MyUser field. The projection also states that the Count value should stay as it is.

var result = coll.Aggregate(pipeline);
var matchingExamples = result.ResultDocuments
    .Select(x => x.ToDynamic())
    .ToList();

foreach (var example in matchingExamples)
{
    var message = string.Format("{0} - {1}", example.UserName, example.Count);
    Console.WriteLine(message);
}

Example 7: Group with multiple fields in the key

For this example we add a new row to our document collection, leaving us with the following:

{ "_id" : "1", "User" : "Tom", "Country" : "Finland", "Count" : 1 }
{ "_id" : "2", "User" : "Tom", "Country" : "Finland", "Count" : 3 }
{ "_id" : "3", "User" : "Tom", "Country" : "Finland", "Count" : 2 }
{ "_id" : "4", "User" : "Mary", "Country" : "Sweden", "Count" : 1 }
{ "_id" : "5", "User" : "Mary", "Country" : "Sweden", "Count" : 7 }
{ "_id" : "6", "User" : "Tom", "Country" : "England", "Count" : 3 }

This example shows how you can group the data using multiple fields in the grouping key:

var group = new BsonDocument
{
    {
        "$group",
        new BsonDocument
        {
            { "_id", new BsonDocument
                {
                    { "MyUser", "$User" },
                    { "Country", "$Country" }
                }
            },
            { "Count", new BsonDocument { { "$sum", "$Count" } } }
        }
    }
};

var project = new BsonDocument
{
    {
        "$project",
        new BsonDocument
        {
            { "_id", 0 },
            { "UserName", "$_id.MyUser" },
            { "Country", "$_id.Country" },
            { "Count", 1 }
        }
    }
};

var pipeline = new[] { group, project };
var result = coll.Aggregate(pipeline);
var matchingExamples = result.ResultDocuments
    .Select(x => x.ToDynamic())
    .ToList();

foreach (var example in matchingExamples)
{
    var message = string.Format("{0} - {1} - {2}", example.UserName, example.Country, example.Count);
    Console.WriteLine(message);
}

Example 8: Match, group, and project

This example shows how to combine several pipeline operations. The data is first filtered by User = Tom ($match), then grouped by country ($group), and finally the output is formatted into a readable form ($project).

Match:

var match = new BsonDocument
{
    {
        "$match",
        new BsonDocument
        {
            { "User", "Tom" }
        }
    }
};

Group:

var group = new BsonDocument
{
    {
        "$group",
        new BsonDocument
        {
            { "_id", new BsonDocument { { "Country", "$Country" } } },
            { "Count", new BsonDocument { { "$sum", "$Count" } } }
        }
    }
};

Project:

var project = new BsonDocument
{
    {
        "$project",
        new BsonDocument
        {
            { "_id", 0 },
            { "Country", "$_id.Country" },
            { "Count", 1 }
        }
    }
};

Result:

var pipeline = new[] { match, group, project };
var result = coll.Aggregate(pipeline);
var matchingExamples = result.ResultDocuments
    .Select(x => x.ToDynamic())
    .ToList();

foreach (var example in matchingExamples)
{
    var message = string.Format("{0} - {1}", example.Country, example.Count);
    Console.WriteLine(message);
}

More

There are many other interesting operators in the MongoDB Aggregation Framework, such as $unwind and $sort. Their usage is identical to the operators used above, so it should be possible to copy one of these examples and use it as a basis for the other operations.

Links

  • MongoDB C# Language Center
  • MongoDB Aggregation Framework
  • Easy-to-follow blog post about the aggregation framework
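As a rough sketch of how $sort slots into the same pipeline pattern, the following snippet (mine, not from the original post) sorts the grouped results of example 5 by Count, descending. It assumes the same coll collection and group document defined earlier:

```csharp
// $sort follows the same BsonDocument pipeline pattern as $match and $group.
// -1 sorts descending, 1 ascending; here we order groups by their Count sum.
var sort = new BsonDocument
{
    { "$sort", new BsonDocument { { "Count", -1 } } }
};

var pipeline = new[] { group, sort };
var result = coll.Aggregate(pipeline);
```

Because each operation only sees the previous operation's output, the sort applies to the grouped documents, not to the raw collection.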
October 11, 2012
· 47,084 Views · 2 Likes
Unity: Passing Constructor Parameters to Resolve
In this tutorial we will go through a couple of different ways to use custom constructor parameters when resolving an instance with Unity:

  • By using the built-in ParameterOverride
  • By creating a custom ResolverOverride

Background

When you're using a DI container like Unity, you normally don't have to worry about how the container resolves a new instance. You have configured the container, and the container acts based on your configuration. But there may be cases where you have to pass custom constructor parameters into the resolve operation. Some may argue that this screams of bad architecture, but situations like bringing a DI container into a legacy system may require these kinds of actions.

Resolved class

In this tutorial we are resolving the following test class:

public class MyClass
{
    public string Hello { get; set; }
    public int Number { get; set; }

    public MyClass(string hello, int number)
    {
        Hello = hello;
        Number = number;
    }
}

It is registered with the container using the RegisterType method, without passing in any parameters:

var unity = new UnityContainer();
unity.RegisterType<MyClass>();

So let's see how we can pass the "hello" and "number" values to MyClass's constructor when calling Unity's Resolve.

Unity ResolverOverride

Unity allows us to pass in a ResolverOverride when the container's Resolve method is called. ResolverOverride is an abstract base class, and Unity ships with a few built-in implementations.
One of them is ParameterOverride, which "lets you override a named parameter passed to a constructor." Knowing that we need to pass in a string named "hello" and an integer named "number", we can resolve the instance with the help of ParameterOverride:

[Test]
public void Test()
{
    var unity = new UnityContainer();
    unity.RegisterType<MyClass>();

    var myObj = unity.Resolve<MyClass>(new ResolverOverride[]
    {
        new ParameterOverride("hello", "hi there"),
        new ParameterOverride("number", 21)
    });

    Assert.That(myObj.Hello, Is.EqualTo("hi there"));
    Assert.That(myObj.Number, Is.EqualTo(21));
}

We pass in two instances of ParameterOverride. Both take the name and the value of the parameter.

Custom ResolverOverride: OrderedParametersOverride

But what if you don't like passing in the parameter names and instead want to pass in just the parameter values, in the correct order? To achieve this we can create a custom ResolverOverride. Here's one way to do it:

public class OrderedParametersOverride : ResolverOverride
{
    private readonly Queue<InjectionParameterValue> parameterValues;

    public OrderedParametersOverride(IEnumerable<object> parameterValues)
    {
        this.parameterValues = new Queue<InjectionParameterValue>();
        foreach (var parameterValue in parameterValues)
        {
            this.parameterValues.Enqueue(InjectionParameterValue.ToParameter(parameterValue));
        }
    }

    public override IDependencyResolverPolicy GetResolver(IBuilderContext context, Type dependencyType)
    {
        if (parameterValues.Count < 1)
            return null;

        var value = this.parameterValues.Dequeue();
        return value.GetResolverPolicy(dependencyType);
    }
}

The parameter values are passed in through the constructor and put into a queue. When the container resolves an instance, the parameters are used in the order in which they were given to the OrderedParametersOverride.
Here's a sample usage of the new OrderedParametersOverride:

[Test]
public void TestOrderedParametersOverride()
{
    var unity = new UnityContainer();
    unity.RegisterType<MyClass>();

    var myObj = unity.Resolve<MyClass>(new OrderedParametersOverride(new object[] { "greetings", 24 }));

    Assert.That(myObj.Hello, Is.EqualTo("greetings"));
    Assert.That(myObj.Number, Is.EqualTo(24));
}

Sample code

The above examples can be found on GitHub.
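When the constructor values are fixed at registration time rather than varying per resolve, Unity's built-in InjectionConstructor is an alternative to per-call overrides. A minimal sketch, assuming the same MyClass as above:

```csharp
var unity = new UnityContainer();

// Bake the constructor arguments into the registration itself,
// instead of supplying them on every Resolve call.
unity.RegisterType<MyClass>(new InjectionConstructor("hello world", 42));

// No overrides needed at resolve time; the registered arguments are used.
var myObj = unity.Resolve<MyClass>();
```

This keeps call sites clean, but it only fits when every resolution of the type should receive the same values; the overrides shown in the article remain the way to vary parameters per call.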
May 4, 2012
· 22,847 Views
