AutoMapper: What-Why-How

An open-source library for mapping one object to another, AutoMapper's NuGet package has been downloaded more than 21.5 million times and is usually among the top 20 most-downloaded community NuGet packages. Its name comes from its convention-based automatic mapping of properties with the same name & type. It has a plethora of useful features, and, as covered below, the automatic mapping may actually not be advisable.


This is an attempt to lay out why I think AutoMapper is good & how best to use it.
For first-timers, the gist is this: a mapping between two classes is defined once, typically in a Profile, and then applied anywhere with a single Map call.
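A minimal sketch of such a mapping; the class and property names here are illustrative, not taken from any particular project:

```csharp
using AutoMapper;

public class Customer
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class CustomerDto
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string FullName { get; set; }
}

public class CustomerProfile : Profile
{
    public CustomerProfile()
    {
        CreateMap<Customer, CustomerDto>()
            // Each property is mapped explicitly, including a computed one:
            .ForMember(d => d.FirstName, o => o.MapFrom(s => s.FirstName))
            .ForMember(d => d.LastName, o => o.MapFrom(s => s.LastName))
            .ForMember(d => d.FullName,
                o => o.MapFrom(s => s.FirstName + " " + s.LastName));
    }
}
```

With the Profile registered, `mapper.Map<CustomerDto>(customer)` is all the calling code ever needs to write.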

As can be seen, properties can be mapped using a simple syntax that also allows conditions to be evaluated. This is a small window into its capabilities, which include the possibility of calling injected services and manipulating data before and after the mapping is done.

The author of AutoMapper created it to tackle problems he faced daily and, to my knowledge, is still using it. Such dogfooding has led to relevant software that has evolved based on feedback and community input. For example, it has moved away from the static API and the heavy use of reflection it relied on before.

It lends itself to putting the mapping creation in different classes (called Profiles, which are unit-testable) as needed by the domain. These can usually be picked up by the IoC container, so any new Profile added to the project is included automatically. In fact, the extension provided for .NET Core does that with one line of code.
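A sketch of that one-line registration, assuming the AutoMapper.Extensions.Microsoft.DependencyInjection package in an ASP.NET Core Startup class:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Scans the given assembly for Profile classes, registers every
    // mapping it finds, and makes IMapper available for injection.
    services.AddAutoMapper(typeof(Startup));
}
```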

The best way to use AutoMapper is to inject its mapping interface into the calling classes.

The biggest advantage of using it, for me, is the ability to let the calling code fully delegate the responsibility of mapping to AutoMapper in all but exceptional cases. As such, it's an invaluable tool for implementing the SOLID principles. Consider, as an example, a typical controller method that delegates all of its mapping to AutoMapper.

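A sketch of such a controller method; the service and model names are assumptions for illustration:

```csharp
public class CustomersController : Controller
{
    private readonly IMapper _mapper;
    private readonly ICustomerService _customerService; // assumed domain service

    public CustomersController(IMapper mapper, ICustomerService customerService)
    {
        _mapper = mapper;
        _customerService = customerService;
    }

    [HttpPost]
    public async Task<IActionResult> Create(CustomerDto dto)
    {
        // The controller only orchestrates; every mapping detail
        // lives in the Profiles, not here.
        var customer = _mapper.Map<Customer>(dto);
        var saved = await _customerService.SaveAsync(customer);
        return Ok(_mapper.Map<CustomerDto>(saved));
    }
}
```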

Such a controller method clearly shows its flow, abstracting away irrelevant details that are not the responsibility of the controller. It is also easily unit-testable by passing mocks, which don't have to set up the models being passed around. Neither the code nor its tests need to change if the properties of the models, or the way they're mapped, are altered.

Mappings created in AutoMapper are reused wherever needed without any extra configuration. This means that once mappings from AddressDto to Address and from OrderDto to Order exist, AutoMapper will find and use the address mapping automatically whenever an Order contains an Address property. There will be a detailed error at runtime if any required mapping is not found, so please do call AssertConfigurationIsValid() at the start of the application.
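A sketch of that reuse, again with assumed class names, plus the fail-fast validation call at start-up:

```csharp
public class OrderProfile : Profile
{
    public OrderProfile()
    {
        CreateMap<AddressDto, Address>();
        // Order has an Address property of type Address; the
        // AddressDto -> Address map above is reused automatically:
        CreateMap<OrderDto, Order>();
    }
}

// At application start-up, fail fast on any unmapped property:
var config = new MapperConfiguration(cfg => cfg.AddProfile(new OrderProfile()));
config.AssertConfigurationIsValid();
```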

AutoMapper also allows using inheritance: if both the source and destination classes inherit from another pair of mapped classes, the base mapping can be shared by calling Include<> on the base map. The same can be achieved from the derived maps by calling the IncludeBase<> method.
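A sketch of both styles, with assumed Vehicle/Car class pairs:

```csharp
// On the base map, declare the derived pairs that share it:
CreateMap<VehicleDto, Vehicle>()
    .Include<CarDto, Car>();

CreateMap<CarDto, Car>();

// Alternatively, declare it from the derived map instead:
// CreateMap<CarDto, Car>().IncludeBase<VehicleDto, Vehicle>();
```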

If all that is not enough, AutoMapper also takes care of converting collections. Any collection can be converted to any other collection type, so long as the element classes are mapped.
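For instance, with a Customer to CustomerDto map already defined (the names are illustrative), no extra configuration is needed for any collection shape:

```csharp
var customers = new List<Customer>
{
    new Customer { FirstName = "Ada", LastName = "Lovelace" }
};

// The element mapping is applied to each item; the collection
// type itself can change freely:
CustomerDto[] asArray = mapper.Map<CustomerDto[]>(customers);
ICollection<CustomerDto> asCollection =
    mapper.Map<ICollection<CustomerDto>>(customers);
```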

AutoMapper also has support for generics, which allows mapping a generic class to another one.
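An open generic map is created with the non-generic CreateMap overload; a sketch with assumed wrapper classes:

```csharp
public class Envelope<T> { public T Payload { get; set; } }
public class EnvelopeDto<T> { public T Payload { get; set; } }

// One open generic map covers every closed version
// (Envelope<int> -> EnvelopeDto<int>, and so on), created on demand:
var config = new MapperConfiguration(cfg =>
    cfg.CreateMap(typeof(Envelope<>), typeof(EnvelopeDto<>)));
```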

The above are reasons enough for me to utilise AutoMapper. Quite often, issues arise when it's not used in a way conducive to its capabilities, which I'll try to highlight in the Dos & Don'ts below.

Dos
  1. Inject the mapping interface IMapper (previously IMappingEngine) to map the objects, instead of calling the static Mapper.
  2. Create the mappings in classes inheriting from AutoMapper's Profile class, using its CreateMap method in the constructor (in the latest version).
  3. Use the AssertConfigurationIsValid() method on the mapper configuration after all the profiles have been added. It should also be used in the Profiles' unit tests. This method raises an exception with detailed messages if any property of the destination class has not been mapped. Not perfect, but still indispensable.
  4. Map each property individually, even if the source property has the same name & type. This is very useful in the long run for the maintainability & traceability of the code.
  5. When mapping the properties of an object, order them alphabetically; it makes things easier to track as the objects grow bigger.
  6. Tell, don’t ask. The FullName mapping in the first example could instead be exposed as a property on the source class itself, as long as types from one layer don’t seep into another.
  7. Use custom value resolvers and custom type converters. If AutoMapper is set up properly, services can be injected into these to allow complex mappings. This prevents partially mapping classes in AutoMapper and then calling services to map the remaining properties in the calling method.
  8. Before and After Map actions can be used to tackle complex mappings that need setup or manipulation after mapping, which is not the concern of the calling object.
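A sketch of point 7, a custom value resolver with an injected service; ITaxService and the property names are assumptions for illustration, and the MapFrom&lt;TResolver&gt;() form is from recent AutoMapper versions:

```csharp
// The IoC container constructs the resolver, injecting the service:
public class TaxResolver : IValueResolver<OrderDto, Order, decimal>
{
    private readonly ITaxService _taxService;

    public TaxResolver(ITaxService taxService) => _taxService = taxService;

    public decimal Resolve(OrderDto source, Order destination,
                           decimal destMember, ResolutionContext context)
        => _taxService.CalculateTax(source.NetTotal);
}

// In the Profile, the property is mapped via the resolver:
CreateMap<OrderDto, Order>()
    .ForMember(d => d.Tax, o => o.MapFrom<TaxResolver>());
```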

Don'ts
  1. Don’t use the static API. Using Mapper.CreateMap & Mapper.Map prevents injecting AutoMapper, which means all mappings have to be set up for unit tests that call AutoMapper, instead of being mocked out, which can be a major burden as projects get larger.
  2. The convention based automatic mapping of same named properties should be avoided for better maintainability & easier tracking of properties.
  3. Avoid the one-line ignoring of all unmapped properties for objects you own. That feature should only be used for large third-party objects where we don’t care what may change. For the classes we own, the properties should be ignored individually for better tracking and maintenance.
  4. Avoid database calls directly from AutoMapper code, of which I’ve seen examples on the internet. AutoMapper should only call services, if needed to map objects.
  5. Personally, I avoid the Queryable extensions of AutoMapper, which allow calling AutoMapper projections while querying the database using an ORM like NHibernate or Entity Framework. Getting the objects from the database & then mapping them explicitly is more maintainable IMHO.
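A sketch of the individual ignoring recommended in Don't #3; the property names are assumed. Each skipped property stays visible in the Profile, and AssertConfigurationIsValid() still guards all the others:

```csharp
CreateMap<CustomerDto, Customer>()
    .ForMember(d => d.AuditInfo, o => o.Ignore())   // set by the persistence layer
    .ForMember(d => d.RowVersion, o => o.Ignore()); // concurrency token, not mapped
```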

What the user needs

“… the standard search for the reports, this should be quicker than usual, we can reuse the search functionality we removed earlier…” Nearing completion of the marketing preferences system, this suggestion came up during the discussion for the inevitable reporting requirements.

It occurred to me that the reason for removal of the search functionality – searching for marketing preferences by first/last name etc – would be relevant to the reporting as well. Just as no one would be interested in searching for marketing preferences of users with some first/last name, there should be no need to put together reports for such conditions.

All the marketing department would like to know is data relevant to the campaigns they put out i.e. the total numbers of people who signed up or said no for a particular channel (email/post/sms/phone) would suffice. So a few text boxes or select lists to allow the selections and printing out the number with the conditions would do the trick. The marketing department agreed and the work took a few days instead of weeks.

The Scrum process tends to get people set in a ‘mode’, for want of a better word. Stories generated by a single person (Product Owner or Analyst), maybe looked at by one or two other people (a senior dev or architect), are fed into the gristmill for breakdown, estimation et al. Quite often what is missed, as above, are the actual needs of the users: the things they may not be able to tell you, since they don’t know what’s possible.

It’s quite possible the reporting system mentioned above would’ve been built like all the others, and the users would’ve accepted it, since they wouldn’t have known better. Weeks would’ve been spent writing code, testing & sorting out the inevitable bugs, of which there would likely be more in a more complex interface.

The best way to avoid such wastage is to be in sync with the users. As many of the team as possible should know them as well as possible. Sit with them when they work, chat with them regularly and have them with you in as many stand ups/meetings as possible.

Unless the team members, or at least some of them, can recall relevant details easily and can lay out the processes, both the business and the code side, from memory, they won’t be knowledgeable enough to bring up the options their users haven’t thought of, nor will they have the confidence to do so.

That tomorrow was today

What was going to be yet another lazy day turned out to be the “tomorrow” I promised to restart my running, and yesterday affirmed to someone.

And so, on a somewhat gloomy day nearly a month after my last run, I ended up running almost 10K today.


The “almost” came about because my trusted Bluetooth thingy decided to ditch me midway. The experience was a bit surreal. I don’t remember the last time I ran without music, or without updates on how I’m doing: distance every 5 minutes and timing every kilometre.


I honestly can’t say which one is better – the silence makes me appreciate the run a lot more and be in tune with my physical self, but my mind does tend to wander more with nothing to listen to; plus it’s difficult to push myself if I can’t get (hear?) the results at short intervals.

So I was fairly upbeat when I started my 40-minute drive to work. Then came the Apple intervention (for want of a better phrase) in the form of “I’ll meet you at midnight” as the first song of my drive.

A peppy but melancholy song, it was followed by a remix of “Tu hi meri shab hai“, another (Hindi) one in the same vein, and one which has memories of a close relative no longer with us attached to it. (auto correct nearly drove me crazy respelling the Hindi song as “Tu hi mere stab hai”)

The contemplative mood that set in led me to remember some forgotten losses and continuation of some wanderings my mind went to during the “silent” part of my run.

Aah! An emotional wringer, and the day had just about begun.

Friday frivolities – the Perspective Man strikes

Winding down hours last Friday, I was thankful to have my head down coding after days of analysis. I noticed Chris, the senior-most infrastructure guy, lurking around suspiciously long on Nick’s (the dev team leader who sits diagonally behind me) empty desk. I briefly considered challenging him but was having way too much fun coding my way to the first (jubilatory) end to end working call of a service I was working on.


A while later, my euphoria was cut short by Nick’s growl: “Who’s been messing with my desk, I can’t move my $%@*%£# mouse?” Turns out the mouse had been very firmly glued to the desk by the aforementioned Chris. Cue general hilarity while Nick and others, including me, tried to move the mouse. The mouse was firm in its Corbyn/Trump-esque refusal to yield to increasing amounts of force. After a frustrating amount of time (& swearing), Nick bit the bullet and employed a careful but firm kick (yes, believe it: it was an actual boot-to-the-mouse kick) to get it unstuck.

So far so good; comments ranged across the usual spectrum and threats of retribution abounded. Picking up on Chris’ dog, I proffered a waking-up-with-a-dog’s-head solution, which was met with the standard hilarity: till the Perspective Man stepped in.

A word about Andy: a proficient & extremely helpful senior developer, he also has deep pockets of knowledge across a wide range of non-software subjects. I’m talking about a level/range from sub-atomic forces to the effect of soil types on growing orchids. Yup: areas no-one knows they (definitely) don’t care about till they hear about them. That hasn’t stopped him from providing out-of-the-blue perspectives on situations, leaving gobsmacked listeners in his wake.

So the situation: retaliation against a guy, a dog and maybe a horse thrown in metaphorically.

Andy’s take: “You know, there’s not much meat in a dog’s head. With a horse’s head I could feed my family for a week probably.”

That brought the house down. The earlier raucousness, which had been dying down into the gentle end-of-the-working-week hubbub, was re-energised. So the horse’s dead, and they may be coming after you next, but the family food is sorted for the week – overall a mixed day! How’s that for a fresh perspective?

As is usual, the Perspective Man was not finished – providing us with something not featured in Jamie’s kitchen:


Dinner is served!

Double figures run (finally)

Run on 05 Apr 2015

My 10 miles run on 05 Apr 15

For last Sunday’s run, recorded here, the route was simple, run 5 miles, run back!

It went so well I finished with nearly the exact pace I was aiming for:

Timing for my 05 Apr 15 run

My last long(er) run was half a decade ago – 14 Feb 2010 to be exact, when I managed to stumble across the finish line of the Liversedge half marathon:

Timing 2010 Liversedge half marathon

The timing, as mentioned in the picture, was after hitting the wall around the 8-mile mark. For the last 5 miles, it took all my willpower to keep “running” (i.e. not walk) – fun times! Things didn’t get better after that. I hadn’t started recording my runs so… a few longer runs later I pulled a leg muscle, a week or two before the marathon I was supposed to run.

Distances run

Long story short, I’ve not had a really long, satisfying run since then, for whatever reason, as can be witnessed on my Runtastic statistics page, shown here.

This 10 miles run may not be noteworthy, but running that distance easily has given me more confidence than I ever imagined. Suddenly running a marathon is not so damn daunting.

As the pace was not taxing, I didn’t hit the wall thingy even with just one energy gel, but now I have the option to take more. If you’re not aware, energy gels are used to provide immediate energy to the body and stave off hitting the wall for as long as possible. I wasn’t aware of these when I ran at Liversedge; after all, I’d run 8 miles without any problem – what could possibly go wrong!

I tried out faster paces in the gym yesterday – down to 4 min/km for a short while. It does seem I can handle a bit more this Sunday, when I run the Sheffield half marathon – hopefully there’ll be a better performance to talk about.