Dependency Injection in Hadoop Mapper
This is a common dilemma on Hadoop because the Mapper and Reducer are handed to you by the framework, so you never get to construct them yourself. I found it best to call out to a lightweight DI framework from the setup() methods; see my blog post about Dependency Injection on Hadoop. I wrote a single class to handle DI, called Spit-DI, which is available on GitHub and uses the JSR-250 @Resource annotation for injection.
It ends up looking like this:
import javax.annotation.Resource;
import org.apache.hadoop.mapreduce.Counter;
import org.apache.hadoop.mapreduce.Mapper;

class MovieMapper extends Mapper {
    @Resource
    private Movie movie;

    @Override
    protected void setup(Context context) {
        DependencyInjector.instance().using(context).injectOn(this);
    }
}

class Movie {
    @Resource
    private Counter numMoviesRequested;

    public Integer getYear(String title) {
        numMoviesRequested.increment(1);
        // more code...
        return null; // placeholder
    }
}

/**
 * You can have a wrapper class around Spit-DI for all your configuration.
 * (We have a TestDependencyInjector as well for the context of unit testing.)
 */
class DependencyInjector {
    private static final DependencyInjector INSTANCE = new DependencyInjector();
    private SpitDI spit = new SpitDI();

    public static DependencyInjector instance() {
        return INSTANCE;
    }

    public void injectOn(Object instance) {
        spit.inject(instance);
    }

    public DependencyInjector using(final Mapper.Context context) {
        spit.bindByType(Movie.class, new Movie());
        spit.bindByName(Counter.class, "numMoviesRequested",
                context.getCounter("movies", "numMoviesRequested"));
        return this;
    }
}
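Since the counter binding is the only piece tied to a live Context, the test-side injector just needs a counter it created itself. Here is a minimal sketch of what such a TestDependencyInjector could look like, assuming Hadoop's Counters class can mint a detached Counter for assertions (the withDefaults() and counter() helpers are illustrative, not Spit-DI API):

import org.apache.hadoop.mapreduce.Counter;
import org.apache.hadoop.mapreduce.Counters;

class TestDependencyInjector {
    private final SpitDI spit = new SpitDI();
    // Counters detached from any real Mapper.Context, so tests can assert on them.
    private final Counters counters = new Counters();

    // Illustrative helper: look up (or lazily create) a counter by group/name.
    public Counter counter(String group, String name) {
        return counters.findCounter(group, name);
    }

    // Bind the same names as the production DependencyInjector.
    public TestDependencyInjector withDefaults() {
        spit.bindByType(Movie.class, new Movie());
        spit.bindByName(Counter.class, "numMoviesRequested",
                counter("movies", "numMoviesRequested"));
        return this;
    }

    public void injectOn(Object instance) {
        spit.inject(instance);
    }
}

A test can then inject a Movie, exercise it, and assert on injector.counter("movies", "numMoviesRequested").getValue() without spinning up a Hadoop job.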
The default injection strategy in Spring is by type. In your case you can't use injection by type because you have two different implementations of the same interface. Instead, you can use the following strategy to inject these objects (I'm assuming you have an XML configuration for Spring):
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context.xsd">

    <context:annotation-config/>

    <bean class="example.MyFirstImpl">
        <qualifier value="first"/>
    </bean>

    <bean class="example.MySecondImpl">
        <qualifier value="second"/>
    </bean>

    <bean class="example.TestComponent"/>
</beans>
Then in the class where you need the dependency you can use:
package example;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;

public class TestComponent {
    @Autowired
    @Qualifier("first")
    private MyInterface myInterface;
}
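For reference, the interface and the two implementations the XML refers to might look like this (the names come from the bean definitions above; the method body is just a placeholder):

package example;

// MyInterface.java
public interface MyInterface {
    String describe();
}

// MyFirstImpl.java -- matches the bean with <qualifier value="first"/>
public class MyFirstImpl implements MyInterface {
    public String describe() {
        return "first";
    }
}

// MySecondImpl.java -- matches the bean with <qualifier value="second"/>
public class MySecondImpl implements MyInterface {
    public String describe() {
        return "second";
    }
}

Because TestComponent asks for @Qualifier("first"), Spring injects MyFirstImpl; switching the qualifier to "second" selects the other implementation.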