Monday, December 22, 2014

Authenticate through Active Directory in Spring

There is a web application that uses very basic authentication, or I should say it was pre-coded in the Spring configuration, as shown below:
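A minimal sketch of such a basic setup in the Spring Security namespace (the user name, password, and URL patterns here are placeholders, not the real ones):

```xml
<security:http auto-config="true">
    <security:intercept-url pattern="/faces/**" access="ROLE_USER"/>
    <security:form-login login-page="/faces/login.xhtml"/>
</security:http>

<security:authentication-manager>
    <security:authentication-provider>
        <security:user-service>
            <security:user name="user" password="secret" authorities="ROLE_USER"/>
        </security:user-service>
    </security:authentication-provider>
</security:authentication-manager>
```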

Assuming the front end code is using JSF:
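A JSF login form would wire a command button to the backing bean method below. With prependId="false" and the j_username/j_password ids, the request parameters arrive under the names the Spring Security filter expects; the bean name loginBean is my own invention:

```xml
<h:form prependId="false">
    <h:inputText id="j_username" value="#{loginBean.username}"/>
    <h:inputSecret id="j_password" value="#{loginBean.password}"/>
    <h:commandButton value="Login" action="#{loginBean.doLogin}"/>
</h:form>
```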
 
When the user triggers the login, the following code gets executed.
 public String doLogin() throws IOException, ServletException {
  
  ExternalContext context = FacesContext.getCurrentInstance().getExternalContext();
  RequestDispatcher dispatcher = ((ServletRequest) context.getRequest()).getRequestDispatcher("/j_spring_security_check");
  dispatcher.forward((ServletRequest) context.getRequest(), (ServletResponse) context.getResponse());
  FacesContext.getCurrentInstance().responseComplete();
  
  return null;
 }
The code above is a great help: it performs the authentication as defined in the Spring configuration, and also validates which URLs are allowed to be accessed. Thanks to Spring, I saved a great deal of effort, as I didn't need to write crazy authentication logic myself. Now I want to convert this mechanism to a more elegant one, which is to integrate this application with Active Directory. This sounded easy to me, as there are plenty of tutorials on the Internet about Spring Security with LDAP. Wait!! That is LDAP, not Active Directory, and the difference was clueless to me. It took me several passes through the Spring Security documentation before I realized there is a dedicated provider for this - ActiveDirectoryLdapAuthenticationProvider.
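Wiring it up takes just two constructor arguments, the AD domain and the server URL; the domain and host below are placeholders:

```xml
<bean id="adAuthProvider"
      class="org.springframework.security.ldap.authentication.ad.ActiveDirectoryLdapAuthenticationProvider">
    <constructor-arg value="mycompany.com"/>
    <constructor-arg value="ldap://ad.mycompany.com:389"/>
</bean>

<security:authentication-manager>
    <security:authentication-provider ref="adAuthProvider"/>
</security:authentication-manager>
```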
Somehow the code above is not yet perfect, as it will search through all the entries, which might be trouble for me if the entries are huge. I haven't got a clue how to filter based on a certain group, but for now, let's get things working. Since I'm dedicated to letting Spring handle this for me, it is also easier when it comes to exception handling. The configuration is shown below:
   <bean id="authenticationFailureHandler"
         class="org.springframework.security.web.authentication.ExceptionMappingAuthenticationFailureHandler">
      <property name="exceptionMappings">
         <props>
            <prop key="org.springframework.security.authentication.BadCredentialsException">/faces/badCredential.xhtml</prop>
            <prop key="org.springframework.security.authentication.CredentialsExpiredException">/faces/credentialsExpired</prop>
            <prop key="org.springframework.security.authentication.LockedException">/faces/accountLocked</prop>
            <prop key="org.springframework.security.authentication.DisabledException">/faces/accountDisabled</prop>
         </props>
      </property>
      <property name="defaultFailureUrl" value="/faces/unauthorized"/>
   </bean>
Note that I have declared this bean as authenticationFailureHandler. In order to get that piece to work, I need to tie this bean up in authentication-failure-handler-ref, as shown below:
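The handler is referenced from form-login; the login-page value here is a placeholder:

```xml
<security:form-login login-page="/faces/login.xhtml"
                     authentication-failure-handler-ref="authenticationFailureHandler"/>
```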
Great! My code is able to handle the failure results. Let's move on. Upon successful authentication, authorization needs to be granted. The challenging part is that I'm not using the groups provided in LDAP to categorize access roles; instead I'm making a custom one. In other words, I have my data stored in a database that governs the rules of who has access and who doesn't. The best place to put this code is via authentication-success-handler-ref of form-login, and then tie this up with a Spring bean as shown below:
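The bean definition and the reference to it would look roughly like this; the authorizationBo bean id is taken from the code further down, the package and login-page values are placeholders:

```xml
<bean id="authenticateSuccessHandler" class="org.huahsin68.AuthenticateSuccessHandler">
    <property name="authorizationBo" ref="authorizationBo"/>
</bean>

<security:form-login login-page="/faces/login.xhtml"
                     authentication-success-handler-ref="authenticateSuccessHandler"/>
```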
AuthenticateSuccessHandler is a custom-made class that extends SavedRequestAwareAuthenticationSuccessHandler. It provides me the facility to set the default page to go to after a successful login. In my case, I have the user status stored in a database, and authorizationBo is the guy responsible for retrieving that status. If the status returns true, I grant this user ROLE_USER; otherwise no role is granted. And I'm using the Authentication object in onAuthenticationSuccess() to determine which landing page a user should go to after the authentication process. The code below reveals the logic behind the scenes:
public class AuthenticateSuccessHandler extends SavedRequestAwareAuthenticationSuccessHandler {

 // wired in via Spring; the type name AuthorizationBo is my assumption
 private AuthorizationBo authorizationBo;

 @Override
 public void onAuthenticationSuccess(final HttpServletRequest request, final HttpServletResponse response, final Authentication authentication) throws ServletException, IOException {
  
  if( authentication.getPrincipal() instanceof LdapUserDetailsImpl ) {
   
   boolean status = authorizationBo.checkAuthorization(authentication.getName());
   
   if( status ) {
     List<GrantedAuthority> grantedAuth = new ArrayList<>();
    grantedAuth.add(new SimpleGrantedAuthority("ROLE_USER"));
    UserDetails userDetails = new User(authentication.getName(), "", grantedAuth);
    
    SecurityContextHolder.getContext().setAuthentication(new UsernamePasswordAuthenticationToken(userDetails, "", grantedAuth));
   }
  }
  
  if( authentication.getName().equals("admin") )
   setDefaultTargetUrl("/faces/administrative.xhtml");
  else
   setDefaultTargetUrl("/faces/dashboard.xhtml");
  
  super.onAuthenticationSuccess(request, response, authentication);
 }
}
Finally, I have covered both the success and failure parts. My initial thought was to do some crazy logic in doLogin(), such as authenticating the user and then granting the user role, but in the end I didn't. Spring is a great assistant in this area; just let Spring handle it all.

Tuesday, December 16, 2014

Why cmake isn't compiling?

Something caught my attention recently: I just came across CMake, and I find it interesting while having a refresher on C++. When I'm coding, I rely heavily on the IDE, especially Visual Studio C++ 6.0; I used to master every single shortcut key on that tool. However, I wasn't aware there is a process working behind the scenes until I learned of it one day, and I became very curious about it.

When I got to know Makefiles, I found that this is the “engine” working behind the scenes before the compiler can even start to compile. After a long learning curve on Makefiles, I realized this tool is definitely not for an idiot! It is a tool that only a genius would know how to use. Then I found CMake, and things finally seemed reasonable to understand.

But be cautious; I made some stupid mistakes when I first used this tool:
  1. I accidentally misspelled the file name as CMakeList. The correct file name is CMakeLists.txt: it ends with an s and has a txt extension.
  2. I missed the path-to-source argument when issuing the cmake command. The correct format is cmake . and this tells cmake that the source files can be found at the current level.
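To get going, all that is needed is a CMakeLists.txt as small as this (project and file names are my own example):

```cmake
cmake_minimum_required(VERSION 2.8)
project(Hello)

# Build an executable named "hello" from a single source file.
add_executable(hello main.cpp)
```

Then, from the source directory, cmake . followed by make will generate the Makefile and build the executable.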

One nice thing about this tool is that it generates a Makefile for me, which saves me a lot of time. Besides the Makefile, it also generates other files whose usage I don't yet know.

Monday, December 1, 2014

NoSuchMethodError on org.hamcrest.Matcher.describeMismatch

I found an interesting error when the following code is used in my unit test:

Assert.assertThat(((LoggingEvent) loggingEvent.getValue()).getLevel(), CoreMatchers.is(Level.DEBUG));

The following stack trace could be seen:
java.lang.NoSuchMethodError: org.hamcrest.Matcher.describeMismatch(Ljava/lang/Object;Lorg/hamcrest/Description;)V 
	at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18) 
	at org.junit.Assert.assertThat(Assert.java:865) 
	at org.junit.Assert.assertThat(Assert.java:832) 
	at org.huahsin68.EmployeeImplTest.test(EmployeeImplTest.java:62) 
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
	at java.lang.reflect.Method.invoke(Method.java:622) 
	at org.junit.internal.runners.TestMethod.invoke(TestMethod.java:68) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$PowerMockJUnit44MethodRunner.runTestMethod(PowerMockJUnit44RunnerDelegateImpl.java:310) 
	at org.junit.internal.runners.MethodRoadie$2.run(MethodRoadie.java:88) 
	at org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:96) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$PowerMockJUnit44MethodRunner.executeTest(PowerMockJUnit44RunnerDelegateImpl.java:294) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl$PowerMockJUnit47MethodRunner.executeTestInSuper(PowerMockJUnit47RunnerDelegateImpl.java:127) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit47RunnerDelegateImpl$PowerMockJUnit47MethodRunner.executeTest(PowerMockJUnit47RunnerDelegateImpl.java:82) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$PowerMockJUnit44MethodRunner.runBeforesThenTestThenAfters(PowerMockJUnit44RunnerDelegateImpl.java:282) 
	at org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:86) 
	at org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:49) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.invokeTestMethod(PowerMockJUnit44RunnerDelegateImpl.java:207) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.runMethods(PowerMockJUnit44RunnerDelegateImpl.java:146) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl$1.run(PowerMockJUnit44RunnerDelegateImpl.java:120) 
	at org.junit.internal.runners.ClassRoadie.runUnprotected(ClassRoadie.java:33) 
	at org.junit.internal.runners.ClassRoadie.runProtected(ClassRoadie.java:45) 
	at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.run(PowerMockJUnit44RunnerDelegateImpl.java:118) 
	at org.powermock.modules.junit4.common.internal.impl.JUnit4TestSuiteChunkerImpl.run(JUnit4TestSuiteChunkerImpl.java:101) 
	at org.powermock.modules.junit4.common.internal.impl.AbstractCommonPowerMockRunner.run(AbstractCommonPowerMockRunner.java:53) 
	at org.powermock.modules.junit4.PowerMockRunner.run(PowerMockRunner.java:53) 
	at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50) 
	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) 
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467) 
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683) 
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390) 
	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197) 

An expert told me never to use the Hamcrest that comes bundled with JUnit 4; I'm required to download a separate Hamcrest library (version 1.3 as of this writing), and it must come before JUnit 4 in the build path. What he told me is true.
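If the project happens to be built with Maven (an assumption; the post doesn't show the build file), the ordering advice translates into declaring hamcrest-core before junit:

```xml
<dependency>
    <groupId>org.hamcrest</groupId>
    <artifactId>hamcrest-core</artifactId>
    <version>1.3</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.11</version>
    <scope>test</scope>
</dependency>
```

For what it's worth, junit 4.11 itself depends on hamcrest-core 1.3; the NoSuchMethodError typically shows up when older Hamcrest 1.1 classes bundled inside an earlier junit jar win on the classpath.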

How could I unit test the messages coming from log4j?

I was asked to unit test every single function that I wrote, and I was so unlucky that every function I wrote invokes log4j. I was wondering whether I should cover those in my unit tests as well. There are not many resources to be found on the Internet, maybe because this really is a waste of effort, or because people are more concerned with the logic design than with this. But I was so curious about what the code would look like if I insisted on going for it. For this purpose, I have created a POC as shown below. Let's assume I'm going to unit test funcA():
public class EmployeeImpl { 

 private static Logger log; 

 public static void initLog() { 
   
  log = Logger.getLogger(EmployeeImpl.class); 
  log.setLevel(Level.INFO); 
  log.setAdditivity(false); 
   
  ConsoleAppender ca = new ConsoleAppender(); 
  ca.setWriter(new OutputStreamWriter(System.out)); 
  ca.setLayout(new PatternLayout("%-5p [%t]: %m%n")); 
  log.addAppender(ca); 
 } 
  
 public static void funcA() { 
  log.info("Entering into funcA"); 
 } 
}

Next, I have my unit test targeting funcA() as shown below:
@RunWith(PowerMockRunner.class)
public class EmployeeImplTest { 
  
 @Mock 
 private Appender appenderMock; 
  
 @Captor 
 private ArgumentCaptor loggingEvent; 

 @Test 
 public void test() throws SecurityException, NoSuchFieldException, IllegalArgumentException, IllegalAccessException { 
   
  Logger.getRootLogger().addAppender(appenderMock); 
   
  Field log = EmployeeImpl.class.getDeclaredField("log"); 
  log.setAccessible(true); 
  log.set(null, LogManager.getLogger(EmployeeImpl.class)); 
   
  EmployeeImpl.funcA(); 
   
  Mockito.verify(appenderMock, Mockito.times(1)).doAppend((LoggingEvent) loggingEvent.capture()); 
  Assert.assertThat(((LoggingEvent) loggingEvent.getValue()).getLevel(), CoreMatchers.is(Level.INFO)); 
  Assert.assertThat(((LoggingEvent) loggingEvent.getAllValues().get(0)).getRenderedMessage(), CoreMatchers.equalTo("Entering into funcA")); 
 } 
}
Tada!! The test case above tests two things: 1) ensure the log level is INFO, and 2) ensure the message Entering into funcA was shown. If either one is incorrect, the test case fails. In this case, the result is a pass.

Friday, September 5, 2014

No room to fit Maven-Ear-Plugin in one pom.xml

After I had the WAR file build completed, I went on to build the EAR file. Unfortunately, it doesn't work as it did in Ant. In Ant I can do all sorts of work in just one build.xml, but not in Maven. Too bad, huh?! OK, I understand this can't be done, but I still insisted on trying it. Anyhow, there was still no EAR file being generated after the build. What a joke? No joke; Maven doesn't allow me to do that. Take the following code snippet as my use case:

    <modelVersion>4.0.0</modelVersion>
 
    <groupId>org.huahsin</groupId>
    <artifactId>MyWebService</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>war</packaging>
 
    <name>MyWebService</name>

    <dependencies>
        <dependency>
         <groupId>org.huahsin</groupId>
         <artifactId>MyWebService</artifactId>
         <version>0.0.1-SNAPSHOT</version>
         <type>war</type>
        </dependency>

        ...
        ...

    </dependencies>

    <build>
        <plugins>
            <plugin>
             <groupId>org.apache.maven.plugins</groupId>
             <artifactId>maven-ear-plugin</artifactId>
             <version>2.9.1</version>
             <configuration>
                 <version>7</version>
                 <modules>
                     <webModule>
                         <groupId>org.huahsin</groupId>
                         <artifactId>MyWebService</artifactId>
                         <bundleFileName>MyWebService.war</bundleFileName>
                     </webModule>
                 </modules>
                 <finalName>MyWebService</finalName>
             </configuration>
            </plugin>
 
            ...
            ...
 
        </plugins>
    </build>

The <packaging> element is the cause of the build failure. I mean, it works fine for building the WAR file but not for building the EAR file. Now that I know the root cause, I changed the <packaging> line to ear, but I'm still not satisfied with the final output, because I have a special requirement that this EAR file contain no libraries other than the custom-built library. Right now the EAR file has all the other libraries messed up inside it, and this could be a disaster when I deploy to WebSphere Application Server.

To make it clean, I created another pom.xml that performs only one task, which is the EAR packaging. And this pom.xml contains only one dependency, which is my newly created WAR. To make the picture clear, the WAR file should contain all the libraries, whereas the EAR file should contain only the WAR file. This is the primary objective of having a separate pom.xml; the following code snippet is worth a thousand of these words.

    <modelVersion>4.0.0</modelVersion>

    <groupId>org.huahsin</groupId>
    <artifactId>MyWebServiceEar</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>ear</packaging>

    <name>MyWebServiceEar</name>

    <dependencies>
        <dependency>
         <groupId>org.huahsin</groupId>
         <artifactId>MyWebService</artifactId>
         <version>0.0.1-SNAPSHOT</version>
         <type>war</type>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-ear-plugin</artifactId>
            <version>2.9.1</version>
            <configuration>
                <version>7</version>
                <modules>
                    <webModule>
                        <groupId>org.huahsin</groupId>
                        <artifactId>MyWebService</artifactId>
                        <bundleFileName>MyWebService.war</bundleFileName>
                    </webModule>
                </modules>
                <finalName>MyWebService</finalName>
            </configuration>
            </plugin>
        </plugins>
    </build>
Now I have another question: how could I fit two pom.xml files into one project? I don't know! I just put them somewhere where they won't clash with each other in the project. Am I doing this the right way?
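One conventional answer, which I haven't tried here, is a multi-module build: a parent pom.xml at the top level listing the WAR and EAR projects as modules (the directory names below are my own guess):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.huahsin</groupId>
    <artifactId>MyWebServiceParent</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>pom</packaging>

    <!-- Each module is a sub-directory holding its own pom.xml -->
    <modules>
        <module>MyWebService</module>
        <module>MyWebServiceEar</module>
    </modules>
</project>
```

A single mvn clean install at the top level then builds the WAR first and feeds it into the EAR.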

Wednesday, September 3, 2014

Maven default deployment path locate at target/tomcat/webapps?

In order for me to establish a JNDI connection in Tomcat, I have the data source declared inside server.xml, as shown in the code snippet below:
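Roughly like this; the resource name, database URL, and credentials are placeholders:

```xml
<Context docBase="MyService" path="/MyService">
    <Resource name="jdbc/MyDS" auth="Container"
              type="javax.sql.DataSource"
              driverClassName="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost:3306/mydb"
              username="dbuser" password="dbpass"/>
</Context>
```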
And then I have the JNDI data source declared in Spring like this:
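Along these lines; the jndiName must match the Resource name declared in server.xml (jdbc/MyDS here is a placeholder):

```xml
<bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
    <property name="jndiName" value="jdbc/MyDS"/>
    <!-- resourceRef=true prepends java:comp/env/ to the lookup name -->
    <property name="resourceRef" value="true"/>
</bean>
```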
Let’s do some experimenting with maven-war-plugin. If this plugin goes missing, an error message mentioning Name [comp/env] is not bound in this Context. Unable to find [comp]. could be seen, as shown in the following stack trace:
javax.naming.NameNotFoundException: Name [comp/env] is not bound in this Context. Unable to find [comp].
 at org.apache.naming.NamingContext.lookup(NamingContext.java:820)
 at org.apache.naming.NamingContext.lookup(NamingContext.java:168)
 at org.apache.catalina.deploy.NamingResources.cleanUp(NamingResources.java:988)
 at org.apache.catalina.deploy.NamingResources.stopInternal(NamingResources.java:970)
 at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:232)
 at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5495)
 at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:232)
 at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:160)
 at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
 at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
 at java.util.concurrent.FutureTask.run(FutureTask.java:262)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:745)
If the plugin is there (as shown in the following code snippet) but without any configuration being done, the above error message could be seen too.
 <plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.2</version>
 </plugin>
Now, if <outputDirectory> is configured on the plugin as shown in the following code snippet:
 <plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.2</version>
  <configuration>
   <outputDirectory>${project.basedir}/target/tomcat/webapps</outputDirectory>
  </configuration>
 </plugin>
Tada! It works. Now change to <warSourceDirectory> in the configuration as shown in the following code snippet and start up Tomcat; the same error could be seen again.
 <plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.2</version>
  <configuration>
   <warSourceDirectory>${project.basedir}/src/main/webapp/</warSourceDirectory>
  </configuration>
 </plugin>
Try making some adjustments to the docBase attribute of <Context> in server.xml as shown in the following code snippet, and it will work again.
     
         
    ...

     
What a surprise! If I pay close attention to the stack trace, I notice the MyService folder actually wasn’t there, as mentioned in another message in the trace shown below:
java.lang.IllegalArgumentException: Document base C:\MyService\target\tomcat\webapps\MyService does not exist or is not a readable directory
 at org.apache.naming.resources.FileDirContext.setDocBase(FileDirContext.java:140)
 at org.apache.catalina.core.StandardContext.resourcesStart(StandardContext.java:4906)
 at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5086)
 at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
 at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
 at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
 at java.util.concurrent.FutureTask.run(FutureTask.java:262)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 at java.lang.Thread.run(Thread.java:745)
Another lesson I got was that whenever the Maven build is triggered, Maven expects the output to be deployed into the target/tomcat/webapps folder. Since I didn’t configure <outputDirectory> in pom.xml, Maven goes for the default path when searching for my web app. Adjusting the docBase attribute two levels up tells Maven, "Hey! Go search for my app there".

Tuesday, September 2, 2014

Let’s see how profiles could help in a Maven build

Take the following code snippet as my use case:

 <build>
  <finalName>${project.name}</finalName>

  <plugins>
   <plugin>
    <groupId>org.apache.tomcat.maven</groupId>
    <artifactId>tomcat7-maven-plugin</artifactId>
    ...
    ...
   </plugin>

   <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    ...
    ...
   </plugin>

   <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    ...
    ...
   </plugin>
  </plugins>
 </build>

In most situations, I just run mvn clean install tomcat7:run to bring up my application instance. But in some situations, I would like some special configuration for the WAR packaging that differs from the regular build, for example, excluding a particular JAR from the build during the packaging stage. If I use the regular configuration shown in the code snippet above, I will hit an error when I bring up my application instance, because some libraries go missing during the packaging stage.

In order to play well with both mvn clean install tomcat7:run and mvn clean package, my colleague suggested that I use profiles for this situation. As shown in the following code snippet, another piece of maven-war-plugin configuration is created inside a <profile>:
 <profiles>
    <profile>
       <id>CUSTOM</id>
       <build>
          <plugins>
             <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <version>2.2</version>
                <configuration>
                   <packagingExcludes>%regex[WEB-INF/lib/(?!mycustomlibrary).*.jar]</packagingExcludes>
                </configuration>
             </plugin>
          </plugins>
          <finalName>myfinalName</finalName>
       </build>
    </profile>
 </profiles>
With this new configuration, the regular build process continues to work as expected, while I have another separate piece of configuration for a custom packaging build. That separate piece needs to be invoked with the following command:

mvn clean package -PCUSTOM
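As an aside, the %regex value in <packagingExcludes> relies on a negative lookahead: every jar under WEB-INF/lib is excluded except those whose name starts with mycustomlibrary. A quick sanity check in plain Java (the jar file names are invented):

```java
import java.util.regex.Pattern;

public class RegexCheck {
    public static void main(String[] args) {
        // Same pattern as in <packagingExcludes>: matches (and therefore excludes)
        // any jar in WEB-INF/lib that does NOT start with "mycustomlibrary".
        Pattern p = Pattern.compile("WEB-INF/lib/(?!mycustomlibrary).*.jar");
        System.out.println(p.matcher("WEB-INF/lib/commons-io-2.4.jar").matches());      // true, excluded
        System.out.println(p.matcher("WEB-INF/lib/mycustomlibrary-1.0.jar").matches()); // false, kept
    }
}
```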

Do take note of the <finalName> usage; it allows me to specify a custom package output file name that is different from the regular ugly Maven-style name.