Sourcetree: resolve conflicts with an external tool


Sourcetree supports several external tools to compare differences and resolve conflicts between files; by default you can pick from a list of well-known tools.

In this entry I’m going to show how to use it with Meld.

Meld is a visual diff and merge tool targeted at developers. Meld helps you compare files, directories, and version controlled projects.

On Linux systems it’s very useful and it’s my preferred tool (at least when I can’t use the diff and merge tools from my IDE).

How to configure

Go to the main menu and select Tools > Options > Diff tab. For the External Diff option, pick Custom; in the Diff Command field, set the path to your Meld binary (I’m on Windows, so in my case it’s Meld.exe). Then, in the Arguments field, set this:

"$LOCAL" "$REMOTE"

Now for the Merge Tool, select the same binary (Meld.exe) and set this in the Arguments field:

 
--auto-merge "$LOCAL" "$BASE" "$REMOTE" --output="$MERGED"

(Sourcetree substitutes $LOCAL, $BASE, $REMOTE, and $MERGED with the temporary paths of each version of the file.)

Now if you choose any file in your project you can check the diff by selecting External Diff (or pressing Ctrl+D) on the file, and Meld is launched:


It’s the same for files in conflict: select your file and, from the context menu, choose Resolve Conflicts > Launch External Merge Tool; now you can view the differences between versions side by side.


And that’s all. Cheers!

 


Spring and JPA with two data sources (with annotations)

A few days ago I received a comment from my friend @sock_osg (you can follow him on Twitter); he recommended that I rewrite my previous post using annotations.

And well, here it is. But there are a few more things to say about it. For example, in the previous post there were no transactional capabilities between the databases, because I wasn’t using the JTA transaction type.

Now, for this example, I wrote the code to support transactions between multiple databases with different data sources, following what I read on Stack Overflow:

if you find yourself with multiple entity managers, with corresponding tx managers, then you should consider using a single JtaTransactionManager instead. The entity managers should be able to participate in JTA transactions, and this will give you full transactionality across both entity managers, without having to worry about which entity manager you’re in at any one time.

You can download all the code here; it’s hosted on my GitHub account. Pull the code from the branch named “annotations”.

Let’s review the most important files, starting with the DAO classes: Continue reading
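Since this excerpt cuts off before the code, here is a hedged sketch of what the annotation-based wiring might look like. Bean names, the H2 URL, and the packages are assumptions carried over from the XML version of the configuration; the real code lives in the “annotations” branch:

```java
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
public class PersistenceConfig {

    @Bean
    @Primary
    public DataSource dataSource1() {
        // Values would normally come from app-props.properties
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("org.h2.Driver");
        ds.setUrl("jdbc:h2:mem:db1");
        return ds;
    }

    @Bean
    @Primary
    public LocalContainerEntityManagerFactoryBean entityManagerFactory1() {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setPersistenceUnitName("unit1");
        emf.setDataSource(dataSource1());
        emf.setPackagesToScan("org.oz.persistence.dao.db1");
        return emf;
    }

    @Bean
    public PlatformTransactionManager transactionManager1(EntityManagerFactory entityManagerFactory1) {
        return new JpaTransactionManager(entityManagerFactory1);
    }

    // ...plus the mirrored beans: dataSource2, entityManagerFactory2
    // (unit2, org.oz.persistence.dao.db2) and transactionManager2
}
```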

Spring and JPA with two data sources

JPA is the most widely used standard in Java for managing persistence, and I’ve always wanted to write about this topic.

In this entry I want to share how to configure a Java project to use JPA with Spring and two data sources; you can view the full source of this project in my GitHub repository.

Most of the configuration is in Spring; in the application context you need to declare:

  • 2 Data Sources
  • 2 Entity Manager Factories
  • 2 Transaction Managers

The configuration is mostly self-explanatory:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:tx="http://www.springframework.org/schema/tx"
       xmlns:p="http://www.springframework.org/schema/p"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:util="http://www.springframework.org/schema/util"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/context
        http://www.springframework.org/schema/context/spring-context.xsd
        http://www.springframework.org/schema/tx
		http://www.springframework.org/schema/tx/spring-tx.xsd
        http://www.springframework.org/schema/util
        http://www.springframework.org/schema/util/spring-util.xsd
        http://www.springframework.org/schema/jdbc
        http://www.springframework.org/schema/jdbc/spring-jdbc.xsd">

    <context:property-placeholder location="classpath:app-props.properties" />

    <context:component-scan base-package="org.oz" />
    <context:annotation-config/>

    <bean id="pum" class="org.springframework.orm.jpa.persistenceunit.DefaultPersistenceUnitManager">
        <property name="persistenceXmlLocations">
            <list>
                <value>classpath:META-INF/persistence.xml</value>
            </list>
        </property>
        <property name="dataSources">
            <map>
                <entry key="localDataSource" value-ref="dataSource1"/>
                <entry key="remoteDataSource" value-ref="dataSource2"/>
            </map>
        </property>

        <!-- if no datasource is specified, use this one -->
        <property name="defaultDataSource" ref="dataSource1"/>
        <property name="defaultPersistenceUnitName" value="unit1"/>

    </bean>

    <bean id="dataSource1" class="org.springframework.jdbc.datasource.DriverManagerDataSource" lazy-init="true" primary="true"
          p:driverClassName="${jdbc1.driver}"
          p:url="${jdbc1.url}"
          p:username="${jdbc1.user}"
          p:password="${jdbc1.pass}"
    />

    <bean id="entityManagerFactory1" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
          p:persistenceXmlLocation="classpath:META-INF/persistence.xml"
          p:persistenceUnitName="unit1"
          p:dataSource-ref="dataSource1"
          p:packagesToScan="org.oz.persistence.dao.db1"
          lazy-init="true"/>

    <bean id="transactionManager1" class="org.springframework.orm.jpa.JpaTransactionManager"
          p:entityManagerFactory-ref="entityManagerFactory1"
          lazy-init="true"/>

    <bean id="dataSource2" class="org.springframework.jdbc.datasource.DriverManagerDataSource" lazy-init="true"
          p:driverClassName="${jdbc2.driver}"
          p:url="${jdbc2.url}"
          p:username="${jdbc2.user}"
          p:password="${jdbc2.pass}"
    />

    <bean id="entityManagerFactory2" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
          p:persistenceXmlLocation="classpath:META-INF/persistence.xml"
          p:persistenceUnitName="unit2"
          p:dataSource-ref="dataSource2"
          p:packagesToScan="org.oz.persistence.dao.db2"
          lazy-init="true"/>

    <bean id="transactionManager2" class="org.springframework.orm.jpa.JpaTransactionManager"
          p:entityManagerFactory-ref="entityManagerFactory2"
          lazy-init="true"/>

    <tx:annotation-driven transaction-manager="transactionManager1"  />
    <tx:annotation-driven transaction-manager="transactionManager2" />

    <bean class="org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor"/>

    <bean id="customerDao" class="org.oz.persistence.dao.db1.CustomerDao"/>
    <bean id="productDao" class="org.oz.persistence.dao.db2.ProductDao"/>

</beans>

 

Now in persistence.xml we’re going to declare the two persistence units:

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
             http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">

    <persistence-unit name="unit1" transaction-type="RESOURCE_LOCAL">
        <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>

        <mapping-file>META-INF/native-querys.xml</mapping-file>
        <class>org.oz.persistence.dao.db1.model.Customer</class>

        <exclude-unlisted-classes>true</exclude-unlisted-classes>
        <properties>
            <property name="hibernate.show_sql" value="false" />
            <property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
            <property name="hibernate.connection.driver_class" value="org.h2.Driver"/>
            <property name="hibernate.archive.autodetection" value="class,hbm"/>
            <property name="useUnicode" value="true"/>
            <property name="characterSetResults" value="UTF8"/>
            <property name="characterEncoding" value="UTF8"/>
            <property name="hibernate.format_sql" value="false"/>
            <property name="hibernate.use_sql_comments" value="false"/>
            <property name="hibernate.hbm2ddl.keywords" value="auto-quote"/>
            <property name="hibernate.bytecode.use_reflection_optimizer" value="true"/>
            <property name="hibernate.connection.useUnicode" value="true"/>
            <property name="hibernate.connection.characterEncoding" value="UTF8"/>
            <property name="hibernate.connection.charSet" value="UTF8"/>
            <property name="hibernate.connection.characterSetResults" value="UTF8"/>

            <property name="hibernate.default_schema" value="BASEA"/>
        </properties>

    </persistence-unit>

    <persistence-unit name="unit2" transaction-type="RESOURCE_LOCAL">
        <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>

        <mapping-file>META-INF/native-querys.xml</mapping-file>
        <class>org.oz.persistence.dao.db2.model.Product</class>

        <exclude-unlisted-classes>true</exclude-unlisted-classes>
        <properties>
            <property name="hibernate.show_sql" value="false" />
            <property name="hibernate.dialect" value="org.hibernate.dialect.H2Dialect"/>
            <property name="hibernate.connection.driver_class" value="org.h2.Driver"/>
            <property name="hibernate.archive.autodetection" value="class,hbm"/>
            <property name="useUnicode" value="true"/>
            <property name="characterSetResults" value="UTF8"/>
            <property name="characterEncoding" value="UTF8"/>
            <property name="hibernate.format_sql" value="false"/>
            <property name="hibernate.use_sql_comments" value="false"/>
            <property name="hibernate.hbm2ddl.keywords" value="auto-quote"/>
            <property name="hibernate.bytecode.use_reflection_optimizer" value="true"/>
            <property name="hibernate.connection.useUnicode" value="true"/>
            <property name="hibernate.connection.characterEncoding" value="UTF8"/>
            <property name="hibernate.connection.charSet" value="UTF8"/>
            <property name="hibernate.connection.characterSetResults" value="UTF8"/>

            <property name="hibernate.default_schema" value="BASEB"/>
        </properties>

    </persistence-unit>
</persistence>

Then all we need to do is reference the persistence unit in every DAO to inject the EntityManager:

package org.oz.persistence.dao.db1;

import javax.persistence.*;
import java.util.Collection;

/**
 * Created by <a href="https://twitter.com/jaehoox">jaehoo</a> on 16/03/2018
 */
public class CustomerDao {

    public static final String SEL_TABLES="select.tablesh2";

    @PersistenceContext(unitName = "unit1" , type = PersistenceContextType.TRANSACTION)
    private EntityManager em;

    public Collection loadCustomers() {
        Query query = em.createQuery("FROM Customer");
        return query.getResultList();

    }

    public Collection getTables(){
        return em.createNamedQuery(SEL_TABLES).getResultList();
    }

}

package org.oz.persistence.dao.db2;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.PersistenceContextType;
import javax.persistence.Query;
import java.util.Collection;

/**
 * Created by <a href="https://twitter.com/jaehoox">jaehoo</a> on 16/03/2018
 */
public class ProductDao {

    public static final String SEL_TABLES="select.tablesh2";

    @PersistenceContext(unitName = "unit2", type = PersistenceContextType.TRANSACTION, name = "unit2")
    private EntityManager em;

    public Collection loadProducts() {
        Query query = em.createQuery("FROM Product");
        return query.getResultList();

    }

    public Collection getTables(){
        return em.createNamedQuery(SEL_TABLES).getResultList();
    }

}

The interesting part of this is the use of the transaction managers; let’s take a look at the test class:

package org.oz.persistence.dao.db1;

import lombok.extern.slf4j.Slf4j;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.oz.persistence.dao.db1.model.Customer;
import org.oz.persistence.dao.db2.ProductDao;
import org.oz.persistence.dao.db2.model.Product;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.TestExecutionListeners;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.transaction.TransactionConfiguration;
import org.springframework.test.context.transaction.TransactionalTestExecutionListener;
import org.springframework.transaction.annotation.Transactional;

import javax.annotation.Resource;

import java.util.ArrayList;
import java.util.List;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:app-ctx-test.xml"})
@TransactionConfiguration
@Slf4j
public class CustomerDaoTest {

    @Resource(name = "customerDao")
    private CustomerDao customerDao;

    @Resource(name = "productDao")
    private ProductDao productDao;

    @Test
    @Transactional("transactionManager1")
    public void loadCustomers() throws Exception {

        List customers = (List) customerDao.loadCustomers();

        log.info("customers:{}",customers.size());

        for(Customer c : customers){
            log.info("{}",c);
        }

        List tables = (List) customerDao.getTables();

        log.info("tables:{}",tables.size());
        for(Object c : tables){
            log.info("{}",c);
        }

    }

    @Test
    @Transactional("transactionManager2")
    public void loadProducts() throws Exception {

        List products = (List) productDao.loadProducts();

        log.info("products:{}",products.size());
        log.info("{}",products.get(0));

        List tables = (List) productDao.getTables();

        log.info("tables:{}",tables.size());
        for(Object c : tables){
            log.info("{}",c);
        }

    }

    @Test
    public void queryingTwoSources() throws Exception {

        log.info("getting data from two DS in one method");
        List tables = new ArrayList();

        tables.addAll(productDao.getTables());
        tables.addAll(customerDao.getTables());

        log.info("tables:{}",tables.size());
        for(Object c : tables){
            log.info("{}",c);
        }

    }

}

The first lines load the application context and inject the DAO classes with their Entity Managers.

The first and second methods fetch all records from the Customer and Product tables and list all the database table names; each one uses its corresponding transaction manager (as you can see in the @Transactional annotation). Up to this point everything is clear: you can query data from each database.

But what happens in the third method?

Interesting, right? And the answer is… there is no implicit transaction. Yeah, that simple! It works, but I don’t have transactional capabilities, which means I need to manage each transaction separately.
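Managing them separately would look roughly like this (a sketch only, assuming Spring’s TransactionTemplate wired with the two transaction managers from the application context; this class is not in the repository):

```java
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.support.TransactionTemplate;

public class TwoResourceLocalTransactions {

    private final TransactionTemplate tx1;
    private final TransactionTemplate tx2;

    public TwoResourceLocalTransactions(PlatformTransactionManager transactionManager1,
                                        PlatformTransactionManager transactionManager2) {
        this.tx1 = new TransactionTemplate(transactionManager1);
        this.tx2 = new TransactionTemplate(transactionManager2);
    }

    public void updateBoth(Runnable db1Work, Runnable db2Work) {
        // Each block commits on its own transaction manager: if db2Work
        // fails, db1Work has already committed and cannot be rolled back.
        tx1.execute(status -> { db1Work.run(); return null; });
        tx2.execute(status -> { db2Work.run(); return null; });
    }
}
```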

But I read this on Stack Overflow:

The javadoc for JpaTransactionManager has some advice on this:

“This transaction manager is appropriate for applications that use a single JPA EntityManagerFactory for transactional data access. JTA (usually through JtaTransactionManager) is necessary for accessing multiple transactional resources within the same transaction. Note that you need to configure your JPA provider accordingly in order to make it participate in JTA transactions.”

In other words, if you find yourself with multiple entity managers, with corresponding tx managers, then you should consider using a single JtaTransactionManager instead. The entity managers should be able to participate in JTA transactions, and this will give you full transactionality across both entity managers, without having to worry about which entity manager you’re in at any one time.

Of course, JtaTransactionManager does require a full JTA-supporting application server, rather than a vanilla servlet engine like Tomcat.
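As a rough sketch (not implemented in this repository), the JTA variant would replace the two JpaTransactionManager beans with a single JtaTransactionManager, and both persistence units would switch to transaction-type="JTA":

```xml
<!-- One transaction manager spanning both entity managers (requires a JTA provider) -->
<bean id="transactionManager" class="org.springframework.transaction.jta.JtaTransactionManager"/>
<tx:annotation-driven transaction-manager="transactionManager"/>
```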

Maybe one day I’ll write about how to do this with a server container, but for now this is enough.

Cheers

Sonar Scanner: inspect a multi-module Java project (mixed Eclipse and Maven style) from the command line

Well, recently I needed to review a legacy project to measure its Java code quality.

It’s a project with a multi-module structure based on the old Eclipse style mixed with the Maven structure: I have 3 modules (2 Eclipse-style and 1 Maven-style), all of them used to build the final artifact, and I needed to run a code inspection with SonarQube.

For example:

\tmp\root
├───module1
├───module2
├───mvnmodule
    └───src

Fortunately, Sonar can handle this; let’s get started.

Requirements

  • A sonarqube instance (version 6.6)
  • Sonar scanner tool (version 3.0.3)
  • Compiled classes for the projects to analyze

Steps

  1. Log into your Sonar instance and select your profile (at the top right corner), then click on My profile.
  2. Create a new token in the Security tab: write a name for the token, click Generate, and copy the token string. Be careful: you won’t be able to see this string again.
  3. Configure your Sonar Scanner: open the installation folder (where you uncompressed it) and edit the file conf/sonar-scanner.properties, adding the host and the login token string:
    #Configure here general information about the environment, such as SonarQube DB details for example
    #No information about specific project should appear here
    
    #----- Default SonarQube server
    sonar.host.url=http://localhost:9000
    
    #----- Default source code encoding
    sonar.sourceEncoding=UTF-8
    
    #----- Security (when 'sonar.forceAuthentication' is set to 'true')
    sonar.login=57e0bf00a0af633f5c0534fc72535c16f2f0fc3b
    
    
  4. Create a project configuration file in your source code folder: go to the project folder and create a file named sonar-project.properties, setting the properties for the binaries (compiled classes) and the source code of each module. You need to compile the project and its modules beforehand; I used Eclipse to build each of them. The content of the file looks something like this:
    #Required project data for Sonar
    sonar.projectKey=com.abc:my-project
    sonar.projectName=abc-my-project
    sonar.projectVersion=1.0
    sonar.sourceEncoding=UTF-8
    
    #sonar.modules=PalacioHierro
    sonar.modules=module1,module2,mvnmodule
    
    sonar.java.source=1.7
    
    #Lib Dir (optional), e.g.: path/library.jar,path/to/classes/dir
    sonar.libraries=module2/lib
    
    # Project Language ( by default is Java)
    sonar.language=java
    
    # Properties can obviously be overridden for
    # each module - just prefix them with the module ID
    module1.sonar.projectName=my-project-m1
    module1.sonar.java.source=1.7
    module1.sonar.sources=src,WebContent
    module1.sonar.java.binaries=build/classes
    
    module2.sonar.projectName=my-project-m2
    module2.sonar.java.source=1.7
    module2.sonar.sources=src
    module2.sonar.java.binaries=build/classes
    
    mvnmodule.sonar.projectName=mvn-webapp
    mvnmodule.sonar.java.source=1.7
    mvnmodule.sonar.sources=src/main/java,src/main/resources,src/main/webapp
    mvnmodule.sonar.java.binaries=target/classes
    
    sonar.skipDesign=true
    sonar.skipPackageDesign=true
    sonar.profile=my-profile
    
  5. Start the inspection: open a terminal, go to your root project folder, and execute sonar-scanner. This starts the code inspection (it may take a long time, depending on the size of your projects). At the end you should see something like “ANALYSIS SUCCESSFUL” in the log, and you can browse http://localhost:9000/dashboard/index/com.abc
  6. In the dashboard, under the Code category, you can see the results for the three modules.

That’s it.

Cheers!

Set up a CI server in the cloud for Java projects with code coverage and inspection


Yep! I’ve been waiting a long time to write about this topic, but finally I’m going to show how to set up your own Continuous Integration (CI) server using cloud services, all of them with free accounts.

Just consider that the scope of the functionality is quite limited, because all the services we’re using are free; if you pay for them you can do much more.

I’m not going to explain what a CI server is or what its features are; I’ll only say that CI is a practice popularized by Martin Fowler, and as he mentions on his website:

“Continuous Integration is a software development practice where members of a team integrate their work frequently, usually each person integrates at least daily – leading to multiple integrations per day. Each integration is verified by an automated build (including test) to detect integration errors as quickly as possible. Many teams find that this approach leads to significantly reduced integration problems and allows a team to develop cohesive software more rapidly.”

Well, let’s get started. We’re going to configure and integrate the following services, so you only need your GitHub or Bitbucket account.

The demo project

Continue reading

SAP PI/PO: Enable Web Service Compression

Many people don’t know that web services have a feature to compress the request and response data exchanged between client and server.

The advantage of this feature is that you can send high volumes of information over web services without hurting performance; for example, an XML payload of about 40 MB can be compressed down to around 800 KB.
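The compression in question is standard gzip. As a rough illustration (using a hypothetical, much smaller payload than the 40 MB example), highly repetitive XML shrinks dramatically with java.util.zip:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class GzipDemo {

    // Compress a byte array with gzip, as a SOAP client would compress its payload.
    static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Build a repetitive XML payload (hypothetical structure, roughly 0.5 MB)
        StringBuilder sb = new StringBuilder("<items>");
        for (int i = 0; i < 10000; i++) {
            sb.append("<item><id>").append(i).append("</id><name>widget</name></item>");
        }
        sb.append("</items>");
        byte[] raw = sb.toString().getBytes(StandardCharsets.UTF_8);
        byte[] compressed = gzip(raw);
        // Repetitive XML typically compresses by well over 10x
        System.out.println(compressed.length < raw.length / 10); // prints "true"
    }
}
```

The actual ratio depends on the payload, but structured, repetitive XML is close to a best case for gzip, which is why a 40 MB to ~800 KB reduction is plausible.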

To enable this you need to add some headers in your web service consumer client. In the case of PI/PO, the equivalent is adding these parameters in the SOAP adapter module configuration.

But you need to check which direction of the communication channel is configured; the SAP PI documentation describes the behavior:

The receiver SOAP adapter uses these parameters for the request message; the sender
SOAP adapter uses these parameters for the response message.

If you’re using a Java client with Axis, you can enable this feature by adding these lines to your Out Binding Stub class:

// inside the generated binding stub (uses org.apache.axis.transport.http.HTTPConstants)
org.apache.axis.client.Call _call = createCall();
// other configuration properties...
_call.setProperty(HTTPConstants.MC_ACCEPT_GZIP, Boolean.TRUE); // accept gzip responses
_call.setProperty(HTTPConstants.MC_GZIP_REQUEST, Boolean.TRUE); // send gzip requests
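If your client isn’t Axis-based, the same idea applies to any HTTP client: send a Content-Encoding: gzip header with a gzipped body and advertise Accept-Encoding: gzip for the response. A minimal sketch with the JDK’s HttpURLConnection (the endpoint URL is a placeholder):

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class CompressedSoapCall {

    // Configure a POST whose request body will be sent gzip-compressed and
    // that accepts a gzip-compressed response. openConnection() does not
    // touch the network yet, so this only prepares the request headers.
    static HttpURLConnection prepare(String endpoint) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        conn.setRequestProperty("Content-Encoding", "gzip"); // we send gzip
        conn.setRequestProperty("Accept-Encoding", "gzip");  // we accept gzip back
        return conn;
    }
}
```

The caller would then wrap the connection’s output stream in a GZIPOutputStream before writing the SOAP envelope.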

This is imperceptible in PI/PO: in PIMON the message is shown at its normal size, so you have to check on the web service consumer side to validate whether the request is actually compressed.

Regards