Friday, August 30, 2013

How to migrate SVN with history to a new Git repository?

Magic:
$ git svn clone http://svn/repo/here/trunk
Git and SVN operate very differently. You need to learn Git, and if you want to track changes from SVN upstream, you need to learn git-svn. The git-svn man page has a good examples section:
$ git svn --help
http://stackoverflow.com/questions/79165/how-to-migrate-svn-with-history-to-a-new-git-repository

Git - Won't add files?

I found myself in a similar situation as the poster:
If I call "git add ." and then "git status", it keeps saying "working directory clean" with nothing to commit.
But I had a different solution than what's here. Since I came to this first, I hope to save others some time.
From the above answers and what I've seen elsewhere, the usual fixes to this problem are:
  • Ensure there are actually saved changes on the file in question
  • Ensure the file doesn't meet your exclude rules in .gitignore and .git/info/exclude
  • Ensure you're not trying to add an empty folder. Git won't track those; the standard workaround is to place a blank placeholder file (commonly named .gitignore or .gitkeep) in the folder so Git will track it.
In my case, I had originally tried to create a Git repo around an existing repo (not knowing it was there). I had removed the .git folder from this nested repo a while ago, but I didn't realize that it was too late: Git was already tracking it as a submodule. You can read more about how these behave and how to remove them here, but
  • the solution for me was to simply run git rm --cached path_to_submodule.

Thursday, August 29, 2013

When pushing to remote Git repo using EGit in Eclipse, what should I choose?

You see this screen in the EGit Push URI documentation:
Push Ref Specification
That is where you define the refspecs:
A "refspec" is used by fetch and push operations to describe the mapping between remote Ref and local Ref.
Semantically they define how local branches or tags are mapped to branches or tags in a remote repository.
In native Git they are combined with a colon in the format source:destination, preceded by an optional plus sign (+) to denote a forced update.
In EGit they can be displayed and also edited in tabular form in the Push Ref Specification and the Fetch Ref Specification and other dialogs.
The "left-hand" side of a RefSpec is called source and the "right-hand" side is called destination.
Depending on whether the RefSpec is used for fetch or for push, the semantics of source and destination differ:
For a Push RefSpec, the source denotes a Ref in the source Repository and the destination denotes a Ref in the target Repository.

Push Refspecs

A typical example for a Push RefSpec could be
HEAD:refs/heads/master
This means that the currently checked out branch (as signified by the HEAD Reference, see Git References) will be pushed into the master branch of the remote repository. 
http://stackoverflow.com/questions/10365958/when-pushing-to-remote-git-repo-using-egit-in-eclipse-what-should-i-choose

Problems setting up a dynamic web project in eclipse using Java EE and Tomcat

I created a maven project using the command mvn archetype:generate ... to generate the project structure which I then imported into Eclipse. I then added a dynamic web facet to the project.
You shouldn't have to add any facet, things should just work if your project has a packaging of type war.
It would thus have been nice to provide the full command you used to create your project with the archetype plugin. Did you use the maven-archetype-webapp archetype? Did you run something like this:
mvn archetype:generate -DarchetypeArtifactId=maven-archetype-webapp \
  -DgroupId=my.group.id -DartifactId=my-artifact -Dversion=1.0-SNAPSHOT
If not, then be sure that your pom has a packaging of type war and that you use the default structure for a war project (see Usage for an example).
Then, what plugin are you using for the Eclipse integration? How did you import the project into Eclipse?
If you are using the maven-eclipse-plugin (if you ran eclipse:eclipse), then you need to configure it for WTP support. You need to pass the wtpversion on the command line (or to configure the plugin in the POM):
mvn -Dwtpversion=2.0 eclipse:eclipse
If you are using m2eclipse, then just import your project as a Maven Project (right-click in the Package Explorer, then Import... > Maven Projects).
In both cases, your project should be recognized as a Dynamic Web Module that you can Run on Server. There is nothing manual to configure for this (no facet to add).
Update: Did you install the Maven Integration for WTP (Optional) when installing m2eclipse?

mvn eclipse:eclipse within Eclipse

Yup, when you right-click over the project, in the Maven sub-menu, you have an Update Project Configuration command which does precisely that.

http://stackoverflow.com/questions/3288005/mvn-eclipseeclipse-within-eclipse

Wednesday, August 28, 2013

How do I find out if an oracle database is set to autocommit?

There is no such thing as autocommit in Oracle (server). Some client applications however default to autocommit (meaning they deliberately issue a commit between each statement). You will have to read the documentation of your application in order to determine if this is the case.

http://stackoverflow.com/questions/1366851/how-do-i-find-out-if-an-oracle-database-is-set-to-autocommit

Friday, August 23, 2013

What is a JPA implementation?

JPA
JPA is just an API (hence Java Persistence API) that requires an implementation to use.
An analogy would be using JDBC. JDBC is an API for accessing databases, but you need an implementation (a driver jar file) to be able to connect to a database. On its own, without a driver, you cannot do anything with a database.
With JPA, as I said, you need an implementation, a set of classes that lie "below" JPA, and such an implementation will do what you want.
Your application uses the JPA API (this wording is a bit cumbersome, but I hope you get the idea), which then communicates with the underlying implementation.
Popular implementations include Hibernate, EclipseLink, OpenJPA, and others.
Every one of them implements the JPA API, so if you use only JPA, every implementation should act the same.
But! The functionality provided by these implementations might go beyond the standard JPA API.
If you want to use this particular functionality, you will have to use vendor specific API that will not be compatible with others.
For example, even though JPA defines the @Id annotation with ID generation options, when using Hibernate, you can use also @org.hibernate.annotations.GenericGenerator for Hibernate specific generation strategies.
Using this annotation will not work unless you're using Hibernate as the underlying implementation.
The bottom line is: JPA is the "least common denominator" which every vendor implements, and every implementation might have some more advanced features that aren't standard.
If you want your application to be portable, use only JPA. If you are sure you won't change your mind later on and switch implementations, use JPA + vendor specific features.
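The API-versus-implementation split described above can be sketched in plain Java. This is only an analogy, not real JPA code, and all names here (PersistenceApi, VendorImpl, and so on) are made up for illustration:

```java
// Plays the role of the JPA API: a contract only, with no behavior of its own.
interface PersistenceApi {
    String persist(Object entity);
}

// Plays the role of a vendor implementation such as Hibernate.
class VendorImpl implements PersistenceApi {
    @Override
    public String persist(Object entity) {
        return "stored " + entity;
    }

    // A vendor-specific extra that goes beyond the standard API.
    String vendorOnlyFeature() {
        return "not portable";
    }
}

public class ApiDemo {
    public static void main(String[] args) {
        // Code to the API type: swapping VendorImpl for another
        // implementation would not change this line.
        PersistenceApi api = new VendorImpl();
        System.out.println(api.persist("x"));
        // api.vendorOnlyFeature() is invisible through the API type;
        // calling it would tie the code to this one vendor.
    }
}
```

As long as the code only ever refers to the PersistenceApi type, the vendor can be swapped by changing a single line.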

Differences Between Hibernate and JPA

Fans of the Hibernate object-relational mapping (ORM) framework will realize that the Java Persistence API (JPA) is basically a standardization of that framework. And, if you’re like me, you really haven’t given much thought to JPA, because, gee, isn’t it just a watered-down Hibernate? Maybe, maybe not. (I’m not going to get into that here and now.)
It can be useful, though (even if only politically or procedurally), to code to JPA instead of the Hibernate API. If you find yourself in that situation, you may find this compare/contrast between the two APIs to be useful:
Hibernate                                      | JPA
-----------------------------------------------|--------------------------------------------
SessionFactory                                 | EntityManagerFactory
Session                                        | EntityManager
sessionFactory.getCurrentSession().[method]()  | entityManager.[method]()
saveOrUpdate()                                 | persist()
Query.setInteger/String/Entity()               | Query.setParameter()
list()                                         | getResultList()
uniqueResult()                                 | getSingleResult()
uniqueResult() returns null                    | getSingleResult() throws NoResultException
CriteriaQueries – yes                          | CriteriaQueries – no
Additionally, there are a couple Hibernate-specific JPA-isms to keep in mind:
  • If the underlying JPA implementation is Hibernate, annotations and mapping files may both be used at the same time. In such a situation, I believe the mapping files act as an override for the annotations.
  • The best of both worlds (in my mind) is to base the code (at an interface- or API-level) on JPA and its EntityManager, but to have the implementation interact with the Hibernate Session, which can be obtained by calling getDelegate() on the EntityManager.

Multiple projects in one git repo?

In Git, it is better to have each project in its own repo, with shared libraries in another repo, used as submodules (the equivalent of svn:externals) in each project.
This is because in Git a repo is a much more lightweight concept than in SVN and, more importantly, there is no way to clone individual folders (not to be confused with sparse checkout) within a repo separately, the way you can check out and work on individual folders in SVN. So if you had all projects in a single repo, you would have to clone them all.
Serving Git repos using smart HTTP, git daemon, or SSH is pretty straightforward. There is also Gitolite for managing multiple repos (including authorization and authentication). Read the linked chapter of Pro Git on serving Git repos: http://progit.org/book/ch4-2.html
As for your grouping, you can put the repos in folders matching your grouping structure and serve them using, say, the smart HTTP method, whereby the repo URLs will look like the URLs you would have used with SVN, with all projects appearing to be under the grouping.

How can I delete a file from git repo?

Use git rm:
git rm file1.txt
git commit -m "remove file1.txt"
http://stackoverflow.com/questions/2047465/how-can-i-delete-a-file-from-git-repo

Thursday, August 22, 2013

Difference between JSP include directive and JSP include action

  • <%@ include file="filename" %> is the JSP include directive.
    At JSP page translation time, the content of the file given in the include directive is 'pasted' as-is into the place where the include directive is used; the source JSP page is then converted into a Java servlet class. The included file can be a static resource or a JSP page. Generally the include directive is used to include header banners and footers. The JSP compilation procedure is such that the source JSP page gets recompiled only if that page itself has changed; if only the included file changes, the source JSP will not be recompiled and the modification will not be reflected in the output.
  • <jsp:include page="filename" /> is the JSP include action element.
    The jsp:include action element is like a function call. At runtime, the included file is 'executed' and the resulting content is included with the source JSP page. When the included JSP page is called, both the request and response objects are passed as parameters. If there is a need to pass additional parameters, the jsp:param element can be used. If the resource is static, its content is inserted into the calling JSP file, since no processing is needed.

How to create my own java library(API)?

Create a JAR. Then include the JAR. Any classes in that JAR will be available. Just make sure you protect your code if you are giving out an API. Don't expose any methods / properties to the end user that shouldn't be used.
Edit: In response to your comment, make sure you don't include the source when you package the JAR. Only include the class files. That's the best you can really do.
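The "don't expose what shouldn't be used" advice comes down to Java visibility modifiers: only public members are part of the jar's API. A minimal sketch (the class and method names here are made up):

```java
public final class GreetingApi {
    // The public surface: the only method callers of the jar should rely on.
    public static String greet(String name) {
        return buildGreeting(name);
    }

    // Package-private helper: compiled into the jar, but not part of the API
    // and invisible to code outside this package.
    static String buildGreeting(String name) {
        return "Hello, " + name;
    }

    // Private constructor: this class is a static entry point only.
    private GreetingApi() {}

    public static void main(String[] args) {
        System.out.println(greet("world"));
    }
}
```

Code in another package that depends on the jar can call GreetingApi.greet(), but a call to buildGreeting() would not even compile, which is exactly the protection described above.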

Wednesday, August 21, 2013

Normalization in DOM parsing with java - how does it work?

The rest of the sentence is:
where only structure (e.g., elements, comments, processing instructions, CDATA sections, and entity references) separates Text nodes, i.e., there are neither adjacent Text nodes nor empty Text nodes.
This basically means that the following XML element
<foo>Hello 
wor
ld</foo>
could be represented like this in a denormalized node:
Element foo
    Text node: ""
    Text node: "Hello "
    Text node: "wor"
    Text node: "ld"
When normalized, the node will look like this
Element foo
    Text node: "Hello world"
And the same goes for attributes, comments, etc.
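The effect is easy to reproduce with the JDK's own DOM API by building the fragmented state by hand and then calling normalize(). The class name and the "foo" element are just illustrative:

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class NormalizeDemo {
    public static void main(String[] args) throws Exception {
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.newDocument();

        // Recreate the denormalized state by hand: four adjacent text nodes.
        Element foo = doc.createElement("foo");
        foo.appendChild(doc.createTextNode(""));
        foo.appendChild(doc.createTextNode("Hello "));
        foo.appendChild(doc.createTextNode("wor"));
        foo.appendChild(doc.createTextNode("ld"));
        doc.appendChild(foo);

        System.out.println("before: " + foo.getChildNodes().getLength() + " child nodes");
        foo.normalize(); // merges adjacent Text nodes and removes empty ones
        System.out.println("after: " + foo.getChildNodes().getLength() + " child node(s)");
        System.out.println(foo.getTextContent());
    }
}
```

After normalize() the four text children collapse into a single "Hello world" node; the overall text content of the element is unchanged.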

Tuesday, August 20, 2013

Which is the best library for XML parsing in java [closed]

Actually Java supports 4 methods to parse XML out of the box:
DOM Parser/Builder: The whole XML structure is loaded into memory and you can use the well-known DOM methods to work with it. DOM also allows you to write to the document with XSLT transformations. Example:
    DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
    factory.setValidating(true);
    factory.setIgnoringElementContentWhitespace(true);
    try {
        DocumentBuilder builder = factory.newDocumentBuilder();
        File file = new File("test.xml");
        Document doc = builder.parse(file);
        // Do something with the document here.
    } catch (ParserConfigurationException e) {
    } catch (SAXException e) {
    } catch (IOException e) { 
    }
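As a sketch of the "do something with the document" step, here is a self-contained variant that parses from an in-memory string instead of test.xml (the class name and the catalog/book XML are made up for illustration):

```java
import java.io.ByteArrayInputStream;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class DomDemo {
    public static void main(String[] args) throws Exception {
        DocumentBuilder builder =
                DocumentBuilderFactory.newInstance().newDocumentBuilder();
        // Parse a small document from memory so the example is runnable as-is.
        Document doc = builder.parse(new ByteArrayInputStream(
                "<catalog><book>A</book><book>B</book></catalog>".getBytes("UTF-8")));

        // Typical DOM usage: look elements up by tag name and read their text.
        NodeList books = doc.getElementsByTagName("book");
        System.out.println(books.getLength() + " books; first = "
                + books.item(0).getTextContent());
    }
}
```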
SAX Parser: Solely for reading an XML document. The SAX parser runs through the document and calls callback methods of the user. There are methods for the start/end of a document, elements, and so on. They're defined in org.xml.sax.ContentHandler, and there's an empty helper class, DefaultHandler.
    SAXParserFactory factory = SAXParserFactory.newInstance();
    factory.setValidating(true);
    try {
        SAXParser saxParser = factory.newSAXParser();
        File file = new File("test.xml");
        saxParser.parse(file, new ElementHandler());    // specify handler
    }
    catch(ParserConfigurationException e1) {
    }
    catch(SAXException e1) {
    }
    catch(IOException e) {
    }
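The ElementHandler referenced above is not shown; a minimal version could be a DefaultHandler subclass like the following sketch (the class name, the counting logic, and the sample XML are all made up for illustration):

```java
import java.io.ByteArrayInputStream;

import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class SaxDemo {
    // A minimal handler: counts elements and collects character data.
    static class ElementHandler extends DefaultHandler {
        int elements = 0;
        StringBuilder text = new StringBuilder();

        @Override
        public void startElement(String uri, String localName,
                String qName, Attributes attributes) {
            elements++; // called once per opening tag
        }

        @Override
        public void characters(char[] ch, int start, int length) {
            text.append(ch, start, length); // called for text content
        }
    }

    public static void main(String[] args) throws Exception {
        SAXParser saxParser = SAXParserFactory.newInstance().newSAXParser();
        ElementHandler handler = new ElementHandler();
        // Parse from memory so the example is runnable without a test.xml file.
        saxParser.parse(new ByteArrayInputStream(
                "<root><a>hi</a><b>there</b></root>".getBytes("UTF-8")), handler);
        System.out.println(handler.elements + " elements, text: " + handler.text);
    }
}
```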
StAX Reader/Writer: This works with a data-stream-oriented interface. The program asks for the next element when it is ready, just like a cursor/iterator. You can also create documents with it. Read a document:
    FileInputStream fis = null;
    try {
        fis = new FileInputStream("test.xml");
        XMLInputFactory xmlInFact = XMLInputFactory.newInstance();
        XMLStreamReader reader = xmlInFact.createXMLStreamReader(fis);
        while(reader.hasNext()) {
            reader.next(); // do something here
        }
    }
    catch(IOException exc) {
    }
    catch(XMLStreamException exc) {
    }
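One way to fill in the "do something here" step of the read loop is to check the event type returned by next(); this sketch collects element names from an in-memory document (the class name and the sample XML are made up):

```java
import java.io.ByteArrayInputStream;

import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StaxDemo {
    public static void main(String[] args) throws Exception {
        XMLStreamReader reader = XMLInputFactory.newInstance()
                .createXMLStreamReader(new ByteArrayInputStream(
                        "<test><item>a</item><item>b</item></test>".getBytes("UTF-8")));

        StringBuilder names = new StringBuilder();
        while (reader.hasNext()) {
            // next() advances the cursor and returns the event type.
            if (reader.next() == XMLStreamConstants.START_ELEMENT) {
                names.append(reader.getLocalName()).append(' ');
            }
        }
        System.out.println(names.toString().trim());
    }
}
```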
Write document:
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream("test.xml");
        XMLOutputFactory xmlOutFact = XMLOutputFactory.newInstance();
        XMLStreamWriter writer = xmlOutFact.createXMLStreamWriter(fos);
        writer.writeStartDocument();
        writer.writeStartElement("test");
        // write stuff
        writer.writeEndElement();
        writer.flush();
    }
    catch(IOException exc) {
    }
    catch(XMLStreamException exc) {
    }
    finally {
    }
JAXB: The newest approach for reading XML documents; version 2 is part of Java 6. It lets you map a document to Java objects and back. You read the document using a javax.xml.bind.Unmarshaller (which you obtain from a JAXBContext created via JAXBContext.newInstance). The context has to be initialized with the classes used, but you just have to specify the root classes and don't have to worry about statically referenced classes. You use annotations to specify which classes should be elements (@XmlRootElement) and which fields are elements (@XmlElement) or attributes (@XmlAttribute, what a surprise!)
    RootElementClass adr = new RootElementClass();
    FileInputStream adrFile = null;
    try {
        adrFile = new FileInputStream("test");
        JAXBContext ctx = JAXBContext.newInstance(RootElementClass.class);
        Unmarshaller um = ctx.createUnmarshaller();
        adr = (RootElementClass) um.unmarshal(adrFile);
    }
    catch(IOException exc) {
    }
    catch(JAXBException exc) {
    }
    finally {
    }
Write document:
    FileOutputStream adrFile = null;
    try {
        adrFile = new FileOutputStream("test.xml");
        JAXBContext ctx = JAXBContext.newInstance(RootElementClass.class);
        Marshaller ma = ctx.createMarshaller();
        ma.marshal(..);
    }
    catch(IOException exc) {
    }
    catch(JAXBException exc) {
    }
    finally {
    }
Examples shamelessly copied from some old lecture slides ;-)
Edit: About "which API should I use?" Well, it depends: not all APIs have the same capabilities, as you can see, but if you have control over the classes you use to map the XML document, JAXB is my personal favorite, a really elegant and simple solution (though I haven't used it for really large documents; it could get a bit complex). SAX is pretty easy to use too, and just stay away from DOM if you don't have a really good reason to use it (an old, clunky API in my opinion). I don't think there are any modern third-party libraries that offer anything especially useful that's missing from the standard library, and the standard libraries have the usual advantages of being extremely well tested, documented, and stable.