Wednesday, August 31, 2011

Jenkins: The Definitive Guide, extract.

Today I am not giving my opinion about this book, "Jenkins: The Definitive Guide", but if I were, it would be a good one: it is a definitive guide indeed, quite complete and didactic. I read through it at my company today, and I want to highlight the following passage:

"Introducing Continuous Integration into Your Organization

Continuous Integration is not an all-or-nothing affair. In fact, introducing CI into an
organization takes you on a path that progresses through several distinct phases. Each
of these phases involves incremental improvements to the technical infrastructure as
well as, perhaps more importantly, improvements in the practices and culture of the
development team itself. In the following paragraphs, I have tried to paint an approximate picture of each phase.

Phase 1—No Build Server

Initially, the team has no central build server of any kind. Software is built manually
on a developer’s machine, though it may use an Ant script or similar to do so. Source
code may be stored in a central source code repository, but developers do not necessarily commit their changes on a regular basis. Some time before a release is scheduled,
a developer manually integrates the changes, a process which is generally associated
with pain and suffering.

Phase 2—Nightly Builds

In this phase, the team has a build server, and automated builds are scheduled on a
regular (typically nightly) basis. This build simply compiles the code, as there are no
reliable or repeatable unit tests. Indeed, automated tests, if they are written, are not a
mandatory part of the build process, and may well not run correctly at all. However
developers now commit their changes regularly, at least at the end of every day. If a
developer commits code changes that conflict with another developer’s work, the build
server alerts the team via email the following morning. Nevertheless, the team still tends
to use the build server for information purposes only—they feel little obligation to fix
a broken build immediately, and builds may stay broken on the build server for some
time.

Phase 3—Nightly Builds and Basic Automated Tests

The team is now starting to take Continuous Integration and automated testing more
seriously. The build server is configured to kick off a build whenever new code is committed to the version control system, and team members are able to easily see what
changes in the source code triggered a particular build, and what issues these changes
address. In addition, the build script compiles the application and runs a set of automated unit and/or integration tests. In addition to email, the build server also alerts
team members of integration issues using more proactive channels such as Instant
Messaging. Broken builds are now generally fixed quickly.


Phase 4—Enter the Metrics

Automated code quality and code coverage metrics are now run to help evaluate the
quality of the code base and (to some extent, at least) the relevance and effectiveness
of the tests. The code quality build also automatically generates API documentation
for the application. All this helps teams keep the quality of the code base high, alerting
team members if good testing practices are slipping. The team has also set up a “build
radiator,” a dashboard view of the project status that is displayed on a prominent screen
visible to all team members.

Phase 5—Getting More Serious About Testing

The benefits of Continuous Integration are closely related to solid testing practices.
Now, practices like Test-Driven Development are more widely practiced, resulting in
a growing confidence in the results of the automated builds. The application is no longer
simply compiled and tested, but if the tests pass, it is automatically deployed to an
application server for more comprehensive end-to-end tests and performance tests.

Phase 6—Automated Acceptance Tests and More Automated Deployment

Acceptance-Test Driven Development is practiced, guiding development efforts and
providing high-level reporting on the state of the project. These automated tests use Behavior-Driven Development and Acceptance-Test Driven Development tools to act as communication and documentation tools as much as testing tools, publishing reports on test results in business terms that non-developers can understand. Since these high-level tests are automated at an early stage in the development
process, they also provide a clear idea of what features have been implemented, and
which remain to be done. The application is automatically deployed into test environments for testing by the QA team either as changes are committed, or on a nightly basis; a version can be deployed (or “promoted”) to UAT and possibly production environments using a manually-triggered build when testers consider it ready. The team is also
capable of using the build server to back out a release, rolling back to a previous release,
if something goes horribly wrong.

Phase 7—Continuous Deployment

Confidence in the automated unit, integration and acceptance tests is now such that
teams can apply the automated deployment techniques developed in the previous phase
to push out new changes directly into production.


The progression between levels here is of course somewhat approximate, and may not
always match real-world situations. For example, you may well introduce automated
web tests before integrating code quality and code coverage reporting. However, it
should give a general idea of how implementing a Continuous Integration strategy in
a real world organization generally works."

One of the best explanations of how to get into Continuous Integration in a coherent way. I hope the author (John Ferguson Smart) doesn't mind that I took this passage; I am advertising the book in exchange!

Wednesday, August 10, 2011

Roll your own continuous integration system - Artifactory and MySQL

Roll your own Continuous Integration System (C.I.S.)

Content:
Abstract
Install Tomcat
Basic Tomcat configuration - Memory
Basic Tomcat configuration - JMX
Basic Tomcat configuration - Application Manager and permissions
Apache and uSVN 
Installing Artifactory from WAR
Configure Artifactory and MySQL
Configuring Artifactory security and repositories 

MySQL configuration

This is a very short post, because it is more of a reminder that the default installation cannot be used in production environments without replacing the default database engine (Derby) with another one. My choice is MySQL plus the filesystem.

The Artifactory team explains it perfectly in their wiki; just take a look, it is easy, and I have just tested it with no problems.

http://wiki.jfrog.org/confluence/display/RTF/Running+Artifactory+on+MySQL

Assuming you have just installed it from the WAR: stop Tomcat and erase everything from your artifactory_home except "etc"; it is the only folder you will need for the Derby-to-anything-else change.

Now follow the steps. If you did them right, you will find the erased folders re-created, and the following command will show you that some tables were created in the MySQL database: mysqlshow -u root -p artifactory

Now continue tuning this software a bit in the next entry.

Monday, July 25, 2011

Lombok and Slf4j

I already talked a bit about Slf4j, a recommended facade over any logger implementation.

I have already talked about Lombok, and the fantastic way it generates Java bytecode at compile time in order to save you some time.

Now let's reduce the time and code we need in almost every class:
private static final org.slf4j.Logger log = org.slf4j.LoggerFactory.getLogger(LogExample.class);
with:
@Slf4j

That's all: put that annotation on your class and you will have your logger, named "log", referring to that class and using the Slf4j interface. Simply perfect ;)

If you are still a log4j lover, or you prefer other loggers, Lombok provides these annotations:

@CommonsLog
Creates private static final org.apache.commons.logging.Log log = org.apache.commons.logging.LogFactory.getLog(LogExample.class);
@Log
Creates private static final java.util.logging.Logger log = java.util.logging.Logger.getLogger(LogExample.class.getName());
@Log4j
Creates private static final org.apache.log4j.Logger log = org.apache.log4j.Logger.getLogger(LogExample.class);

More information here!

See you soon..

Thursday, July 21, 2011

Choosing a Java logger, never use System.out in web applications!

Almost without noticing it, loggers appeared everywhere: under that stone, behind the tree, look, another one! As an example, here we can find a handful of them (implementations, facades and factories).

Until now, we had a good reference implementation, log4j, and JUL (java.util.logging), although I always used log4j. There is an article comparing them. One thing is clear: never use System.out in web applications, because it all goes down the drain of catalina.out, that useful file which fills with garbage when anything other than server information is written to it.
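If you want levels and configurable handlers with zero extra jars, the JDK's own java.util.logging is the minimal step up from System.out. A small sketch (class name and messages are my own example):

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class JulExample {
    // One logger per class, named after the class; JUL ships with the JDK.
    private static final Logger LOG = Logger.getLogger(JulExample.class.getName());

    public static void main(String[] args) {
        // Goes through the configured handlers instead of polluting catalina.out
        LOG.info("Application started");
        // Parameterized message: the argument is only formatted if the level is enabled
        LOG.log(Level.WARNING, "Disk almost full: {0}%", 95);
    }
}
```

Handlers, levels and formats can then be tuned in logging.properties without touching the code.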

But today we are here to talk about loggers. Which one should I choose? That is not as important as... what the fuck should I do if one library uses log4j, another JUL, another MyNiceLogger, and so on?
There are two solutions for this agony. The first, Apache Commons Logging (JCL), is designed to be used as a thin bridge to the final runtime-chosen logger.

"When writing a library it is very useful to log information. However there are many logging implementations out there, and a library cannot impose the use of a particular one on the overall application that the library is a part of.

The Logging package is an ultra-thin bridge between different logging implementations. A library that uses the commons-logging API can be used with any logging implementation at runtime. Commons-logging comes with support for a number of popular logging implementations, and writing adapters for others is a reasonably simple task.

Applications (rather than libraries) may also choose to use commons-logging. While logging-implementation independence is not as important for applications as it is for libraries, using commons-logging does allow the application to change to a different logging implementation without recompiling code.

 Note that commons-logging does not attempt to initialise or terminate the underlying logging implementation that is used at runtime; that is the responsibility of the application. However many popular logging implementations do automatically initialise themselves; in this case an application may be able to avoid containing any code that is specific to the logging implementation used."


For JCL to be effective, it has to be chosen: if a library chooses JCL, then its logging passes through your logger and you can control those messages. But there is no solution if that library chose log4j for its messages.

The other solution, the Simple Logging Facade for Java (Slf4j), has the same goal but goes even further: it provides bridges in both directions instead of one.
- You are supposed to code against Slf4j, which is an interface/facade; the actual logger can be chosen at runtime.
- If one of your libraries uses log4j, a bridge called log4j-over-slf4j can replace the log4j jar; those messages are then driven through Slf4j and redirected to your runtime-chosen logger.
- A binding must finally be included; for example, if you want Slf4j to log through log4j, provide slf4j-log4j12 to drive your messages to that logger, together with the log4j runtime library.

If that was not clear enough, check this out here.

Finally, for now, Slf4j is the better choice for simplicity and flexibility: it adapts to the main logging implementations, even JCL!! You don't have to change your libraries' implementation; provide the bridges and all messages go to your logger automatically.
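For reference, this setup usually comes down to a handful of Maven dependencies: the Slf4j API, the legacy bridges (org.slf4j:log4j-over-slf4j, org.slf4j:jcl-over-slf4j) and one binding such as Logback. A sketch of the pom fragment (version numbers here are illustrative, pick the current ones):

```xml
<dependencies>
  <!-- your code and libraries log through the Slf4j API -->
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.6.1</version>
  </dependency>
  <!-- reroute libraries that call log4j or JCL directly -->
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>log4j-over-slf4j</artifactId>
    <version>1.6.1</version>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>jcl-over-slf4j</artifactId>
    <version>1.6.1</version>
  </dependency>
  <!-- the actual runtime logger: Logback, a native Slf4j implementation -->
  <dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>0.9.29</version>
  </dependency>
</dependencies>
```

Remember to exclude the real log4j and commons-logging jars pulled in transitively, or the bridges will clash with them.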

Now let's dive into the final logger, now that all your messages are driven through Slf4j.
In my company, we had two main options: log4j and Logback. It is funny to discover that both were created by the same person, as Slf4j was! This whole mess is the opus of one single person!!

Log4j is a good library, but Logback is better, designed to replace the former, and that is said by the author of both. A list of improvements is here. Mainly, it is a native Slf4j implementation, so no bridge is needed; it is faster, more robust, it even smells better... I only learned about this library, and this whole mess, a few weeks ago, and I have to admit my mistake in ignoring the logging mess, ignoring my libraries' logs and not managing them well.

My recommendation: code against Slf4j, use Logback at runtime, and use as many bridges as you need to route your libraries to Slf4j. Log4j is superseded by Logback.

See you soon...

Sunday, July 10, 2011

Lombok, cleaning up your code

A long time without writing; OK, I have been working hard on company projects, and spending the rest of the time trying to improve our tools and processes.

I want to introduce you to "Lombok", a nice tool that could bring light to the darkness of certain classes everyone has seen at least once.

The gain in this case is almost beyond discussion: you annotate your code and it magically gains powers and kicks the bad guys' asses. There are no extra dependencies or aspects involved, because Lombok almost always works at compile time. Indeed, including lombok.jar is not necessary in all cases; a few features don't need it.


Some examples from their web:

@Getter and @Setter: it is like using Eclipse's "generate getters and setters", but Lombok makes it even easier and cleaner:



 import lombok.AccessLevel;
 import lombok.Getter;
 import lombok.Setter;
 
 public class GetterSetterExample {
   @Getter @Setter private int age = 10;
   @Setter(AccessLevel.PROTECTED) private String name;
   
   @Override public String toString() {
     return String.format("%s (age: %d)", name, age);
   }
 }


turns into



 public class GetterSetterExample {
   private int age = 10;
   private String name;
   
   @Override public String toString() {
     return String.format("%s (age: %d)", name, age);
   }
   
   public int getAge() {
     return age;
   }
   
   public void setAge(int age) {
     this.age = age;
   }
   
   protected void setName(String name) {
     this.name = name;
   }
 }



Those changes are made in the compiled bytecode, although you can reverse them (delombok) and see the equivalent Java source.

Another example: let's avoid the tricky and ugly file-closing code, even inside an exception handler.



 import lombok.Cleanup;
 import java.io.*;
 
 public class CleanupExample {
   public static void main(String[] args) throws IOException {
     @Cleanup InputStream in = new FileInputStream(args[0]);
     @Cleanup OutputStream out = new FileOutputStream(args[1]);
     byte[] b = new byte[10000];
     while (true) {
       int r = in.read(b);
       if (r == -1) break;
       out.write(b, 0, r);
     }
   }
 }


turns into



 import java.io.*;
 
 public class CleanupExample {
   public static void main(String[] args) throws IOException {
     InputStream in = new FileInputStream(args[0]);
     try {
       OutputStream out = new FileOutputStream(args[1]);
       try {
         byte[] b = new byte[10000];
         while (true) {
           int r = in.read(b);
           if (r == -1) break;
           out.write(b, 0, r);
         }
       } finally {
         if (out != null) {
           out.close();
         }
       }
     } finally {
       if (in != null) {
         in.close();
       }
     }
   }
 }
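As a side note, Java 7's try-with-resources produces essentially the same safe-closing code as @Cleanup without Lombok. A minimal sketch (the copy helper and in-memory streams are my own example, not from the Lombok docs):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class TryWithResourcesExample {
    // Copies every byte from an input to an output stream; both streams are
    // closed automatically, even if read() or write() throws.
    public static byte[] copy(byte[] data) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        try (InputStream in = new ByteArrayInputStream(data);
             OutputStream out = sink) {
            byte[] b = new byte[10000];
            int r;
            while ((r = in.read(b)) != -1) {
                out.write(b, 0, r);
            }
        }
        return sink.toByteArray();
    }
}
```

The compiler generates the nested try/finally blocks for you, much like the delomboked version above.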


More: automatic log declaration



 import lombok.extern.slf4j.Slf4j;
 
 @Slf4j
 public class LogExample {
   
   public static void main(String... args) {
     log.error("Something's wrong here");
   }
 }
 
 @Slf4j(topic = "java.util.List")
 public class LogExampleOther {
   
   public static void main(String... args) {
     log.warn("Something might be wrong here");
   }
 }



turns into



 public class LogExample {
   private static final org.slf4j.Logger log = org.slf4j.LoggerFactory.getLogger(LogExample.class);
   
   public static void main(String... args) {
     log.error("Something's wrong here");
   }
 }
 
 public class LogExampleOther {
   private static final org.slf4j.Logger log = org.slf4j.LoggerFactory.getLogger("java.util.List");
   
   public static void main(String... args) {
     log.warn("Something might be wrong here");
   }
 }




(Annotations for several logging implementations are provided.)

And so on and on... check the features list

http://projectlombok.org/features/index.html

This tool also integrates with Eclipse (through its installer) and with plain javac builds (just by including it as a dependency). Use it wisely, but use it :)

Wednesday, May 18, 2011

What if I want to deploy/upload several files to artifactory or nexus with Maven?

Regardless of the repository you use to store generated artifacts, the default Maven configuration for, let's say, a WAR project is to upload just the WAR file... interesting...

A common configuration, commented here, also uploads the source code as a jar file, which is highly recommendable for debugging.

Another uploaded or auto-generated file is a "pom.xml" project descriptor file for your artifact.

But we want to go further: we want to upload the SQL file needed to generate our initial database, or the PDF with the documentation of this release. How do we get this?

<plugin>   
  <groupId>org.codehaus.mojo</groupId>   
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>

      <id>attach-artifacts</id>
      <phase>package</phase>
      <goals> 

        <goal>attach-artifact</goal>
      </goals> 
      <configuration>
        <artifacts> 
          <artifact> 
            <file>src\main\sql\initdb.sql</file>
            <type>sql</type>
            <classifier>create</classifier>
          </artifact>
          <artifact>
            <file>target\site\docs\document.pdf</file>   
            <type>pdf</type>
            <classifier>doc</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>


With this plugin, you are associating with the same artifact the "document.pdf" file, the "initdb.sql" file, the WAR, the sources jar, the pom.xml file... all of them with the same version, and with the possibility of declaring any of them as a dependency.

It will be executed in the package phase; beware, because you may get the phases wrong and attach something not yet created. For example, the PDF may be created in the pre-site phase instead of, say, compile.

In the case of multiple configurations, you can configure it like this:


<plugin>   
  <groupId>org.codehaus.mojo</groupId>   
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>

      <id>attach-artifacts1</id>
      <phase>package</phase>
      <goals> 

        <goal>attach-artifact</goal>
      </goals> 
      <configuration>
        <artifacts> 
          <artifact> 
            <file>src\main\sql\initdb.sql</file>
            <type>sql</type>
            <classifier>create</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>

    <execution>
      <id>attach-artifacts2</id>
      <phase>site</phase>
      <goals> 

        <goal>attach-artifact</goal>
      </goals> 
      <configuration>
        <artifacts> 
          <artifact>
            <file>target\site\docs\document.pdf</file>   
            <type>pdf</type>
            <classifier>doc</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>

  </executions>
</plugin>

In this case, if we execute "mvn deploy", only "initdb.sql" will be attached; if we execute "mvn site-deploy", only "document.pdf" will be attached (because it is supposed to be created in the pre-site phase). But I am not sure it will reach Artifactory or Nexus, because you don't pass through the "deploy" phase; I would say it won't.

Best of all, if you execute "mvn site-deploy deploy", first "document.pdf" is attached in the "site" phase, then "initdb.sql" is attached in the "package" phase, and finally all of them are deployed in the "deploy" phase. Really useful!

This is the solution to a problem we wanted to solve at my job; the solution was taken from here, thanks!

Monday, May 9, 2011

Opinion: Simulated annealing algorithm, the ignored biological algorithm

This is a personal opinion article, based on my own experience and alienation.

I learned about this algorithm for the first time in the Artificial Intelligence course of my Master's Degree in Computer Science in 2005. I liked it so much that I repeated one more year, just to improve my marks; a FAIL was not enough for me. I know, sometimes my ambition scares the hell out of me too.

This algorithm models a universe where the solution to a problem does not try to be perfect, only good (my bro sometimes says that the good is the enemy of the perfect). The approach is highly random at first and, little by little, people (oops, I meant the search for the solution) start behaving as they are supposed to, until finally it works like any other algorithm that reaches a similar solution (faster, yes, but not better).

What I really love about this algorithm is not my implementation from 5 years ago; I do not really remember what I solved, and it has not gone down in history. What I love is how easily it applies to the psychology of human life:

Example 1:

A child, with all that potential, could be anything in life, and, according to chaos theory, little changes make big differences: an opinion, an idea or an image can make a child embrace, or leave aside, a sport, a hobby, a passion...
Years later, with less of that potential, it is no longer possible for the teenager to be the best in several fields; it is too late for that, but he or she can still head for other disciplines.
Finally, the child has grown into an adult, and the possibilities lie within a cone that gets narrower and narrower. Of course, changes are always possible: my own mother got a job at 52 doing something she had never excelled at. She used to all but poison my bro and me with her "cuisine"; now she cooks very well in a restaurant.

Example 2:

A project, in its first design phase, is a garden of flowers and fruit, an unexplored world full of possibilities: its childhood.
Just by defining the base language, the initial technology and a framework, most possibilities disappear and others grow strong. Choosing Java, Spring and AOP deployed on Tomcat weakens the possibility of it being a portlet and destroys the possibility of it being a C++ desktop application. There are still many things to define and create with patterns and algorithms.
A month before delivery, what can we really change?

Those are my arguments for considering this algorithm essential to a global understanding of any act of creation, especially for engineers, who make creation their primary task. Personally, I consider it a biological algorithm, at least until a God, Creator or Big Bang algorithm category is created.


==========================================
Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of locating a good approximation to the global optimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more effective than exhaustive enumeration — provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution.

The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one.

By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random "nearby" solution, chosen with a probability that depends both on the difference between the corresponding function values and also on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly "downhill" as T goes to zero. The allowance for "uphill" moves potentially saves the method from becoming stuck at local optima—which are the bane of greedier methods.

The method was independently described by Scott Kirkpatrick, C. Daniel Gelatt and Mario P. Vecchi in 1983,[1] and by Vlado Černý in 1985.[2] The method is an adaptation of the Metropolis-Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, invented by M.N. Rosenbluth in a paper by N. Metropolis et al. in 1953.[3]

From Wikipedia.
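The process quoted above fits in a few lines of Java. This toy sketch (starting point, temperature, step size and cooling factor are all illustrative, not tuned) minimizes f(x) = (x - 3)², whose global optimum is at x = 3:

```java
import java.util.Random;

public class Annealing {
    // Toy objective to minimize: f(x) = (x - 3)^2, global optimum at x = 3.
    static double f(double x) { return (x - 3) * (x - 3); }

    // Returns the best solution found for a given random seed.
    public static double anneal(long seed) {
        Random rnd = new Random(seed);
        double x = 100.0;                 // arbitrary starting solution
        double best = x;
        // Geometric cooling schedule: T shrinks by 0.1% per step.
        for (double t = 10.0; t > 1e-4; t *= 0.999) {
            double candidate = x + (rnd.nextDouble() - 0.5) * 2.0; // random "nearby" solution
            double delta = f(candidate) - f(x);
            // Always accept downhill moves; accept uphill moves with
            // probability exp(-delta / T), which fades as T goes to zero.
            if (delta < 0 || rnd.nextDouble() < Math.exp(-delta / t)) {
                x = candidate;
            }
            if (f(x) < f(best)) best = x;
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println("best x = " + anneal(42));
    }
}
```

The occasional uphill acceptance is exactly what lets the search escape local optima, the bane of greedier methods.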

Thursday, May 5, 2011

Roll your own Continuous Integration System (C.I.S.): Artifactory repositories configuration: snapshot, releases and security.

Roll your own Continuous Integration System (C.I.S.)

Content:
Abstract
Install Tomcat
Basic Tomcat configuration - Memory
Basic Tomcat configuration - JMX
Basic Tomcat configuration - Application Manager and permissions
Apache and uSVN 
Installing Artifactory from WAR
Configure Artifactory and MySQL
Configuring Artifactory security and repositories 


In order to ease the explanation about security and Maven deployment, let's create the simplest Maven project possible.

Create a directory
Create a "pom.xml" file with the following content:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.mycompany</groupId>
<artifactId>mavenproject</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>pom</packaging>

</project>


Now that you have the out-of-the-box Artifactory installation and a Maven project, try to upload your project to the Artifactory repository manager.

Remember: you must have a Maven installation in order to run these examples. If you have followed this blog, you should have SpringSource Tool Suite installed, which comes with Maven. You may need to put the "mvn" executable on the PATH.

Try to execute "mvn deploy" in the same directory as "pom.xml".


[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building mavenproject 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ mavenproject ---
[INFO] Installing /home/***/NetBeansProjects/mavenproject10/pom.xml to /home/***/.m2/repository/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-SNAPSHOT.pom
[INFO]
[INFO] --- maven-deploy-plugin:2.5:deploy (default-deploy) @ mavenproject ---
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.586s
[INFO] Finished at: Tue May 03 22:36:23 CEST 2011
[INFO] Final Memory: 3M/74M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.5:deploy (default-deploy) on project mavenproject: Deployment failed: repository element was not specified in the POM inside distributionManagement element or in -DaltDeploymentRepository=id::layout::url parameter -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException


Read the bold text: it means that neither your project nor any settings file specifies where you are willing to deploy your project.

Note: not explained yet, but there is a lifecycle, described somewhere on the Internet, that says for instance that deploying requires compiling first. So, when you ask Maven to "deploy", you are also asking it to compile, test and perform some other actions.

Note: the examples assume a basic configuration and no network troubles, everything on the same host, and so on. Take it for what it is: an example.

Go to Artifactory, the "Artifacts" tab, and click on the second repository, "libs-snapshot-local".


By clicking the repository, its information is displayed on the right side. We want the "Distribution Management" section.


Copy it to your "pom.xml" (your project, remember?)


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<groupId>com.mycompany</groupId>
<artifactId>mavenproject</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>pom</packaging>

<distributionManagement>
<repository>
<id>dhcppc24</id>
<name>dhcppc24-releases</name>
<url>http://localhost:8080/artifactory/libs-release-local</url>
</repository>
</distributionManagement>

</project>


Now execute "mvn deploy" again and let's check the response (yes, it fails again, I do not like suspense):

[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building mavenproject 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ mavenproject ---
[INFO] Installing /home/***/NetBeansProjects/mavenproject10/pom.xml to /home/***/.m2/repository/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-SNAPSHOT.pom
[INFO]
[INFO] --- maven-deploy-plugin:2.5:deploy (default-deploy) @ mavenproject ---
Downloading: http://localhost:8080/artifactory/libs-release-local/com/mycompany/mavenproject/1.0-SNAPSHOT/maven-metadata.xml
Uploading: http://localhost:8080/artifactory/libs-release-local/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-20110503.214739-1.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.660s
[INFO] Finished at: Tue May 03 23:47:39 CEST 2011
[INFO] Final Memory: 3M/74M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.5:deploy (default-deploy) on project mavenproject: Failed to deploy artifacts: Could not transfer artifact com.mycompany:mavenproject:pom:1.0-20110503.214739-1 from/to dhcppc24 (http://localhost:8080/artifactory/libs-release-local): Failed to transfer file: http://localhost:8080/artifactory/libs-release-local/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-20110503.214739-1.pom. Return code is: 409 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException


This error means you are not allowed to do the upload. Why? Because you are trying to upload a SNAPSHOT into a RELEASE repository. WTF?

Look at your pom.xml: <version>1.0-SNAPSHOT</version>. Artifactory reads that -SNAPSHOT suffix to determine whether the artifact is a release or not.
Now head over to Artifactory. Log in (user "admin", password "password") and go to the Admin tab.



Inside the Admin tab, open the Repositories menu, put your mouse over "libs-release-local" and click "Edit".




There it is, the guts of the repository configuration. See how, out of the box, the "libs-release-local" repository can only handle RELEASES, and you are trying to upload a SNAPSHOT. It is just not possible!


First option: change <version>1.0-SNAPSHOT</version> to <version>1.0</version>. Come on, do it! It will fail xD. Why? You do not have permissions yet :P
It is not a good idea anyway. Just do not release anything until you know what you are really doing.

Second option: in that window, tick "Handle Snapshots". It will still fail, again because of permissions. We will get there.

Third option: as shown above, copy the "distribution management" information of a repository, but take it from "libs-snapshot-local" instead. This is the best option right now. Your pom.xml should look like this:


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.mycompany</groupId>
    <artifactId>mavenproject</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>

    <distributionManagement>
        <repository>
            <id>dhcppc24</id>
            <name>dhcppc24-releases</name>
            <url>http://localhost:8080/artifactory/libs-release-local</url>
        </repository>
        <snapshotRepository>
            <id>dhcppc24</id>
            <name>dhcppc24-snapshots</name>
            <url>http://localhost:8080/artifactory/libs-snapshot-local</url>
        </snapshotRepository>
    </distributionManagement>

</project>


Now try "mvn deploy" again to confirm that I was right: permissions are not set yet.


[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building mavenproject 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ mavenproject ---
[INFO] Installing /home/***/NetBeansProjects/mavenproject10/pom.xml to /home/***/.m2/repository/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-SNAPSHOT.pom
[INFO]
[INFO] --- maven-deploy-plugin:2.5:deploy (default-deploy) @ mavenproject ---
Downloading: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/maven-metadata.xml
Downloaded: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/maven-metadata.xml (359 B at 7.5 KB/sec)
Uploading: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-20110505.213434-2.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.637s
[INFO] Finished at: Thu May 05 23:34:34 CEST 2011
[INFO] Final Memory: 3M/74M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.5:deploy (default-deploy) on project mavenproject: Failed to deploy artifacts: Could not transfer artifact com.mycompany:mavenproject:pom:1.0-20110505.213434-2 from/to dhcppc24 (http://localhost:8080/artifactory/libs-snapshot-local): Failed to transfer file: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-20110505.213434-2.pom. Return code is: 401 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException


Now let's set the damn permissions!

In the "Admin" tab, open the "Permissions" menu.


Click "New" (on the right side).



We are going to configure the permissions for all our local repositories:

Name: Local repositories
Include Patterns: Any
Exclude Patterns: None
Repositories: Any Local Repository


Users tab: Anonymous users must be able to Deploy; Annotate and Read will be granted automatically.


Click "Create", and it is done.

Repeat the operation, "mvn deploy", and now it should compile and deploy your project to the Artifactory repository :)
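A side note, not covered by the screenshots above: instead of letting anonymous users deploy, you could keep the default permissions and let Maven authenticate. Credentials go into each developer's ~/.m2/settings.xml, in a <server> entry whose <id> matches the <id> used in distributionManagement ("dhcppc24" in this example). A minimal sketch, assuming the default admin account:

```
<settings>
    <servers>
        <server>
            <!-- must match the <id> in distributionManagement -->
            <id>dhcppc24</id>
            <username>admin</username>
            <password>password</password>
        </server>
    </servers>
</settings>
```

With this in place, "mvn deploy" sends the credentials along and you do not need to open the repository to anonymous deploys.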


[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building mavenproject 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ mavenproject ---
[INFO] Installing /home/***/NetBeansProjects/mavenproject10/pom.xml to /home/***/.m2/repository/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-SNAPSHOT.pom
[INFO]
[INFO] --- maven-deploy-plugin:2.5:deploy (default-deploy) @ mavenproject ---
Downloading: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/maven-metadata.xml
Downloaded: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/maven-metadata.xml (359 B at 5.7 KB/sec)
Uploading: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-20110505.214558-2.pom
Uploaded: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/mavenproject-1.0-20110505.214558-2.pom (823 B at 3.4 KB/sec)
Downloading: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/maven-metadata.xml
Downloaded: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/maven-metadata.xml (351 B at 8.6 KB/sec)
Uploading: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/maven-metadata.xml
Uploaded: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/1.0-SNAPSHOT/maven-metadata.xml (598 B at 20.1 KB/sec)
Uploading: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/maven-metadata.xml
Uploaded: http://localhost:8080/artifactory/libs-snapshot-local/com/mycompany/mavenproject/maven-metadata.xml (317 B at 4.1 KB/sec)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.034s
[INFO] Finished at: Thu May 05 23:45:58 CEST 2011
[INFO] Final Memory: 3M/74M
[INFO] ------------------------------------------------------------------------




One last trip to Artifactory: let's find the uploaded artifact.


In the "Artifacts" tab, navigate through "libs-snapshot-local" and you will arrive at your project.

By clicking on the artifact itself, you will get the Maven snippet for declaring it as a dependency in other projects.


<dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>mavenproject</artifactId>
    <version>1.0-20110505.214856-1</version>
    <type>pom</type>
</dependency>
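Note that this snippet pins one concrete timestamped build. As far as I know Maven's snapshot handling, consumers can instead declare the symbolic -SNAPSHOT version and let Maven resolve the newest timestamp from the repository metadata on each build:

```
<dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>mavenproject</artifactId>
    <version>1.0-SNAPSHOT</version>
    <type>pom</type>
</dependency>
```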



The end... quite long, isn't it?

sábado, 30 de abril de 2011

Surveillance testing

We are far from being experts in testing, but we detected a need that we did not know how to name exactly.

We were creating a library to ease the use of a web service and, hard as it may be to believe, we were using unit testing while building it. In the end we agreed on the following points:

- The WS we consume is a risky item: it was built by others for us, and the service might change, fail or degrade in performance.
- Unit tests should not use the WS; they must stay unit tests, shouldn't they?
- The testing we need is close to "system testing", but system tests are not so easy to automate, and the compilation of a library should not depend on a test against a remote server.

With those and several more ideas, we decided to automate it as a series of unit-style tests that exercise the WS server through our library, so we can monitor the server, its response times and its availability. We will create a separate project whose only purpose is to check the server, using our library, and eventually that project may grow with tests for other dependencies.

This approach has several pros, going further than a monitoring tool could reach:
- It stays within the development department's scope, for those companies that look like a battlefield.
- The tests are executed regularly by Jenkins; no extra software is needed.
- Providers sometimes make mistakes, and we must know as soon as possible.

Result:
Result:
A new project with no production sources, just some unit tests that use our library against the external web service and check that it is OK. This project lives inside our software cycle, using SVN, Artifactory and Jenkins, and it is scheduled for a midnight execution. If a test fails, an email is sent telling us about it.
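A sketch of what one of these surveillance tests boils down to. The WS client call is a placeholder (any operation of our library wrapped in a Callable), and the names and time budget here are made up for illustration:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.concurrent.Callable;

// Minimal surveillance check: run one call against the external service
// and decide whether it answered, and answered fast enough.
public class SurveillanceCheck {

    /** Runs the call and returns the elapsed milliseconds; any exception propagates. */
    public static long timeCall(Callable<?> call) throws Exception {
        Instant start = Instant.now();
        call.call(); // an exception here means the service is down or broken
        return Duration.between(start, Instant.now()).toMillis();
    }

    /** True when the call completes without error within the given budget. */
    public static boolean respondsWithin(Callable<?> call, long budgetMillis) {
        try {
            return timeCall(call) <= budgetMillis;
        } catch (Exception serviceFailure) {
            return false; // an unavailable service counts as a failed check
        }
    }
}
```

In the real project, each JUnit test wraps one operation of the library, something like assertTrue(respondsWithin(() -> client.someLookup("foo"), 2000)), and Jenkins turns any red test into the midnight email.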
Now we talk about a new kind of test in our department :)

Keep learning!

lunes, 25 de abril de 2011

Book of the month: Ender's Game

It did not take me three months to read this one, neh? Indeed, I have not finished it yet, but I have to make the most of my spare time, so I will write as much as I can today.

Ender's Game tells us about the solitude of power, the hardship of being a highly gifted person and the jealousy it brings; how respect is won through excellence and not through fear or threats. It depicts a model where the best are isolated in order to get the most out of them.

It is curious that, while reading this book, I have also read this:

http://www.stanfordalumni.org/news/magazine/2007/marapr/features/dweck.html
Explanation in Spanish: http://www.cookingideas.es/el-efecto-del-esfuerzo-20110424.html

The two of them, a sci-fi book about the highly gifted and a piece of research, converge on a similar way of educating children in order to keep motivation high (or how to squeeze their brains). The main idea: do not tag a child as bad, and do not tag a child as good either. Even if a piece of work is perfect, the craftsman can always learn. Do not tell your child he is at the top, because he may stop learning.

ps: in the last months all I have read is some of Isaac Asimov's novels. They are enjoyable but, except for "I, Robot", the rest of the Robot novels do not teach that much. That is the reason I have not written a word of book recommendations.

domingo, 10 de abril de 2011

Roll your own Continuous Integration System (C.I.S.): Artifactory installation, Linux, MySQL and Tomcat 6.x

Roll your own Continuous Integration System (C.I.S.)

Content:
Abstract
Install Tomcat
Basic Tomcat configuration - Memory
Basic Tomcat configuration - JMX
Basic Tomcat configuration - Application Manager and permissions
Apache and uSVN 
Installing Artifactory from WAR
Configure Artifactory and MySQL
Configuring Artifactory security and repositories 

WAR installation

There are numerous well-formed tutorials about how to install Artifactory as the chosen Maven Dependencies Repository of our Continuous Integration System.

I will summarize their steps as concisely as possible:
  • Create a directory owned by tomcat.tomcat (user/group) where Artifactory will put all its information and artifacts. Let's say... /var/artifactory
  • Edit /usr/share/tomcat6/conf/tomcat6.conf and add the variable anywhere in the file (I prefer the top): ARTIFACTORY_HOME="/var/artifactory"
  • Download the latest version of OSS Artifactory (or pay the license if you find it worthwhile, of course).
  • Unzip and install artifactory.war into your Tomcat 6. Remember how?
    • you may copy artifactory.war into /usr/share/tomcat6/webapps/ if you have direct control over the filesystem, or
    • you may use the Manager application (/manager/html) if you installed it previously
  • Once artifactory.war is copied, it is deployed automatically, and you should see some directories appear under your ARTIFACTORY_HOME. You now have an unsecured, Derby-backed Artifactory installation: not ready for production, but perfect for testing.

A trip to a default configured Artifactory

Some default features:
- It supports Maven, Ivy and Gradle systems. We will only use Maven here.
- Artifactory provides several remote dependency repositories, and it will act as a proxy, downloading and storing artifacts from those repositories for you.
- A dead-simple security schema: only admins can upload files. User: admin, Pass: password. Shhh, it's a secret ;)
- A search engine for classes, packages and other files.


Welcome Page:

Do you need more explanations?

Maven settings:

As said before, Artifactory can serve Maven, Gradle and Ivy.
See the Home tab -> Left menu -> Client settings -> Maven Settings. (You are allowed to nose around Ivy and Gradle too, but don't tell me :P )
There you have some sections to declare and paste into your settings.xml configuration file. It's a bit early for this; don't worry, just keep it in mind.

Artifacts:

Some layouts to improve your search experience (what the fuck, too much time reading Microsoft marketing); mainly, you are offered both a tree and a list structure.

The tree layout is built from groupId + artifactId, with the default repositories as the roots of the trees, one for each repository.

Notice that if you click on a repository, you will get a Maven snippet for your pom.xml, for example:


<distributionManagement>
    <repository>
        <id>linux-wdx0</id>
        <name>linux-wdx0-releases</name>
        <url>http://localhost:8080/artifactory/libs-3rd-party</url>
    </repository>
</distributionManagement>


If you paste the snippet into your pom.xml (remember, only one distributionManagement per pom.xml, so if you already have one you will have to merge them), you will be able to upload generated artifacts to that repository (if you are authorized, of course) with "mvn deploy" in the deploy phase.

Default repositories in Artifactory:

libs-release-local: Your libraries and products go here, releases only.
libs-snapshot-local: Your libraries and products go here, snapshots.
plugins-release-local: Your plugins, or plugins needed by you, releases, here.
plugins-snapshot-local: Your plugins, or plugins needed by you, snapshots, here.
ext-releases-local: 3rd party libraries, releases, needed by your projects (i.e. Spring, the ojdbc driver...).
ext-snapshot-local: 3rd party libraries, snapshots, needed by your projects.

Artifact resolution in Artifactory:
I do not like to simply repeat what is already written well and which I cannot improve:
http://wiki.jfrog.org/confluence/display/RTF/Understanding+Repositories

Here finishes the first chapter about Artifactory. Coming soon: simple security and customization of repositories.



miércoles, 6 de abril de 2011

Enhancing your Testing: Performance, Load, Stress...

I have been asked for something I had wanted to learn: stress testing of our applications before publishing them on the web.

I do not know a word about it yet. I first have to evaluate some tools and some methodology, and then design the tests and... by then, surely the application will have already been published.

First stop:


http://www.pylot.org
http://clif.ow2.org/
http://sourceforge.net/projects/dieseltest/
http://java.net/projects/faban/
http://code.google.com/p/grinderstone/
http://www.hpl.hp.com/research/linux/httperf/
http://iperf.sourceforge.net/
http://www.ixorarms.com/
http://jchav.blogspot.com/
http://jcrawler.sourceforge.net/
http://www.loadui.org/
http://opensta.org/
http://www.joedog.org/
http://tsung.erlang-projects.org/
http://jakarta.apache.org/jmeter/
 
I have much to read. 

miércoles, 30 de marzo de 2011

Other useful Maven plugins, Reporting and Site (Part III)


FindBugs Maven Plugin

FindBugs looks for bugs in Java programs. It is based on the concept of bug patterns. A bug pattern is a code idiom that is often an error. Bug patterns arise for a variety of reasons:
- Difficult language features
- Misunderstood API methods
- Misunderstood invariants when code is modified during maintenance
- Garden variety mistakes: typos, use of the wrong boolean operator

FindBugs uses static analysis to inspect Java bytecode for occurrences of bug patterns. We have found that FindBugs finds real errors in most Java software. Because its analysis is sometimes imprecise, FindBugs can report false warnings, which are warnings that do not indicate real errors. In practice, the rate of false warnings reported by FindBugs is generally less than 50%.

Usage:
[<reporting>]
<plugins>
    <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>findbugs-maven-plugin</artifactId>
        <version>2.3.1</version>
        <configuration>
            <excludeFilterFile>${findBugs.excludeFilterFile}</excludeFilterFile>
        </configuration>
    </plugin>
</plugins>
[</reporting>]

Notice that you can make FindBugs ignore some classes in its analysis by putting a value in the excludeFilterFile property.

Source: http://mojo.codehaus.org/findbugs-maven-plugin/index.html


Maven CheckStyle Plugin

The Checkstyle Plugin generates a report regarding the code style used by the developers. For more information about Checkstyle, see http://checkstyle.sourceforge.net/. This version of the plugin uses Checkstyle 5.0.

The plugin can be configured in the project's POM. Predefined rulesets are included with the plugin, these are: sun_checks.xml, turbine_checks.xml, avalon_checks.xml and maven_checks.xml. You can also use a custom ruleset by specifying it in the plugin configuration.

Usage:
[<reporting>]
<plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-checkstyle-plugin</artifactId>
        <version>2.6</version>
        <configuration>
            <configLocation>PUT HERE YOUR CONFIGURATION</configLocation>
            <consoleOutput>true</consoleOutput>
        </configuration>
    </plugin>
</plugins>
[</reporting>]


Maven PMD Plugin

The PMD plugin allows you to automatically run the PMD code analysis tool on your project's source code and generate a site report with its results. It also supports the separate Copy/Paste Detector tool (or CPD) distributed with PMD.

The plugin accepts configuration parameters that can be used to customize the execution of the PMD tool.

Usage:
[<reporting>]
<plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-pmd-plugin</artifactId>
        <version>2.5</version>
        <configuration>
            <targetJdk>${java-version}</targetJdk>
            <excludes>
                <exclude>${pmd.excludes}</exclude>
            </excludes>
        </configuration>
    </plugin>
</plugins>
[</reporting>]

Notice that you can make the PMD plugin ignore some classes.


Maven Javadoc Plugin

The Javadoc Plugin uses the Javadoc tool to generate javadocs for the specified project. For more information about the standard Javadoc tool, please refer to Reference Guide.

Usage:
[<reporting>]
<plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-javadoc-plugin</artifactId>
        <version>2.7</version>
    </plugin>
</plugins>
[</reporting>]

Source: http://maven.apache.org/plugins/maven-javadoc-plugin/


JXR Maven Plugin

The JXR Plugin produces a cross-reference of the project's sources. The generated reports make it easier for the user to reference or find specific lines of code. It is also handy when used with the PMD Plugin for referencing errors found in the code.

Usage:
[<reporting>]
<plugins>
    <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>jxr-maven-plugin</artifactId>
        <version>2.0-beta-1</version>
    </plugin>
</plugins>
[</reporting>]


Taglist Maven Plugin


The Taglist Maven Plugin generates a report on various tags found in the code, like @todo or //TODO tags.

Usage:
[<reporting>]
<plugins>
    <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>taglist-maven-plugin</artifactId>
        <version>2.4</version>
    </plugin>
</plugins>
[</reporting>]

Maven Site Plugin

All the previous plugins are meant to provide data to three possible consumers: humans, Hudson/Jenkins, or the Maven Site.

The Site Plugin is used to generate a site for the project. The generated site also includes the project's reports that were configured in the <reporting> section of the POM.

Once you execute "mvn site", an HTML site is created in your target/site directory, where all or most of the previous reports are presented, together with project information like developers, SCM or issues.

Usage:
[<reporting>]
<plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-site-plugin</artifactId>
        <version>2.0-beta-6</version>
        <configuration>
            <locales>en</locales>
            <outputEncoding>${project.reporting.outputEncoding}</outputEncoding>
        </configuration>
    </plugin>
</plugins>
[</reporting>]

sábado, 26 de marzo de 2011

Q outside the computer: wire organization

I feel proud of these two wire organizers I have assembled; one of them is "original" (as far as I have never seen it anywhere), and the other is taken from some web page (I do not remember which).

I was fed up with having an entire drawer just for wires and chargers, and with untangling them before every use, so I finally gave up on doing that. This is how I designed these two solutions:

For USB wires, I hang them using paper clips which, in turn, grip a piece of cardboard stuck to a cupboard. It is easy to distinguish which wire you need and easy to put it back once it is no longer necessary. Furthermore, it is near the computer, their main consumer.







For the mobile phone and other device chargers, the system was improved. As they cannot be held by a paper clip, I stuck a magnet to each charger's head and glued other magnets, correctly oriented, to another piece of cardboard stuck to the cupboard.

I really love this system because it is really fast, tidy, easy to maintain and... smart! And black on black does not attract attention.





Conclusion: this works because it is already being used. That is the key to the success of any methodology or system.

domingo, 20 de marzo de 2011

Managing Maven dependencies with an external dependencies repository

Content:

Maven abstract.
Tuning your Maven project
Maven standard folders
Managing dependencies with Maven
Adding a nature in Eclipse
Maven profiles inheritance
Managing Maven dependencies with an external dependencies repository


I have been talking a lot about Maven dependencies for a local machine. It is high time to start talking about remote dependencies.

When you define a dependency in your pom.xml file, that dependency is sought in repo1.maven.org. But your libraries and internal dependencies are not there. Certainly they are in your local repository, but not in your workmate's.

I do not find it necessary to argue in favour of a code repository; the reasons are widely known. The reasons for a dependency repository might be less known, because code is always used, but Maven is not (always).
Nevertheless, the basic scenario is easy to find:
  • You create a library for a project with Maven.
  • You declare that library as a dependency of that project.
  • You might want your workmate to compile the project on his/her computer, but...
    • binaries should never be uploaded to the code repository!
    • shared folders are system-dependent and difficult to automate, trace and export.
    • we are looking for zero specific configuration in a compilation.
If only you could upload to a private repo1.maven.org-like repository in order to make an internal dependency accessible to your company... You can!

There are two main open-source implementations for this task: Nexus and Artifactory. Both seem alike, but I chose to get to know the latter in depth; maybe I will find some time to compare the two in installation, configuration and performance.

If you had in your pom.xml a configuration like this:

    <repositories>
        <repository>
            <snapshots />
            <id>snapshots</id>
            <name>libs-internal</name>
            <url>http://localhost:8080/artifactory/libs-internal</url>
        </repository>
    </repositories>

With this entry in pom.xml, we are asking Maven to search this URL for any dependency we might need and, since it lives in pom.xml, the configuration is project-scoped, shared through the code repository along with the project itself.

All we have to keep in mind is that the server MUST be accessible to every partner who is expected to compile the project. In this example, our partners will find that they do not have Artifactory installed on their localhost, and you will receive lots of complaints for creating a local-dependent configuration for a department project.

Your second try makes it better:

        <repository>
            <snapshots />
            <id>snapshots</id>
            <name>libs-internal</name>
            <url>http://srvmachine:8080/artifactory/libs-internal</url>
        </repository>

and now, if that machine is accessible to your whole department, and permissions are well configured, and so is the firewall, and half a dozen more settings, your dependencies can be downloaded broadly.
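Instead of repeating the <repositories> block in every pom.xml, there is another well-known Maven mechanism (not used in this series so far): a mirror in each developer's ~/.m2/settings.xml that routes all dependency resolution through Artifactory. The "srvmachine" host follows the example above, and "repo" is Artifactory's default global virtual repository; adjust both to your installation. A sketch:

```
<settings>
    <mirrors>
        <mirror>
            <id>artifactory</id>
            <!-- route every repository request through Artifactory -->
            <mirrorOf>*</mirrorOf>
            <name>Artifactory</name>
            <url>http://srvmachine:8080/artifactory/repo</url>
        </mirror>
    </mirrors>
</settings>
```

The trade-off: the mirror is machine configuration rather than project configuration, so it is not shared through the code repository the way the pom.xml entry is.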

However, as comfortable as downloading dependencies is, you will find it awful to upload your libraries manually through the web interface. This can be improved as well.

With the correct Artifactory or Nexus settings (trying to stay neutral for now), you can automate your system so that a library is uploaded once it is compiled by Maven:

<distributionManagement>
    <repository>
        <id>linux-wdx0</id>
        <name>linux-wdx0-releases</name>
        <url>http://srvmachine:8080/artifactory/libs-release-local</url>
    </repository>
</distributionManagement>


by typing "mvn deploy" instead of "mvn install" in your Maven console or IDE compile button.

Now you can enjoy an easy way to share and download department-scoped libraries, in a new, fashionable way of making things work. Your objective: your project should compile and run just by downloading it on any machine, with zero configuration.