Tuesday, October 21, 2014

The quest for an OSGI friendly web framework: Day 2 => Apache Karaf



Karaf is an OSGI container, but it works on top of Eclipse Equinox or Apache Felix. Mmh... I feel my eyebrows raising as to what it really is, then. The website promises to support what I am after, which is a web framework that works nicely in OSGI.

This blog is one of the articles in a series named Quest for an OSGI friendly Webframework: the goal is a web framework running as OSGI bundles which can be started/stopped etc., and we have 1 day for it!


To work along this exercise, you will need:
- An Eclipse installation (Kepler or Luna).
- Maven Plugin for Eclipse (m2e) and the p2-maven-plugin
- Knowledge of Eclipse PDE Concepts (Features, Plugins, Target Platform, P2 Repositories, Products).

WARNING: This is not a step-by-step instruction; you will need to figure out yourself how to add missing OSGI and other features and bundles to make it all work.


Karaf is amazing; it takes OSGI to the next level. However, there are several issues for now.
First, the EIK tooling is behind the Eclipse releases, or, as the bug report points out, it's EIK in combination with later Karaf versions. And as it turns out, tooling is essential, as our Eclipse bundles don't fit as Karaf features. It would be nice if Karaf could process P2 repositories, but that's not the case. It's very Maven centric, as is to be expected, and not Eclipse centric. Different worlds again...
As for the objective to deploy a web app: the attempt to use the Eclipse implementation of HTTPService failed. It simply won't load easily in Karaf, and the error I get is still unresolved.


The first question I asked myself after having browsed the Karaf website is: how do I bundle my application with Karaf? The instructions to get going with Karaf are to download it and run it, so why not give it a try:

Christophes-MacBook-Pro:bin Christophe$ ./karaf
        __ __                  ____     
       / //_/____ __________ _/ __/     
      / ,<  / __ `/ ___/ __ `/ /_       
     / /| |/ /_/ / /  / /_/ / __/       
    /_/ |_|\__,_/_/   \__,_/_/        

  Apache Karaf (4.0.0.M1)

Hit '<tab>' for a list of available commands
and '[cmd] --help' for help on a specific command.
Hit '<ctrl-d>' or type 'system:shutdown' or 'logout' to shutdown Karaf.

Hey we are up and running, that's pretty cool already.
So let's try a couple of Karaf commands:

Show the installed features:

karaf@root()> feature:list -i
Name            | Version  | Required | Installed | Repository        | Description                                      
aries-proxy     | 4.0.0.M1 |          | x         | standard-4.0.0.M1 | Aries Proxy                                      
aries-blueprint | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Aries Blueprint                                  
shell           | 4.0.0.M1 |          | x         | standard-4.0.0.M1 | Karaf Shell                                      
shell-compat    | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Karaf Shell Compatibility                        
deployer        | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Karaf Deployer                                   
bundle          | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide Bundle support                           
config          | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide OSGi ConfigAdmin support                 
diagnostic      | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide Diagnostic support                       
instance        | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide Instance support                         
jaas            | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide JAAS support                             
log             | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide Log support                              
package         | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Package commands and mbeans                      
service         | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide Service support                          
system          | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide System support                           
kar             | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide KAR (KARaf archive) support              
ssh             | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide a SSHd server on Karaf                   
management      | 4.0.0.M1 | x        | x         | standard-4.0.0.M1 | Provide a JMX MBeanServer and a set of MBeans in K
wrap            | 0.0.0    | x        | x         | standard-4.0.0.M1 | Wrap URL handler                  

Getting my bundles in there (with EIK)

Now, I am still puzzled how I will package the whole application, or will I have to tell my users to install Karaf and then deploy my bundles? Well, that's not even such a bad idea after all, so let's try to get a couple of bundles into Karaf.

One of the things I look out for is the availability of tooling for Eclipse, and shebang, there is EIK, which stands for Eclipse Integration Karaf, and which can be downloaded and installed.

It's documented here

Not sure what this tooling can do? One hint is to type karaf in the "Quick Access" search box.
EIK comes with a Perspective which shows the Bundles and Services views, also contributed by Karaf.

It also has a wizard to create a project and tell EIK about the Karaf Installation:

Browse to the installation of Karaf and click finish.

Unfortunately, the wizard throws an error (this is the Karaf builder):

java.lang.NoClassDefFoundError: org/eclipse/pde/internal/core/target/provisional/ITargetPlatformService
    at org.apache.karaf.eik.ui.project.KarafProjectBuilder.createTargetPlatform(KarafProjectBuilder.java:332)
    at org.apache.karaf.eik.ui.project.KarafProjectBuilder.fullBuild(KarafProjectBuilder.java:165)
    at org.apache.karaf.eik.ui.project.KarafProjectBuilder.incrementalBuild(KarafProjectBuilder.java:386)
    at org.apache.karaf.eik.ui.project.KarafProjectBuilder.build(KarafProjectBuilder.java:85)
    at org.eclipse.core.internal.events.BuildManager$2.run(BuildManager.java:733)
    at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:42)
    at org.eclipse.core.internal.events.BuildManager.basicBuild(BuildManager.java:206)
    at org.eclipse.core.internal.events.BuildManager.basicBuild(BuildManager.java:246)
    at org.eclipse.core.internal.events.BuildManager$1.run(BuildManager.java:299)
    at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:42)
    at org.eclipse.core.internal.events.BuildManager.basicBuild(BuildManager.java:302)
    at org.eclipse.core.internal.events.BuildManager.basicBuildLoop(BuildManager.java:358)
    at org.eclipse.core.internal.events.BuildManager.build(BuildManager.java:381)
    at org.eclipse.core.internal.events.AutoBuildJob.doBuild(AutoBuildJob.java:143)
    at org.eclipse.core.internal.events.AutoBuildJob.run(AutoBuildJob.java:241)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)

And I am never the first of course: https://issues.apache.org/jira/browse/KARAF-2668

We can still continue though....

The created project will basically link to various folders in the Karaf installation:

According to the documentation of EIK, it should be possible to deploy bundles in the workspace to Karaf with the EIK tooling. Also, there should be a Karaf-specific .target, but the error above suggests the .target was not installed properly...

The EIK documentation (which is not Apache worthy btw; the wording is so bad it hurts!) lists compatibility with Eclipse versions, and I am surprised to see Kepler and Luna missing. (Kepler has been out for more than a year at the time of writing.)

Mmh, EIK is behind, is my conclusion so far...

Let's continue...(This doesn't feel right). 

The next step is to create a launch config. As we likely miss bundles in the target platform, I am curious to see if this will work... another exception:

    at org.apache.karaf.eik.ui.KarafLaunchConfigurationDelegate.fixKarafJarClasspathEntry(KarafLaunchConfigurationDelegate.java:428)
    at org.apache.karaf.eik.ui.KarafLaunchConfigurationDelegate.getClasspath(KarafLaunchConfigurationDelegate.java:187)
    at org.eclipse.pde.launching.AbstractPDELaunchConfiguration.launch(AbstractPDELaunchConfiguration.java:70)
    at org.apache.karaf.eik.ui.KarafLaunchConfigurationDelegate.launch(KarafLaunchConfigurationDelegate.java:246)
    at org.eclipse.pde.launching.OSGiLaunchConfigurationDelegate.launch(OSGiLaunchConfigurationDelegate.java:47)
    at org.eclipse.debug.internal.core.LaunchConfiguration.launch(LaunchConfiguration.java:858)
    at org.eclipse.debug.internal.core.LaunchConfiguration.launch(LaunchConfiguration.java:707)
    at org.eclipse.debug.internal.ui.DebugUIPlugin.buildAndLaunch(DebugUIPlugin.java:1018)
    at org.eclipse.debug.internal.ui.DebugUIPlugin$8.run(DebugUIPlugin.java:1222)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)

Getting my bundles in there (without EIK)

So playing with EIK has been very, very disappointing, and sorry, I am not going back to Indigo to see if it works there... I have the option to not use the EIK tooling, so let's try that.

Now, the instructions on the Karaf website tell me to produce a Maven pom file. Mmh, but I already have OSGI bundles and I want to deploy them; what are my options then?

Reading on, I can deploy a Karaf feature. There is a hot-deploy folder, where I can dump the feature.xml. Unless this gets implemented, I am afraid it will be a manual conversion process for now.

The Karaf feature will contain the references to bundles for this feature.

Here is what it looks like, I purposely do not fill in the bundle section, just to see what happens:

<?xml version="1.0" encoding="UTF-8"?>
<features name="oss2-features" xmlns="http://karaf.apache.org/xmlns/features/v1.2.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://karaf.apache.org/xmlns/features/v1.2.0 http://karaf.apache.org/xmlns/features/v1.2.0">
  <feature name="oss2" version="1.0.0">
  </feature>
</features>

Now, copying this file to the deploy folder of the Karaf installation should get the feature processed.

And indeed, the file is processed; typing log:tail shows that this happened:

2014-10-16 16:35:29,151 | INFO  | -4.0.0.M1/deploy | fileinstall                      | 5 - org.apache.felix.fileinstall - 3.4.2 | Updating bundle feature.xml / 0.0.0
2014-10-16 16:35:29,166 | INFO  | -4.0.0.M1/deploy | fileinstall                      | 5 - org.apache.felix.fileinstall - 3.4.2 | Started bundle: feature:file:/Users/Christophe/Downloads/apache-karaf-4.0.0.M1/deploy/feature.xml
2014-10-16 16:35:29,176 | INFO  | lixDispatchQueue | FeaturesServiceImpl              | 6 - org.apache.karaf.features.core - 4.0.0.M1 | Adding features:
2014-10-16 16:35:29,449 | INFO  | pool-29-thread-1 | FeaturesServiceImpl              | 6 - org.apache.karaf.features.core - 4.0.0.M1 | No deployment change.
2014-10-16 16:35:29,452 | INFO  | pool-29-thread-1 | FeaturesServiceImpl              | 6 - org.apache.karaf.features.core - 4.0.0.M1 | Done.

So now let's populate the bundle section, with a file:// reference to one of my bundles:
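For illustration, a minimal sketch of what the populated feature might look like; the bundle path here is hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<features name="oss2-features" xmlns="http://karaf.apache.org/xmlns/features/v1.2.0">
  <feature name="oss2" version="1.0.0">
    <!-- file: URL pointing at a locally built bundle; the path is hypothetical -->
    <bundle>file:/path/to/com.netxforge.base_1.0.0.jar</bundle>
  </feature>
</features>
```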


The feature is now visible:

oss2                          | 1.0.0                            |          |           | oss2-features           |          

and trying to install it:

karaf@root()> feature:install oss2

Error executing command: Unable to resolve root: missing requirement

[root] osgi.identity; osgi.identity=oss2; type=karaf.feature; version="[1.0.0,1.0.0]"; filter:="(&(osgi.identity=oss2)(type=karaf.feature)(version>=1.0.0)(version<=1.0.0))"

[caused by: Unable to resolve oss2/1.0.0: missing requirement [oss2/1.0.0] osgi.identity; osgi.identity=com.netxforge.base; type=osgi.bundle; version="[,]"; resolution:=mandatory

[caused by: Unable to resolve com.netxforge.base/ missing requirement [com.netxforge.base/] osgi.wiring.package; filter:="(osgi.wiring.package=org.eclipse.emf.common.notify)"]]

So this fails on dependencies (as expected), but I am a bit puzzled why these are not listed; I would expect to see a couple of required bundles. BTW, it took me a bit of puzzling to understand the log,
but what it says is: 1) there is a problem with the oss2 feature, 2) there is a problem with the bundle com.netxforge.base, and 3) there is a problem with the requirement org.eclipse.emf.common.notify.

As an alternative, I am going to use a karaf maven archetype facility to create a bundle:

mvn archetype:generate \
    -DarchetypeGroupId=org.apache.karaf.archetypes \
    -DarchetypeArtifactId=karaf-bundle-archetype \
    -DarchetypeVersion=2.2.5-SNAPSHOT \
    -DgroupId=com.netxforge \
    -DartifactId=com.netxforge.oss2.kbundle \
    -Dversion=1.0-SNAPSHOT

The archetype will create a bundle the Maven way.
That is typically slightly different from the way an Eclipse PDE bundle looks.

Some noticeable differences: the /src folder from Eclipse is now /src/main/java, there is no META-INF folder holding the MANIFEST.MF file, and there is a pom.xml. So, a typical Maven project. The pom is interesting, as it includes the configuration to produce a bundle.
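I won't paste the whole generated pom, but the bundle configuration comes from the Apache Felix maven-bundle-plugin; roughly like this (the instruction values are illustrative):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <extensions>true</extensions>
      <configuration>
        <instructions>
          <!-- Illustrative values; the archetype fills these from the project coordinates -->
          <Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
          <Export-Package>com.netxforge.oss2.kbundle*</Export-Package>
          <Import-Package>*</Import-Package>
        </instructions>
      </configuration>
    </plugin>
  </plugins>
</build>
```

So instead of maintaining a MANIFEST.MF by hand, the manifest is generated at build time from these instructions.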



Next, I ran Run As -> Maven build... with goals 'clean install'.
This produces the well-known target folder with the .jar file in it.

Now trying to install this bundle in Karaf works nicely:

karaf@root()> bundle:list
START LEVEL 100 , List Threshold: 50
ID | State  | Lvl | Version        | Name                            
49 | Active |  80 | 0.0.0          | feature.xml                     
91 | Active |  80 | 0.0.1.SNAPSHOT | com.netxforge.oss2.kbundle Bundle
karaf@root()> bundle:info 91

com.netxforge.oss2.kbundle Bundle (91)

Back to the web 

The web is very much built into Karaf. There is support for classic .WAR files, but also for OSGI based .WAB files. The following instructions tell us how to install the feature.
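For reference, a WAB is essentially a WAR whose MANIFEST.MF additionally carries OSGI bundle headers, most notably Web-ContextPath from the OSGI Web Applications spec. A sketch, with illustrative names:

```text
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.netxforge.oss2.web
Bundle-Version: 1.0.0
Web-ContextPath: /oss2
Import-Package: javax.servlet,javax.servlet.http
```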

The log in karaf will show this:

2014-10-16 21:23:48,024 | INFO  | pool-69-thread-1 | Server                           | 73 - org.eclipse.jetty.util - 9.0.7.v20131107 | jetty-9.0.7.v20131107
2014-10-16 21:23:48,056 | INFO  | pool-69-thread-1 | JettyServerImpl                  | 87 - org.ops4j.pax.web.pax-web-jetty - 4.0.0 | Pax Web available at []:[8181]
2014-10-16 21:23:48,117 | INFO  | pool-69-thread-1 | ServerConnector                  | 73 - org.eclipse.jetty.util - 9.0.7.v20131107 | Started default@7003e8b7{HTTP/1.1}{}

So here, we see a server running on port 8181. How cool.
Firing up a browser at localhost:8181 indeed shows a Jetty server running and listening.


Problem accessing /. Reason:
    Not Found

Powered by Jetty://

So, the final task is to create a .wab file and deploy it.
But before doing so, there are already a couple of deployable web features.

feature:install webconsole

Navigating to the url: localhost:8181/system/console/gogo
gives us this:

Creating a Servlet and Karaf DS

Now, with Eclipse Equinox, I use Declarative Services with annotations.
I wanted to do the same with a Karaf (or Maven pom-first) bundle to reference the HTTPService, but this isn't a trivial thing. This entry explains it, and I will dive into it later on.

What Karaf recommends, however, is a .wab deployment. A .wab is like a .war, but then for OSGI.
There is some tooling for a .wab in the Virgo project, as we can see here. That's cool, but do I really need to install more tooling, and why isn't this part of EIK? And many more questions...

So, in any event, I want to use this package: org.osgi.service.http (BTW, the full OSGI 4.2 API can be found here).
Now, I need to find an implementation for it, and that's an issue with pom-first driven development. (I am used to Eclipse; for getting dependencies, I get a P2 location and bang, it's in my .target to compile against and use in features and products.) For Maven-driven development, I need to look for an implementation on Maven Central. So I type in org.osgi.service.http, but nothing shows.

So, I know there is Equinox and there is Apache Felix (...and yes there are a couple more).

For Apache Felix, I can use these instructions, but for Equinox I can't do Maven first. (Please prove me wrong!) The reason is that Equinox is Eclipse, and Eclipse is P2, not Maven. (Wow, that's blunt, but very true.) Compare it with the Apache Felix instructions and you will see what I mean.

So, first I will deploy an Equinox implementation of HTTPService in Karaf, built manifest-first with Tycho, and next I will deploy an Apache Felix implementation using pom-first.

Equinox HTTPService: 

  1. I create a plugin project, configure it as a Maven project and update pom.xml with the Tycho stuff. (I have a parent pom.xml which sets the target platform to use, etc.)
  2. Add it to a Karaf feature, which I dump in the /deploy folder.
  3. Install the feature (same as above).
 karaf@root()> bundle:list
START LEVEL 100 , List Threshold: 50
ID | State  | Lvl | Version            | Name                 
49 | Active |  80 | 0.0.0              | feature.xml          
92 | Active |  80 | 3.0.0              | Apache Karaf :: Manual
99 | Active |  80 | | OSS2 HTTP Service    

Perfect, the bundle gets installed, but I haven't added the HTTPService dependency yet, so let's do that. The instructions are here, but I don't want to use the extension registry, so I follow a regular HTTPService registration pattern.

The service looks like this (it uses OSGI annotations and registers a servlet to respond to HTTP GET requests):

@Component
public class WebDude {

    private HttpService httpService;

    @Activate
    public void activate() {
        try {
            httpService.registerServlet("/dudeme", new WebDudeServlet(), null, null);
        } catch (Exception exception) {
            exception.printStackTrace();
        }
    }

    @Reference
    public void setHTTPService(HttpService httpService) {
        this.httpService = httpService;
    }

    class WebDudeServlet extends HttpServlet {
        private static final long serialVersionUID = 1L;

        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.getWriter().write("I am dude");
        }
    }
}

And then I build, and note the dependencies which Karaf will need to know about.
The jars are in the Eclipse bundle pool. (Remember, this is an Eclipse Tycho build which uses a target platform.)

The remaining challenge is to get these dependencies into a Karaf feature, or to add the bundles manually.
(It dawns on me that a bundle pool scanner could populate Karaf automagically with a custom OBR, mmhhh...)

So I decide to reference the bundles from the Eclipse installation bundle pool in a Karaf feature and
install the feature:
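The feature file itself isn't shown here; as a sketch, it would reference the bundle-pool jars directly. The feature name and paths are hypothetical; the bundle versions are the ones from my target platform:

```xml
<feature name="oss2-http-deps" version="1.0.0">
  <!-- Paths into the Eclipse bundle pool are hypothetical -->
  <bundle>file:/path/to/bundle-pool/plugins/javax.servlet_3.0.0.v201112011016.jar</bundle>
  <bundle>file:/path/to/bundle-pool/plugins/org.eclipse.osgi_3.9.1.v20140110-1610.jar</bundle>
  <bundle>file:/path/to/bundle-pool/plugins/org.eclipse.osgi.services_3.3.100.v20130513-1956.jar</bundle>
</feature>
```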

This looks ok....

karaf@root()> bundle:list
START LEVEL 100 , List Threshold: 50
 ID | State    | Lvl | Version                | Name                                                
 49 | Active   |  80 | 0.0.0                  | feature.xml                                         
 92 | Active   |  80 | 3.0.0                  | Apache Karaf :: Manual                              
110 | Active   |  80 |     | OSS2 HTTP Service                                   
111 | Active   |  80 | 3.0.0.v201112011016    | Servlet API Bundle                                  
112 | Resolved |  80 | 1.0.401.v20130327-1442 | Transformer Hook Framework Extension, Hosts: 114    
113 | Resolved |  80 | 1.0.200.v20130327-1442 | Aspect Weaving Hooks Plug-in (Incubation), Hosts: 114
114 | Resolved |  80 | 3.9.1.v20140110-1610   | OSGi System Bundle, Fragments: 113, 112             
115 | Active   |  80 | 3.3.100.v20130513-1956 | OSGi Release 4.2.0 Services

but org.eclipse.osgi fails to start...maybe not a good idea to start org.eclipse.osgi on Karaf then?

2014-10-20 16:51:00,723 | INFO  | ool-206-thread-1 | FeaturesServiceImpl              | 6 - org.apache.karaf.features.core - 4.0.0.M1 |   org.eclipse.osgi / 3.9.1.v20140110-1610
2014-10-20 16:51:00,724 | ERROR | ool-181-thread-2 | FeatureDeploymentListener        | 24 - org.apache.karaf.deployer.features - 4.0.0.M1 | Unable to install features
org.apache.karaf.features.internal.util.MultiException: Error restarting bundles
    at org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:765)[6:org.apache.karaf.features.core:4.0.0.M1]
    at org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:951)[6:org.apache.karaf.features.core:4.0.0.M1]
    at org.apache.karaf.features.internal.service.FeaturesServiceImpl$1.call(FeaturesServiceImpl.java:857)[6:org.apache.karaf.features.core:4.0.0.M1]
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)[:1.7.0_07]
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)[:1.7.0_07]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)[:1.7.0_07]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)[:1.7.0_07]
    at java.lang.Thread.run(Thread.java:722)[:1.7.0_07]
Caused by: org.osgi.framework.BundleException: Activator start error in bundle org.eclipse.osgi [114].
    at org.apache.felix.framework.Felix.activateBundle(Felix.java:2204)[org.apache.felix.framework-4.4.1.jar:]
    at org.apache.felix.framework.Felix.startBundle(Felix.java:2072)[org.apache.felix.framework-4.4.1.jar:]
    at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:976)[org.apache.felix.framework-4.4.1.jar:]
    at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:963)[org.apache.felix.framework-4.4.1.jar:]
    at org.apache.karaf.features.internal.service.FeaturesServiceImpl.startBundle(FeaturesServiceImpl.java:1030)[6:org.apache.karaf.features.core:4.0.0.M1]
    at org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:757)[6:org.apache.karaf.features.core:4.0.0.M1]
    ... 7 more
Caused by: java.lang.ClassCastException: org.eclipse.osgi.framework.internal.core.SystemBundleActivator cannot be cast to org.osgi.framework.BundleActivator
    at org.apache.felix.framework.Felix.createBundleActivator(Felix.java:4362)[org.apache.felix.framework-4.4.1.jar:]
    at org.apache.felix.framework.Felix.activateBundle(Felix.java:2149)[org.apache.felix.framework-4.4.1.jar:]
    ... 12 more

Karaf can also be launched with Eclipse Equinox instead of Felix, so let's try that.

So, I tried to leave org.eclipse.osgi out and configure Karaf to use Equinox instead.
The latter is done by configuring this entry:

# Framework selection properties
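That line is just the comment header from Karaf's etc/config.properties; the actual selection entry underneath it looks like this, assuming the property name hasn't changed in the 4.0 line (the default is felix):

```properties
karaf.framework=equinox
```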

Unfortunately this fails with:

Error executing command: Uses constraint violation. Unable to resolve resource org.apache.aries.blueprint.core [org.apache.aries.blueprint.core/1.4.1] because it is exposed to package 'org.osgi.service.framework' from resources org.eclipse.osgi [org.eclipse.osgi_3.9.1.v20140110-1610] and org.eclipse.osgi [org.eclipse.osgi_3.9.1.v20140110-1610] via two dependency chains.

Chain 1:
  org.apache.aries.blueprint.core [org.apache.aries.blueprint.core/1.4.1]
    import: (osgi.wiring.package=org.osgi.service.framework)
    export: osgi.wiring.package: org.osgi.service.framework
  org.eclipse.osgi [org.eclipse.osgi_3.9.1.v20140110-1610]

Chain 2:
  org.apache.aries.blueprint.core [org.apache.aries.blueprint.core/1.4.1]
    import: (&(osgi.wiring.package=org.apache.aries.util.tracker)(version>=1.0.0)(!(version>=2.0.0)))
    export: osgi.wiring.package=org.apache.aries.util.tracker; uses:=org.osgi.service.framework
  org.apache.aries.util [org.apache.aries.util/1.1.0]
    import: (&(osgi.wiring.package=org.osgi.service.framework)(version>=1.0.0)(!(version>=2.0.0)))
    export: osgi.wiring.package: org.osgi.service.framework
  org.eclipse.osgi [org.eclipse.osgi_3.9.1.v20140110-1610]

So, a problem I am having a hard time understanding. A dead end, I am afraid.
I am confident Karaf will run with the default HTTP service (feature:install http), but this will be a dependency I will need to load in the Eclipse IDE to work against. Eclipse targets and Maven-centric solutions are unfortunately still light-years apart. (Tycho eases the way from Eclipse to Maven, but the other way around??)

Custom Deployment after all

And then I stumbled on this:


So with a bit of work, I could set up a distribution, fully configured with my application-specific artifacts.

What about headless builds

So far, I am building Eclipse artifacts with Tycho. This blog entry is about a similar experience.


As Karaf is very much Maven centric, our bundles would need to be deployed to a Maven repository. But that's a later concern.

Thursday, October 16, 2014

The quest for an OSGI friendly web framework: Day 1 => Spring and Spring DM



Springframework has a solid user base and is one of the most established Java web frameworks. It takes care of a lot of things for building enterprise web applications. But then there is OSGI, which is a solid foundation for application modularity.

This blog is one of the articles in a series named Quest for an OSGI friendly Webframework; this first time it is about making a minimal SpringFramework web application run as an OSGI bundle which can be started/stopped etc., and we have 1 day for it!


To work along this exercise, you will need:
- An Eclipse installation (Kepler or Luna).
- Maven Plugin for Eclipse (m2e) and the p2-maven-plugin
- Knowledge of Eclipse PDE Concepts (Features, Plugins, Target Platform, P2 Repositories, Products).

WARNING: This is not a step-by-step instruction; you will need to figure out yourself how to add missing OSGI and other features and bundles to make it all work.

Result - FAILED

This failed. Spring mechanisms do not fit in OSGI. I ran into a class loading issue which doesn't seem resolvable. I also tried Spring DM, but the last version (1.2.1) is not dependency compatible with the Spring 4 framework. A bit more reading taught me it was donated to Eclipse in the Gemini project, known as Gemini Blueprint. It's alive for sure. An evaluation of Gemini Blueprint will have to follow. So after one day, not even an HTTP service was up and running...

Still, readers might be interested in reading about the details and approach of this attempt.


The OSGI container I use is Eclipse Equinox. Alternatives could be Apache Felix, Karaf or another OSGI container. Depending on the OSGI implementation, the way bundles are deployed differs.

What I know well is Eclipse Equinox OSGI and all the tooling for it; I am less familiar with the SpringFramework. One way to run an Eclipse application is by creating a .product. This will reference Eclipse features, which will then reference the OSGI bundles. Now, it seems a good approach to create an Eclipse feature which will contain all the needed SpringFramework bundles.

Make the SpringFramework Bundle available to Eclipse

So the first task at hand is to make sure the SpringFramework bundles are available in the Eclipse target platform, or in the workspace, to be able to add them to a feature. Getting them in the workspace as source code could be done by cloning the source repository, but I decide on another approach, which is to produce a P2 repository which can then be used to populate the Eclipse target platform.

Now it turns out there is this very cool Maven plugin which can produce a P2 repository from bundles available in a Maven repository. The plugin's name is p2-maven-plugin.

Following the instructions, it is possible to produce a P2 repository for the Springframework bundles.
The produced P2 can then be published on an HTTP server or kept locally, for consumption by an Eclipse target definition.
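A sketch of the relevant pom section, based on my reading of the p2-maven-plugin documentation; the plugin version and the exact artifact list are illustrative:

```xml
<plugin>
  <groupId>org.reficio</groupId>
  <artifactId>p2-maven-plugin</artifactId>
  <version>1.1.0</version>
  <executions>
    <execution>
      <id>default-cli</id>
      <configuration>
        <artifacts>
          <!-- groupId:artifactId:version coordinates resolved from Maven Central -->
          <artifact><id>org.springframework:spring-core:3.0.7.RELEASE</id></artifact>
          <artifact><id>org.springframework:spring-context:3.0.7.RELEASE</id></artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Running mvn p2:site should then generate the P2 repository under target/repository, ready to be served over HTTP or referenced as a local site.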

In my case, I have pushed the P2 to a web server:

[Screen shot of the P2?]

Now, I can reference the P2 repository and the target definition will look like this:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- generated with https://github.com/mbarbero/fr.obeo.releng.targetplatform -->
<target name="springframework" sequenceNumber="1413366361">
    <location includeMode="slicer" includeAllPlatforms="true" includeSource="true" includeConfigurePhase="false" type="InstallableUnit">
      <unit id="org.springframework.aop" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.asm" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.beans" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.context" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.core" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.expression" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.instrument" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.jdbc" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.orm" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.transaction" version="3.0.7.RELEASE"/>
      <unit id="org.springframework.security.core" version="0.0.0"/>
      <unit id="org.springframework.security.spring-security-crypto" version="0.0.0"/>
      <repository location="http://p2.netxforge.com/oss2/mvn.p2"/>
    </location>
</target>

Now, this is not necessarily the complete list of required Spring Framework bundles, but the basic idea is there.

Now we can load this target platform in Eclipse. (Actually, a more complete TP is required, but for illustration purposes, I only list the springframework repo location in the target above.)

Create an Eclipse feature for the SpringFramework (and its dependencies).

File -> New -> Feature Project
I named it org.springframework.artifacts.feature

To get started easily, I would recommend just adding


...bundles to the feature. Just to see what happens and what can be consumed from this bundle already.

Add the feature to a .product file and run the product.

Assuming you are familiar with building Eclipse .product files (which are feature based), create a .product, then add the feature you just defined plus the Equinox bundles.

From the product editor it is now possible to verify the dependencies, and you might be required to add a couple of missing plugins, like org.apache.commons.logging. The more Springframework bundles we add to the feature, the more third-party dependencies will be required, hé!

Now run the product. (I usually specify -console 8811 in the product launch configuration, so I can get to the OSGI prompt.) Oh, and you will need the osgi.console etc. bundles as well...

[4] What's loaded.

Open a terminal and type telnet 0 8811 to open the OSGI console, and then perform:

osgi> ss org.springframework
"Framework is launched."

id    State       Bundle
16    RESOLVED    org.springframework.core_3.0.7.RELEASE
40    RESOLVED    org.springframework.expression_3.0.7.RELEASE

Now we observe that the springframework bundles are present in our OSGI application, hooray!
They remain in state RESOLVED, as they are not consumed or force-started.

More details can be obtained with

osgi> bundle 16 (See output [1] org.springframework.core bundle details).

From this output we can see which packages this bundle exports and imports. It's interesting to see that the bundle can be activated while some of the imported packages are not available.

For example, the package here is imported by org.springframework.core, but is not exported by any of the bundles.

osgi> packages org.springframework.asm
No exported packages

This is OK, as these are listed as optional (see [1]).

Create an application context

The basic Spring application, which can be obtained from one of the Spring getting-started guides, is an ApplicationContext and a couple of Java classes with Spring annotations.

The guide I followed is this one.

In the example, there is a Java main method, but in OSGI we use bundle activators or declarative services to boot up code. So I have created a bundle, an activator and a DS to do exactly this.

The DS service looks like this (note: the @Component annotation is an OSGI annotation, not a Spring one):

public class SpringService {

    public void activate() {
        ApplicationContext context =
                new AnnotationConfigApplicationContext(SpringApp.class);
        MessagePrinter printer = context.getBean(MessagePrinter.class);
        printer.printMessage();
    }
}

The SpringApp class looks like this:

@Configuration
public class SpringApp {

    @Bean
    MessageService mockMessageService() {
        return new MessageService() {
            public String getMessage() {
                return "Hello World!";
            }
        };
    }
}

This bundle is then packaged with the Eclipse product (as part of a feature), with feature dependencies on the required Spring Framework bundles.

When launching OSGI, the service is activated but explodes with a stack trace (OSGI tries to activate the service several times). Showing the component status gives:


4    Unsatisfied        com.netxforge.oss2.spring.app.SpringService            com.netxforge.oss2.spring.app(bid=25)

More details...

osgi> comp 4
    name = com.netxforge.oss2.spring.app.SpringService
    activate = activate
    deactivate = deactivate
    modified =
    configuration-policy = optional
    configuration-pid = com.netxforge.oss2.spring.app.SpringService
    factory = null
    autoenable = true
    immediate = true
    implementation = com.netxforge.oss2.spring.app.SpringService
    state = Unsatisfied
    properties =
    serviceFactory = false
    serviceInterface = null
    references = null
    located in bundle = com.netxforge.oss2.spring.app_1.0.0.qualifier [25]
Dynamic information :
  The component is satisfied
  All component references are satisfied
  Component configurations :
    Configuration properties:
      component.name = com.netxforge.oss2.spring.app.SpringService
      component.id = 3
    No instances were created because: Can not activate instance of component com.netxforge.oss2.spring.app.SpringService. The activation throws: java.lang.IllegalStateException: Cannot load configuration class: com.netxforge.oss2.spring.app.SpringApp

So, the SpringApp class can not be loaded by Spring. In the stack trace, the following entry is responsible:


After a bit of code inspection and debugging, I realized the class loading mechanism tries to load the class from the Spring bundle spring-context, but without knowledge of my application bundle and the SpringApp class.

The obtained classloader is: Thread.currentThread().getContextClassLoader(); (ClassUtils of Spring).

Now, in OSGI, loading a class should happen using the Bundle, like this:
final Class<?> clazz = bundle.loadClass(clazzName);

This seems the likely cause of the class loading issue. I also thought about buddy class loading policies, but this would require modifying the Spring bundle MANIFEST.MF files, which have been generated and archived, so that doesn't seem the right path.
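One workaround that is sometimes suggested for this class of problem is to temporarily swap the thread context class loader to the application bundle's class loader before creating the ApplicationContext. A minimal sketch of that swap pattern in plain Java (the helper name withTccl is mine, not an OSGI or Spring API):

```java
import java.util.function.Supplier;

public class Tccl {

    // Run a task with the given class loader installed as the thread
    // context class loader, restoring the previous one afterwards.
    public static <T> T withTccl(ClassLoader loader, Supplier<T> task) {
        Thread thread = Thread.currentThread();
        ClassLoader previous = thread.getContextClassLoader();
        thread.setContextClassLoader(loader);
        try {
            return task.get();
        } finally {
            thread.setContextClassLoader(previous);
        }
    }
}
```

In the DS activate() method this would wrap the AnnotationConfigApplicationContext creation with the bundle's own class loader, so Spring's ClassUtils can see the SpringApp class. Whether that is acceptable in a given container is another question; it is a workaround, not a fix.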

Sigh... a dead end here. I could dive into it, but it dawns on me that a monolithic class framework like Spring will never fit well in OSGI unless..

Spring DM / Gemini-Blueprint

So, one of my bad habits is not accepting the situation and moving on. I spent a bit more time scanning the web for more information on the subject, and then I stumbled on Spring DM. "Wow, this is what I needed in the first place" was my first reaction. I also found this very nice article from 2012 about the topic: http://angelozerr.wordpress.com/about/eclipse_spring/

Wow, this is exactly what I want to do. Roll up the sleeves and get going then.
So, I updated my P2 repo to include the Spring DM bundles (which are not actually named DM, by the way), at version 1.2.1.


The bundles showed up, so I added them to my Eclipse feature, but then I got a version error!
What?!? Spring DM is not compatible with Spring Framework 4?

A bit more research revealed that Spring DM was abandoned and handed over to Eclipse to become part of the Gemini project as Gemini Blueprint. The development is active, but I am not sure about the pace, and I am still very much put off by the fact that it doesn't support the current Spring version (4.x). Someone on the forum asked about it, but the answer was "not yet", so "when" remains to be answered.

I don't want to give up on Gemini yet, but time's up. The sun is down, and a clear night sky illuminates my balcony. Time to shut down for the night!

[1] Output from osgi> bundle 16

org.springframework.core_3.0.7.RELEASE [16]
  Id=16, Status=ACTIVE      Data Root=/Users/Christophe/Documents/Spaces/netxstudio/.metadata/.plugins/org.eclipse.pde.core/oss2app.product/org.eclipse.osgi/bundles/16/data
  "No registered services."
  No services in use.
  Exported packages
    org.springframework.core; version="3.0.7.RELEASE"[exported]
    org.springframework.core.annotation; version="3.0.7.RELEASE"[exported]
    org.springframework.core.convert; version="3.0.7.RELEASE"[exported]
    org.springframework.core.convert.converter; version="3.0.7.RELEASE"[exported]
    org.springframework.core.convert.support; version="3.0.7.RELEASE"[exported]
    org.springframework.core.enums; version="3.0.7.RELEASE"[exported]
    org.springframework.core.io; version="3.0.7.RELEASE"[exported]
    org.springframework.core.io.support; version="3.0.7.RELEASE"[exported]
    org.springframework.core.serializer; version="3.0.7.RELEASE"[exported]
    org.springframework.core.serializer.support; version="3.0.7.RELEASE"[exported]
    org.springframework.core.style; version="3.0.7.RELEASE"[exported]
    org.springframework.core.task; version="3.0.7.RELEASE"[exported]
    org.springframework.core.task.support; version="3.0.7.RELEASE"[exported]
    org.springframework.core.type; version="3.0.7.RELEASE"[exported]
    org.springframework.core.type.classreading; version="3.0.7.RELEASE"[exported]
    org.springframework.core.type.filter; version="3.0.7.RELEASE"[exported]
    org.springframework.util; version="3.0.7.RELEASE"[exported]
    org.springframework.util.comparator; version="3.0.7.RELEASE"[exported]
    org.springframework.util.xml; version="3.0.7.RELEASE"[exported]
  Imported packages
    org.apache.commons.logging; version="1.1.1"<org.apache.commons.logging_1.1.1.v201101211721 [11]>
    org.xml.sax.helpers; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    org.xml.sax.ext; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    org.xml.sax; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    org.w3c.dom; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    org.eclipse.core.runtime; version="3.4.0"<org.eclipse.equinox.common_3.6.100.v20120522-1841 [1]>
    org.apache.log4j.xml; version="1.2.15"<org.apache.log4j_1.2.15.v201012070815 [28]>
    org.apache.log4j; version="1.2.15"<org.apache.log4j_1.2.15.v201012070815 [28]>
    javax.xml.transform.stax; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    javax.xml.transform.sax; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    javax.xml.transform; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    javax.xml.stream.util; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    javax.xml.stream.events; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    javax.xml.stream; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    javax.xml.namespace; version="0.0.0"<org.eclipse.osgi_3.9.1.v20140110-1610 [0]>
    org.aspectj.bridge; version="[1.5.4,2.0.0)"<unwired><optional>
    org.aspectj.weaver; version="[1.5.4,2.0.0)"<unwired><optional>
    org.aspectj.weaver.bcel; version="[1.5.4,2.0.0)"<unwired><optional>
    org.aspectj.weaver.patterns; version="[1.5.4,2.0.0)"<unwired><optional>
    org.jboss.vfs; version="[3.0.0,4.0.0)"<unwired><optional>
    org.jboss.virtual; version="[2.1.0.GA,3.0.0)"<unwired><optional>
    org.springframework.asm; version="[3.0.7,3.0.8)"<unwired><optional>
    org.springframework.asm.commons; version="[3.0.7,3.0.8)"<unwired><optional>
  No fragment bundles
  Named class space
    org.springframework.core; bundle-version="3.0.7.RELEASE"[provided]
  No required bundles

The quest for an OSGI friendly web framework

In a series of blog posts I am investigating the available Java web frameworks and how well they play in OSGI. I am spending exactly one day with each of them.


NetXStudio is the application I have been working on for the last few years. It allows telecommunications service providers to collect performance data from network components and resources, aggregate the data, and perform calculations and analysis on it. The main business goal is to act in a timely manner on congestion in the network.

It's an Eclipse RCP application with a server backend. The communication goes through Eclipse CDO, which is an ORM solution on steroids. (Besides mapping Java objects to relational databases, it also works with non-relational DBs. On top of that, CDO has an advanced 'audit' mode which allows going back in time and fetching a historical object graph; it's sort of ORM+GIT in one solution.) Server communication happens through the CDO communication protocol named Net4J. Additional communication has been implemented through some HTTP calls and a simple servlet implementation which kicks off some processes. Here the Eclipse Equinox HTTP bundles are used.

Both client and server are built with Eclipse and grouped in OSGI bundles. On top of that, the Eclipse Equinox implementation provides additional functionality to deploy with products and features.

So what's next

So far so good: the application is in production and the customer is mostly happy. Mostly, because some functionality is still lacking, but also because of some impracticalities, like not being web-based.

As with many projects, at some point there is time for reflection and planning for the future. In the case of NetXStudio, the idea is to bring the application to a wider audience by making it a modular platform with re-usable software components. Now, with OSGI this is a breeze: applying OSGI Declarative Services allows true discovery of services, service chaining and much more. With little effort most of the server-side functionality was modularized, and services can come and go.

There is also a wish to make the UI fully web-based.

OSGI Ready web-framework 

So for a web-based UI, a web framework is needed, with the capability to play nicely with OSGI.
Now, I don't really want to bother with the definition of "playing nicely"; that would take up some time. I believe it's better to take 'them' for a spin and see what comes out.

The following solutions are explored:
  1. Springframework and Spring-DM
  2. Apache Karaf
  3. PAX Web 
  4. Equinox HTTP Services 
  5. Gemini-Blueprint 
  6. Vert.X (Added 17-12-2014)
  7. ... (The list will grow over time I believe). 

Setting the scene

So, this is a rather big task I am committing myself to. Here are a couple of rules to make the comparison fair and the exercise fun and painless.
  1. Get the framework up and running in one day.  
  2. Should run on Eclipse Equinox. 
  3. Have some interaction with my application bundles. (Serve objects to the web UI). 
  4. Give some feedback on deployment and configuration. 

Monday, December 30, 2013

How Edapt Works

The gory details of Edapt migrations

Edapt is an Eclipse technology which enables 'coupled' evolution of an Ecore metamodel and instances of the model. Coupled evolution means evolving the metamodel (.ecore) and the model instances hand in hand.

As we will see, Edapt implements the concept of coupled evolution in a very sophisticated way. Edapt provides great flexibility with many re-usable model migration Operations. Additionally, custom migration Operations are also supported.

Why this blog

This blog explains how the Edapt Migrator works (it's not an Edapt tutorial!). The reason I wrote this blog is that I am one of the developers of Edapt, and as such I investigated how Edapt works and made many notes. Not sure where to store this study, I thought others could benefit, because the migration process is rather complex and large and can easily daze you. As it turned out for me, understanding the inner workings also helped me understand when a custom migration Operation is needed and how to implement it.


Experience with EMF is recommended. Edapt literally constructs Ecore metamodels back and forth, so understanding Ecore is key. There are many tutorials; here is one of them.

Experience with the Edapt History editor and Operations Viewer is the primary step towards model migrations. The Edapt tutorial is a must.

(Note: Edapt can now be installed on the latest Eclipse release named Kepler with this plugin repository ).


  • What is the Migrator?
  • Migrator concepts.
  • Migration example dissected.
  • Migrator in details.

What is the Migrator?

The Edapt documentation explains the Edapt migrator here. What it teaches us is how to contribute a Migrator and how to execute a migration, e.g. by extending editor code to detect if a migration is needed and actually performing it. It doesn't tell us how it works, and that is what this blog is about.

To understand how the Migrator works, I will explain the various concepts and show how it acts upon a sample metamodel and model instance.

What happens in a migration process can be summarized as:
  1. Process a History by visiting its releases and changes.
  2. Create an inner mapping from the original metamodel to a constructed target metamodel in memory, which we will call the reference metamodel.
  3. Load a model instance (through a converter) with the corresponding metamodel of a release, if it's not the target release yet. (Effectively, determine if migration of a model is required based on its release.)
  4. Migrate the model instance and reference metamodel by applying primitive and operation changes.
  5. When the target release is reached, finish the process and persist (save) the model back to a target resource (a file identified by a URI).
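The steps above can be pictured with a toy sketch of the release walk (paraphrased names, not Edapt's actual API): the model 'joins' the walk at its own release, and every subsequent release's changes are then replayed on it.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Toy sketch of the release walk: a history maps release numbers to their
// changes; the model is picked up at its release, and all later changes
// are applied until the end of the history (the target release).
public class MigrationWalk {

    public static List<String> migrate(Map<Integer, List<String>> history,
                                       int modelRelease) {
        List<String> applied = new ArrayList<>();
        boolean modelLoaded = false;
        for (Map.Entry<Integer, List<String>> release : history.entrySet()) {
            if (modelLoaded) {
                applied.addAll(release.getValue()); // migrate the instance
            }
            if (release.getKey() == modelRelease) {
                modelLoaded = true;                 // load model at its release
            }
        }
        return applied; // the changes the model instance had to replay
    }
}
```

A model already at the last release replays nothing, which is exactly the "no migration required" case of step 3.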

The Edapt Migration State Model

One key aspect to understand in this approach is that various metamodels are maintained during the migration process.
  1. The original .ecore metamodel, which is referenced by the changes in the History.
  2. The 'constructed' metamodel in memory, which is in the state of the changes applied to it. This metamodel can be identified as the 'reference' metamodel: it is the reference for applying metamodel changes.
  3. The 'constructed' metamodel referenced by the model instances. (See MMMeta.)

Migrator concepts

Edapt defines two abstractions to perform the migration. These are:
  • Metamodel Migration model (MMMeta)
  • Model Migration model (MMModel)
Both the MMMeta and the MMModel are bound together in a Repository for easy access.
Then there are the Reconstructors and Converters.

A Reconstructor processes the History model and reconstructs a certain release, going forward or backward through the releases in the history. The reconstruction process acts on the mapping and the reference metamodel in the case of primitive changes which do not affect a model instance. It also acts on the MMMeta and MMModel when the actual model instance is impacted.

Note: The reconstruction process is also available through the Edapt UI acting on an Ecore editor. It will in this case only reconstruct the metamodel and not the actual model instance.

Converters are required to convert a model back and forth between an MMMeta/MMModel and an EMF ResourceSet. Finally, Edapt deals with persistence through a utility class named Persistency, which we will discuss as well.


Migration Example dissected

Here we follow an example which is part of the Edapt tests. The example Model and Meta model can be obtained from here.

History Model

Release 1 (Which is only the metamodel definition).

<releases date="2008-11-23T22:45:42.562+0100">
<changes xsi:type="history:Create" element="component.ecore#/">
(1)        <changes xsi:type="history:Set" element="component.ecore#/" featureName="name"
(2)        <changes xsi:type="history:Set" element="component.ecore#/" featureName="nsURI"
(3)        <changes xsi:type="history:Set" element="component.ecore#/" featureName="nsPrefix"

etc.... (The rest of Release 1 further builds up the metamodel.)

Release 2 (Which actually starts to change Release 1, Model instances conforming to Release 1 will be migratable with Operations from Release 2). 

 <releases date="2008-11-23T22:49:28.078+0100">
    <changes xsi:type="history:MigrationChange" migration="org.eclipse.emf.edapt.tests.migration.custom.ComponentSignatureCustomMigration"
      <changes xsi:type="history:OperationChange">
        <changes xsi:type="history:Create" target="component.ecore#/" referenceName="eClassifiers"
          <changes xsi:type="history:Set" element="component.ecore#//InPort" featureName="name"
          <changes xsi:type="history:Add" element="component.ecore#//InPort" featureName="eSuperTypes"
        <operation name="newClass">
          <parameters name="ePackage">
            <referenceValue element="component.ecore#/"/>
          <parameters name="name">
          <parameters name="superClasses">
            <referenceValue element="component.ecore#//Port"/>


...or as a screenshot from the History editor:

Following along

Here we illustrate the migration process step by step.  

Release 1 (Here the meta model is constructed).

1. caseCreate()

Change => Create (EPackage)
  • Create a new EPackage.
  • Add it to the MetamodelExtent cache.
  • Add a mapping between the EPackage in the Create change and the newly created EPackage. (Later on, when loading the MMMeta, the extent is used to get the EPackage.)

2. caseSet()

  • Get the target element from the Set change.
  • Get the equivalent from the mapping definition.
  • Set the attribute (feature) on the target element.
etc... continued construction of the metamodel

Release 2 ( Here the model is also migrated, as we use Operations)

Change => Custom Migration

2.1 startChange() which delegates to the MigrationReconstructor.

  • caseMigrationChange()
  • load the Custom Migration (In our case "ComponentSignatureCustomMigration")
  • call migrateBefore() only (only migrateAfter() is implemented by this custom migration, so nothing really happens here).
2.2 calls the switch again, but without delegating to other reconstructors.
  • caseMigrationChange() => null (EcoreFwReconstructor returns null).
2.3 .. back in the ForwardReconstructor
  • Iterate over the MigrationChangeChildren()
  • caseOperationChange
  • creates a copyResolve OperationInstance from the ResolverBase (it copies and resolves Ecore elements from the mapping).
  • Converts the OperationInstance to an OperationImplementation.
  • calls OperationImplementation instance's checkAndExecute( ) with the model and metamodel
  • We end up in the OperationImplementation, which is NewClass for this operation; it calls the MetamodelFactory to create the class with all the operation parameters.

2.x endChange() which delegates to the MigrationReconstructor 


When the target release is reached, the MMModel is converted back to a valid model instance which conforms to the target release of the Ecore metamodel, completing the migration process.

Migrator details

We explain the various Migrator concepts in more depth.



The metamodel migration model (MMMeta) is a migration-specific representation of the .ecore metamodel.

MMMeta Instance creation

The MMMeta instance is created with the MetamodelExtent's corresponding EPackage for a model nsURI. The MMMeta instance is then available for the migration process, for example to load a model instance with the correct .ecore.


The model migration model (MMModel) is a migration-specific representation of a model instance conforming to one of the releases of a metamodel.

The basic idea is to group instances, attributes and references together, so a change (one of the changes which affects the MMModel) can easily iterate over the Instances and change, for example, the Type of an Instance in the MMModel. Later on, as we will see with the converters, the 'migrated' MMModel is serialized back into a regular Ecore model instance and can be persisted.

The following entities exist (somewhat simplified):
  • Model => the MMModel.
  • Instance => each EObject in the actual model instance has an Instance with a Type.
  • Type => each Instance has a Type. A Type has an eClass which corresponds to the original eClass of the EObject.
  • AttributeSlot => each attribute in an EObject has an AttributeSlot with the EStructuralFeature and EJavaObject values.
  • ReferenceSlot => each reference in an EObject has a ReferenceSlot with the EStructuralFeature and the Instance pointed to by this ReferenceSlot.
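These entities can be pictured with a small illustrative sketch in plain Java (these are not Edapt's actual classes, just the shape of the data):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Simplified picture of the MMModel entities described above: each Instance
// carries a Type plus attribute and reference slots keyed by feature name.
public class MMModelSketch {

    public static class Type {
        public String eClassName;
        public Type(String eClassName) { this.eClassName = eClassName; }
    }

    public static class Instance {
        public Type type;
        public final Map<String, Object> attributeSlots = new LinkedHashMap<>();
        public final Map<String, List<Instance>> referenceSlots = new LinkedHashMap<>();
        public Instance(Type type) { this.type = type; }
    }

    public static class Model {
        public final List<Instance> instances = new ArrayList<>();

        // A change can iterate all instances of one type and retype them,
        // mimicking what a metamodel change does to the model instance.
        public void retype(String fromEClass, Type to) {
            for (Instance i : instances) {
                if (i.type.eClassName.equals(fromEClass)) {
                    i.type = to;
                }
            }
        }
    }
}
```

The retype operation shows why the grouping matters: attribute and reference slots travel with the Instance untouched while its Type is swapped.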


As said, the reconstructors take the .history and allow 'building' a certain release of the metamodel and instance model. The Migrator uses the EcoreForwardReconstructor (as models typically age and need to be migrated forward).

Now, the EcoreForwardReconstructor extends the CompositeReconstructorBase, which delegates the reconstruction to one or more reconstructors declared with it. This is the typical delegation pattern, allowing the reconstruction process to be extended, and it is exactly the way the Migrator works: the Migrator adds the MigrationReconstructor to the EcoreForwardReconstructor, so delegation happens when needed.

As we will see in the Reconstruction Process, at some point in the reconstruction we will hit a Change. This definition comes in many forms (many types of changes). In order to act appropriately on the Change type, the reconstructor typically implements a model object Switch.

In the Edapt case the History code generation produced the HistorySwitch which is extended by the various reconstructors to perform the appropriate action.

The MigrationReconstructorSwitch for example deals with specific Change implementations like an OperationChange and a MigrationChange to add or delete from the MMMeta or MMModel.

Mapping and resolving

Whenever a migration kicks in, it creates a Mapping instance, which is initialized through all reconstructors and delegated reconstructors via the init(...) method of a reconstructor.

The Mapping contains a TwoWayIdentityHashMap for mapping EObjects to each other.
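The idea of a two-way identity map can be sketched with two stdlib IdentityHashMaps (an illustration of the concept, not Edapt's implementation): lookups work in both directions and are based on object identity, not equals().

```java
import java.util.IdentityHashMap;
import java.util.Map;

// Two identity maps kept in sync: source -> target and target -> source.
public class TwoWayIdentityMap<A, B> {

    private final Map<A, B> forward = new IdentityHashMap<>();
    private final Map<B, A> backward = new IdentityHashMap<>();

    public void put(A source, B target) {
        forward.put(source, target);
        backward.put(target, source);
    }

    public B getTarget(A source) { return forward.get(source); }

    public A getSource(B target) { return backward.get(target); }
}
```

Identity semantics are the point here: two Ecore elements that happen to be structurally equal must still map to their own distinct equivalents.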

One of the usages is to map Change elements to their created in-memory equivalents (the Ecore metamodel elements).

In this case the history model is visited, starting with the initial Release. From this Release, the Ecore metamodel, in the form of one or more EPackages, is gradually built up to become the first release as intended by the history (with corresponding nsURIs).

Then, in subsequent releases and underlying changes, the reference EPackage is adapted gradually. When migrating the actual model instances (which is only applicable for some changes), the model instance references to the Ecore model artifacts (EClass, EReference, EAttribute) are resolved from this very same mapping. It is therefore absolutely key that the MMMeta and MMModel are loaded with the same EPackage from the 'extent'.


When adding a new feature to a class, the target element from the Change is looked up in the mapping, and the new feature is added to the EClass (in the mapping).

Alongside the Mapping utility there is a ResolverBase class to resolve elements from the mapping. The ResolverBase has a special method named copyResolve.

For an OperationChange in the History, it performs a copy of the model element while resolving from the mapping at the same time. It descends the hierarchy of features and resolves when an EClass's package is of type EcorePackage.

Effectively, this means that when the OperationChange contains a metamodel definition, the resolver resolves it from the mapping, making sure the constructed Ecore metamodel is used.


There are two converters:
  • ForwardConverter => converts a ResourceSet to an MMModel.
  • BackwardConverter => converts an MMModel back to a ResourceSet.

The MMModel is populated in this order:

initElements(); (EClassifiers etc.)
initProperties(); (EAttribute => AttributeSlot, EReference => Slot / ReferenceSlot)


The ResourceSet and its Resources are loaded in this order:

ResourceSet resourceSet = initResources(model);


Persistence is handled through a utility class named Persistency. This utility is specialized to deal with the situation whereby loading and saving of model instances respects the metamodel version for which the model should be loaded/saved.

One aspect of dealing with XMI-serialized models is the potentially dynamic nature of a Resource load implementation. EMF supports the dynamic creation of an EPackage based on the schemaLocation attribute. The schemaLocation attribute will potentially point to an instance of the .ecore which was constructed at a certain release of the history, so loading the resource will auto-create an EPackage for that release. An EObject will then have an eClass with a parent EPackage for a certain release of the history.

This is important, as the various reflective functions which act on the EPackage should act on the exact intended EPackage 'version'. I ran into this when trying to copy a loaded model with EcoreUtil (to load a copy in another Resource). The model had an EPackage which corresponded to the latest release, while the model itself was serialized with a previous release.

Edapt has encountered this issue and deals with it in the following manner:

When loading a model, Edapt provides the EPackage to use from the MMMeta instance (see MMMeta instance creation). The EPackage is mapped to the nsURI of the model in the EPackageRegistry of the ResourceSet, so when the resource is loaded, it consults the EPackageRegistry and uses that EPackage instead of dynamically creating one.
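The registry lookup can be illustrated with a small stdlib sketch (EPackageStub and ResourceLoaderSketch are made-up names, not EMF classes): the loader consults a registry keyed by nsURI first and only falls back to dynamic creation when nothing is registered.

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of the registry trick: a registered, release-specific package
// wins over a dynamically created one.
public class ResourceLoaderSketch {

    public static class EPackageStub {
        public final String nsURI;
        public final int release; // -1 means "dynamically created"
        public EPackageStub(String nsURI, int release) {
            this.nsURI = nsURI;
            this.release = release;
        }
    }

    public final Map<String, EPackageStub> packageRegistry = new HashMap<>();

    public EPackageStub packageFor(String nsURI) {
        EPackageStub registered = packageRegistry.get(nsURI);
        if (registered != null) {
            return registered;                  // use the provided EPackage
        }
        return new EPackageStub(nsURI, -1);     // auto-created fallback
    }
}
```

In EMF itself the registration happens on the ResourceSet's package registry, keyed by the model's nsURI, before the resource is loaded.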

Reconstruction Process

The reconstruction process 'visits' the History model hierarchy; it has hooks for the start and end of Releases and Changes.

When descending the history to the intended release, the reconstructor delegates the change call (CompositeReconstructor) to the MigrationReconstructorSwitch, which 'switches' on the Change.

At a determined point the MigrationReconstructor loads the 'Model' model and a 'Metamodel' instance. This happens when the end of a Release is reached which is not the targeted release.

If the change is one of the types CompositeChange, MigrationChange or InitializerChange, then the ForwardReconstructor also reconstructs the children of the specialized Change instance.

The reconstruction process can be represented as:

startHistory History
    startRelease Release
        for Release.changes()
            startChange Change
                startChange (CompositeReconstructor).
                switch Change
            endChange change
    endRelease Release => (if the Release is not the target release, load the model; see MigrationReconstructor)
endHistory History


The switches process the following Change types:




The Edapt Migrator and its concepts have no more secrets! We explored the concepts and how they work. The migration process, which couples the metamodel and the model, is quite impressive. In subsequent posts on Edapt I will elaborate on how to work with a non-XMI, Resource-based Persistency, for example CDO.