Hands-on Review of Bands in iReport

Posted by Esteban Martinez on August 19, 2014  /   Posted in Online Platform, Web Development

Here at nextCoder, we regularly use the JasperReports reporting engine and the iReport report designer tool in the systems we build for our clients. One of the issues I’ve noticed is that the iReport Ultimate Guide has lots of great technical info but doesn’t address most of the issues that come up in my real-world usage of the tool.

So I’ll be writing about some of the tips and tricks we here at nextCoder use with iReport. There’s a lot to cover, so I’ll deliver the goods in 3 separate articles. Let’s get part 1 started by understanding the basics of all those report bands within the template, which opens the door to some creative reporting!

General Assumption
I assume that you are somewhat familiar with how iReport works, so I will be glossing over some terminology and concepts that I consider basic.  If you would like me to go deeper on any of these, that’s what the comments box is for down there.

Title Band:
Meant for the title of the report. Intended to only print once and lacks any iteration properties.

Page Header:
Meant for printing on every page. Designed for keeping track of where you are in a multi-page report, or for appearing on a page when a particular trigger has been fired. For example, if you have an accounting report for business expenses, there will be a section for employee costs. So in the header, you would show that this page is reporting employee costs. Then when the report changes to reporting the utility expenses, the header would change to reflect this.

I usually use Static Text boxes for the Page Header. A variable from a parameter could be used; however, if your information changes mid-page, the Page Header will not show this. It’s often simpler to make a new sub-report if you have new Page Header information.

So in our example, the report would finish somewhere on the page reporting the cost of employees and leave the rest of the page blank. Then the new page would start with the new Page Header and start showing the costs of utilities. An easy way to ensure that a sub-report will start on its own page is to make sure that the actual sub-report (NOT the sub-report element ‘graybox’) has a page height of a full page. For 8.5 x 11″ this is 792. You can see this information by selecting the name of the report at the top of the list on the left-hand side.

A secondary use of the Page Header is as a notification or alert. An example would be a project exceeding its budget. In this case you would not always want the band to print. We can control this with the PrintWhen property of the band.

So the expression would look similar to:

$F{current_expense} > $P{project_budget}

Put your notification text in the text field, and now this message will only print when the project has exceeded its budget. It’s also a great technique for flagging inventory that is too high or too low.
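As a rough sketch, the JRXML behind such a Page Header could look like the following (the band height, element positions, and warning text are placeholders; the field and parameter names are the ones from the example above):

```xml
<pageHeader>
    <band height="40">
        <!-- The band only prints when this expression evaluates to true -->
        <printWhenExpression><![CDATA[$F{current_expense} > $P{project_budget}]]></printWhenExpression>
        <staticText>
            <reportElement x="0" y="0" width="555" height="30" forecolor="#FF0000"/>
            <text><![CDATA[WARNING: This project has exceeded its budget!]]></text>
        </staticText>
    </band>
</pageHeader>
```

In iReport itself you would normally set this through the band’s PrintWhen property rather than editing the XML by hand.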

We here at nextCoder love the PrintWhen control!

TIP: There is a PrintWhen control for individual text boxes, for a frame, and for the entire band. This can be a real life-saver if your report has to change dynamically in accordance with user choices/selections.

Column Header:
Simple Static Text boxes that print at the top of a column and reprint on each following page.

TIP: If you drag a field from the list on the left hand side of iReport into the detail band, a corresponding column header will be generated automatically in this band.


Detail Band:
This is where the majority of your report will take place. The most important aspect of the Detail Band is that it iterates through data automatically. So if you have a field expression like $F{employee}.getSalary(), it will access that list of employee salaries and keep listing them until it reaches the end.

NOTE: If you are using a sub-report, the band in which the actual sub-report is processed determines the iterating behavior. So if the parent report wants a complete list of a set in the title band (or any non-iterating band), place the sub-report element in the title band, but make sure that the child (the sub-report itself) uses its own detail band to iterate through the data.
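For instance, a sketch of a sub-report element placed in the parent’s title band might look like this in the JRXML (“employee_list.jasper” is a hypothetical compiled sub-report; $P{REPORT_CONNECTION} is the built-in parameter that passes the parent’s database connection along):

```xml
<title>
    <band height="200">
        <subreport>
            <reportElement x="0" y="0" width="555" height="200"/>
            <!-- Re-use the parent report's JDBC connection -->
            <connectionExpression><![CDATA[$P{REPORT_CONNECTION}]]></connectionExpression>
            <subreportExpression class="java.lang.String"><![CDATA["employee_list.jasper"]]></subreportExpression>
        </subreport>
    </band>
</title>
```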

Column Footer:
This band is mainly for keeping a total for a column on each page. It can also be used for keeping a running total for that column when the report spans several pages.

Here at nextCoder, we typically create a variable for this and give it a Calculation of Sum and a Reset Type of Report. This gives us a running total for the report on each page. Use the Increment Type to control when the variable is updated; so if I want the new total at the bottom of every page, I would select Page for the Increment Type.
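In the JRXML, such a variable might be sketched like this ($F{amount} is a hypothetical numeric field being totaled; the variable name is also just an example):

```xml
<variable name="runningTotal" class="java.math.BigDecimal" resetType="Report" calculation="Sum">
    <variableExpression><![CDATA[$F{amount}]]></variableExpression>
</variable>
```

A text field in the Column Footer would then simply reference $V{runningTotal}.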

Page Footer:
This band repeats values at the bottom of every page. It is typically used for Page X of Y information and is a good place for the date. Both of these can simply be dragged into the band from the Tools section of the iReport Palette.

NOTE: To open the Palette, select ‘Window’ from the menu and then ‘Palette’, or press Ctrl + Shift + 8. The Paging and Date utilities are in the ‘Tools’ section.
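A common sketch for “Page X of Y” is two text fields, where the second uses evaluationTime="Report" so that $V{PAGE_NUMBER} resolves to the final page count instead of the current page (positions and sizes are placeholders):

```xml
<pageFooter>
    <band height="30">
        <!-- Evaluated immediately: the current page number -->
        <textField>
            <reportElement x="400" y="5" width="100" height="20"/>
            <textFieldExpression class="java.lang.String"><![CDATA["Page " + $V{PAGE_NUMBER} + " of"]]></textFieldExpression>
        </textField>
        <!-- Evaluated at the end of the report: the total page count -->
        <textField evaluationTime="Report">
            <reportElement x="500" y="5" width="50" height="20"/>
            <textFieldExpression class="java.lang.String"><![CDATA["" + $V{PAGE_NUMBER}]]></textFieldExpression>
        </textField>
    </band>
</pageFooter>
```

This is exactly what the Page X of Y tool in the Palette generates for you.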

Last Page Footer:
This band will print only on the last page of a report. Typically used for signaling the end of the report with a Static Text box.

Summary Band:
The main purpose of this band is to report totals for the entire report. It will only print on the last page. If you have been keeping track of running totals with variables, or simply want to print a final total for the report, this is where they should go.

NOTE:  Variables should have their Reset Type set at Report. Otherwise you may not have all the data in the variable.

No Data:
This band will only print when it is enabled and the report does not receive any data to process. To enable the band, go to the properties of the report (usually I just click on the report name at the top of the list on the left-hand side), find the section “When No Data”, and select “No Data Section” from the drop-down list.


Put in whatever Static Text you want into this band to signal that no data was sent to it. Good for debugging reports and sub-reports.
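In the JRXML, this setting corresponds to the whenNoDataType attribute on the report’s root element, paired with a noData band, roughly like this (the band height and message are placeholders):

```xml
<!-- On the report's root element: -->
<!-- <jasperReport whenNoDataType="NoDataSection"> -->
<noData>
    <band height="50">
        <staticText>
            <reportElement x="0" y="0" width="555" height="20"/>
            <text><![CDATA[No data was returned for this report.]]></text>
        </staticText>
    </band>
</noData>
```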

I hope this has given you a good overall feel for the different bands in JasperReports and their practical, real-world uses.

See you next time when I’ll be covering the most powerful, yet most misused and misunderstood feature of the engine: Sub-reporting. We are going several levels deep!

-Esteban Martinez

Senior Programmer at nextCoder

How To Build and Maintain Multiple Online Platforms

Posted by admin on May 14, 2014  /   Posted in BI and Custom Development, Business, Online Platform

The other day, out of curiosity, one of our clients asked me: “How are you guys able to build and maintain the quality on so many platforms simultaneously?”

That question prompted me to pause and think about this:

What are some of the things that make it possible to build and maintain multiple online platforms simultaneously without overwhelming our capacity?

As I started to write out the answer, I decided to go ahead and share two “ingredients” or “secret sauces” that work for us and enable us to serve multiple clients simultaneously without compromising on either quality or support.

So What’s The First Secret Sauce?

Just like in the movie “Kung-Fu Panda,” — don’t read the rest of this sentence if you haven’t seen the movie yet — the “secret” turns out to be a blank scroll.

No really, the reason we at nextCoder are able to crank out a lot of platforms is not a secret-sauce recipe; rather, it is our carefully (and painstakingly) selected and molded “ingredients” (think building blocks) that we continually evaluate, test for fit, improve, and build upon.

Because we are intentional about it, these building blocks are ready to be attached together to become the online platform that serves a particular industry domain. But in their disassembled state, they are just like “a blank scroll.”

See, back when dinosaurs roamed the earth — that’s the era of DLL-thunking and MS C++ vs Borland C++ vs Intel C++ to you, young programmers — the choices to make when developing an application were which compiler, which sorting algorithm, which expensive optimized libraries, even which memory manager.  Concepts such as Open Source code repositories, data sharding, virtualization, JIT compilation, UI templates, and Aspect programming had yet to become mainstream.

Today, we have the luxury to pick and choose from literally hundreds of ready-made “blocks of code” to build our platforms with.  Some are Open Source, some have to be purchased; some are as complex as a whole application or platform framework, some are packaged as neat little libraries, and others are a network of code snippets and functions.  Internet browsers (big or small, desktop or mobile) are everywhere and themselves serve as a platform that does the heavy lifting.

Therefore as platform builders, we can focus more on helping businesses map their needs through the use of data modeling, system configuration, deployment models (see Docker.io if you want to see an impressive technology) and of course: Business Intelligence and Data Analytics.

Ok, That’s The First One, Out with The Second

The second ingredient has to do with how we fit our client’s business model into the building blocks we just talked about.  Having done a lot of deployments across different business domains, we found that there are a lot of businesses out there whose data and information can fit the following model:

(Diagram: the nextCoder building blocks)

Basically, beyond infrastructure facilities such as security, users, configuration, email, role-specific dashboards, scheduling, a notification system, and full-text indexing — which come with every online platform worth its salt — we usually can organize the data in the business into one or more grouping hierarchies, which typically end with the main business object. What is this main business object? you may ask.

The main business object is typically tied to how the business serves its customers.  So for example, in the legal industry, the main business object would probably be legal cases; for a bookstore, the books; for a landscape-maintenance company, it’s probably the customer’s property; for a gymnasium, its members’ goals (lose weight, gain weight, get fit, etc.).  The main business object is what most people associate with what the system manages.

Then predictably, almost all of the businesses that we came across associate the main business object with the same concepts such as trackable / auditable notes, links and tags, custom fields (who does not need a handful of these…?) and the ability to sort, search, and filter under various contexts.

Speaking of contexts, all of us know that chronological context is used all over the place: “when” something happens, and what else happens “after that”, will always be the primary way in which we process information.  Thanks to recent advances in Geographic Information System (GIS) technology, geospatial data is fast becoming the next super-context, one that will find its place in many aspects of our lives.

Then, after all this is put together using the building blocks, we begin the implementation of the proprietary process owned / copyrighted / trademarked by the client’s business or by the client personally. We have been very blessed to work with innovative business owners whose brilliant processes are just waiting for a good home to be built.

This fits the definition of an Online Platform, where construction and enhancement never really end.  All of our platforms exhibit this characteristic.

In Summary

The combination of carefully selected building blocks and the right level of abstraction (at the business object level) gives us the ability to serve many clients simultaneously through our platforms.

This is not something that we take lightly; in fact, every process that we create and use takes this as one of its main goals, against which everything else is weighed.  Why is this so important? Because this is the only way we can extend the two biggest benefits to our clients:

  1. Continuous technical improvements: an improvement created for one platform can and will be replicated to the others
  2. Consistent quality and ease of maintenance, which translates into lower cost

Given that most of our platforms are priced according to usage, the growth of our clients’ businesses means growth for us as well. This is, to us, the prime motivation to go the distance towards excellence.

As I mentioned in my previous blog entry, we are living in the “golden hours” of Information Technology, where the efforts of many find their use in benefiting a much larger portion of the population than ever before.

We are the beneficiaries of many, many hours spent by dedicated and incredibly smart people. At nextCoder, we intend to pass along these benefits to help as many businesses as possible out there. In other words, it’s time for the “secret sauce” to be enjoyed by many, many more.

- Will Gunadi

Online Platforms: Building Shoulders To Stand On

Posted by admin on April 02, 2014  /   Posted in Business, Web Development


I have often heard that today we are living in the “golden hours” of the Information Era, where the advances are nothing short of incredible.  An example? For the first time in known history, we have become one global society.

The question now is — as those who surf would often ask — where is the next big wave?

We at nextCoder believe that the Online Platform sector (Software as a Service, or SaaS, is another name for it) has reached the condition where one of those big waves might happen.  And a lot of software development companies, as well as Business Intelligence and Data Analysis companies (including us), are gearing up and getting ready to ride this wave.

What Qualifies As An Online Platform?

The first thing to be clear about is that an Online Platform is not simply a website or a web application.  There are millions of websites out there going up and down, and there are millions of web applications developed for either private or public usage.  Obviously not all of these can be regarded as platforms.  So let’s take a look at what exactly separates a website or a web application from a real Online Platform:

  • Free or available at a lower cost for the users
  • Non-exclusive systems, the more users, the better it is
  • It is continually being developed and improved
  • Users produce content or other platforms within the platform

Who are these Online Platforms Users?

The users are the bread and butter of an Online Platform. Without its users the platform is … well, useless.  And just as with businesses, the users or customers can be divided into two broad categories: B2B and B2C.

Each of these categories has its own characteristics and determines how the Online Platform has to be designed, constructed, and maintained.

B2B Online Platform users use the system within the context of their respective companies.  Therefore such platforms are usually full of complex features, heavy on customizations, more robust and secure, and allow zero or limited interaction between users from different companies. Atlassian JIRA would be a good example of this.

B2C Online Platforms like Twitter, on the other hand, put a lot of demands on capacity (thousands to millions of users using the system), but not so much on complex transactions, workflow, and security.  At least not in the beginning; but since one of the characteristics of a platform is that it grows continually, eventually even a B2C platform can have complex workflow. Also, this does not mean that B2C platforms are insecure or not robust.

And then there are Online Platforms that deal with both Businesses and Customers at the same time. eBay would be a prime example of this kind of platform.

Online Platform Categories

Educational / Entertainment – These platforms collect, share, and serve content that is educational or entertaining (or both) in nature. These are typically B2C.

Social / Networking – I think everyone knows what these look like.

Transactional / Operational – What used to be the domain of custom and exclusively built applications now has started to be available as an Online Platform. This is possible thanks to the Internet which drives the acceptance of connectivity even among businesses, which are far more conservative than individuals in this regard. These are mainly B2B.

Why Does This Matter To Everyone?

Eventually, almost all of us will become a user of one or more Online Platforms. Why? Because the next generation of Online Platforms will be even more useful, connected, and convenient than what is available today. This should be regarded as an excellent opportunity especially for Business Owners, company Executives and Entrepreneurs.

For individual customers, the sheer convenience and quality of services will continue to climb up. The amount of information that we would know about each other will also increase. That can be either scary or good, depending on your perspective.

For Business Owners and company Executives, what does this mean? Online Platforms can be used to operate existing businesses at a much lower cost.  How so? Because the Online Platform builder can spread the cost among platform users.

What about competition? Why would a company use an Online Platform alongside its competitors? It depends on the nature of the business. A good Online Platform has to be built independently of the factors on which its users compete.

For example, a platform that automates operations, collects metrics, and performs data analysis would boost the service quality of its users while keeping the playing field level.  On the other hand, platforms that share users’ data without their consent, or provide unfair advantages to some users, are a big no-no and should be avoided at all cost.

For Entrepreneurs, Online Platforms can be used to generate new streams of revenue. Build platforms for existing businesses to use.  There are so many niche opportunities that depend more on who we know than on what we know.

What’s The Stats?

For those of us who love statistics, here is an interesting Google Trends plot of the level of interest in the term Online Platform over the years:

It should be noted that the trend is slightly increasing. That means awareness of the Online Platform concept (represented by the term) is rising among us.

So the next time I hear someone say “golden hours” of the Information Era, I now have an idea what some of it may look like.  Let’s ride the wave together.

– Will Gunadi

How to Build an Executable War or Jar File

Posted by admin on March 26, 2014  /   Posted in J2EE, Web Development


In the effort to bend-over-backwards for our clients, we sometimes have to take a step back and think outside of the box.

99% of our custom systems are deployed as web applications, whether hosted on our server or the client’s. But this particular client required a way to deploy the application on an isolated machine.  So we had to improvise a way to package a webserver *and* the application into something that can be downloaded and run without any complicated setup.

Sharing our Findings

So here is a simple way I found to turn my Maven webapp project into a self-executing jar file. I’m glad to share it for all to take advantage of. This has the obvious advantage of not having to set up a Tomcat server for each client that will use the application. And since executable .jar files are OS independent, you can use this whether you are a Windows or UNIX shop.

1) The client needs to have Java installed.
2) Your project must have a packaging type of pom or war.
3) Your web app should already be working and compile without errors.
4) This is only supported with the Tomcat7 plug-in.

Now go into your pom.xml file and add the plug-in with your other plug-ins.

Here’s the code for the Maven plug-in:

 <plugin>
   <groupId>org.apache.tomcat.maven</groupId>
   <artifactId>tomcat7-maven-plugin</artifactId>
   <version>2.1</version>
   <executions>
     <execution>
       <id>tomcat-exec-war</id>
       <goals>
         <goal>exec-war-only</goal>
       </goals>
       <phase>package</phase>
       <configuration>
         <!-- optional: only if you want to use a preconfigured server.xml file -->
         <!-- <serverXml>src/main/tomcatconf/server.xml</serverXml> -->
         <!-- optional: the defaults are exec-war and jar, but you can customize -->
         <attachArtifactClassifier>exec-war</attachArtifactClassifier>
         <attachArtifactClassifierType>jar</attachArtifactClassifierType>
       </configuration>
     </execution>
   </executions>
 </plugin>

This plug-in configuration is from the Apache site. I usually remove the server.xml option, and I set attachArtifactClassifier and attachArtifactClassifierType to ‘exec-war’ and ‘jar’ respectively. But it should work with all the optional tags removed.

After you have saved your pom.xml file, navigate your command line to the folder that contains the pom.xml file and run the following command:

mvn clean package -Prunnable-war

Things to note about this command:

  1. There is no space between the ‘-P’ argument and the name of the profile
  2. ‘runnable-war’ is just a generic profile name. Use the name of a profile that is in your pom.xml file. If you have more than one build profile, select the most appropriate one. I usually have ‘dev’, ‘test’, and ‘prod’ profiles, and I typically use ‘dev’.
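For reference, a minimal sketch of what such a profile could look like in the pom.xml (the profile id and plug-in placement here are just one possible arrangement):

```xml
<profiles>
    <profile>
        <id>runnable-war</id>
        <build>
            <plugins>
                <!-- The Tomcat7 Maven plug-in configuration shown earlier goes here -->
            </plugins>
        </build>
    </profile>
</profiles>
```

Activating the profile with -Prunnable-war then binds the plug-in’s packaging step to your build.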

Once the command has completed, 3 new files will be created.

  • ${yourapp-version-war-exec}.jar: a runnable JAR that contains the embedded Tomcat runtime
  • war.exec.manifest: a manifest file containing the main class to run
  • war-exec.properties: a properties file containing some Tomcat config options and info

NOTE: If your project has multiple modules, these 3 new files will be created in each module’s folder.
For example, my web apps typically have folders like:

 ---- my-project-common
 ---- my-project-engine
 ---- my-project-test
 ---- my-project-web

I create the jar with the parent pom.xml, but the 3 new files that I will use are generated where my web app is (‘my-project-web’).

So for the final step, go into the target folder of your web app (‘my-project-web’ in my case) and run the following command to start the Tomcat server:

java -jar ${yourapp-version-war-exec}.jar

Open up a browser and go to http://localhost:8080 and your app should be there.
First run is usually slow to start because of all the extraction that happens.

Now you can simply copy these three files to another computer that has Java and start your web app with the same command.

Esteban Martinez
Senior Developer

School Districts are Putting Up Dashboards

Posted by admin on March 13, 2014  /   Posted in BI and Custom Development

Think Dashboards are only for Executives and Businesses?

Well, think again.

Today, even school districts are doing it.  First, let me be clear: it should come as no surprise that organizations within the education sector, especially the ones focused on educating our children, take advantage of the information technologies available today.

What really surprised me is how they use the Dashboard. Have a look here:

The Plano Independent School District (PISD) put up a dashboard which compares them with the neighboring cities’ ISDs. The above image is just one of the elements on the Dashboard, by the way.  The rest of it looks just as impressive, as they seem to use all the bells and whistles available in the charting world (not necessarily the best approach, but at least it is out there serving some purpose).

But while the Dashboard is put together well, what impressed me the most is that the ISD gets it. They know that in order to get ahead of their peers, they not only have to perform, but also take the time to showcase their achievements.

This realization puts them — quite sadly — ahead of how most business owners think today.

Where does the data come from?

Or more importantly, how hard or how easy is it to put up the Dashboard, given limited resources and information?  And this is what makes the situation even more ironic. We are living in the golden era of publicly accessible information. What is not available from within the organization can easily be obtained either through some governing bodies or from data collected independently.

And if a business owner still thinks that it is impossible to get data about their competitors, customers, vendors/suppliers, or anything else, that’s an outdated view and should be discarded.

What is it really for?

To make it easier for interested parties to make decisions. In this particular case, the interested parties are parents of students, or young people who are looking for a place to move to or to start their lives.  By utilizing the Dashboard to serve relevant information in various dimensions, the Plano ISD has made it that much easier for people to consider their options.

The same case can be made strongly for almost every business out there. When people are considering your products and business, the more ways you can showcase your distinguishing features, the more you can grab the minds of interested parties (read: potential customers).

Go Publish!

With today’s technologies, and the accessibility of potentially relevant external data — for example the General Social Survey, the Gapminder site, and many more — it is a crime not to have your own Business Dashboards. As the ISDs have shown, presenting relevant, up-to-date information is not a one-shot deal; it is a continuous, planned, and deliberate program that should be included in your day-to-day business operations.

Business Dashboards – The Next Generation

Posted by admin on February 13, 2014  /   Posted in BI and Custom Development, Business, Data Best Practices

Dashboards Today

Unfortunately, most vendors that provide dashboard software have done little to encourage the effective use of this medium. They focus their marketing efforts on flash and dazzle that subvert the goals of clear communication.

They fight to win our interest by maximizing sizzle, highlighting flashy display mechanisms that appeal to our desire to be entertained. Once implemented, however, these cute displays lose their spark in a matter of days and become just plain annoying.

An effective dashboard is the product not of cute gauges, meters, and traffic lights, but rather of informed design: more science than art, more simplicity than dazzle. It is, above all else, about communication.

- Stephen Few, Information Dashboard Design, O’Reilly, 2006

Amazingly, this comment is still pretty much true even today. But it doesn’t have to be. The next generation of effective dashboards is closer to the mixing panel of a music studio than the plain, static dashboards we see today. And we at nextCoder are going to make sure that you business leaders are well informed about it.

Recap: What is a Business Dashboard?

Nowadays you see fancy dashboards in cars that almost rival a desktop computer. But what does the most basic dashboard in your car actually do? It tells you, the driver, in real time, at least these four pieces of information:

  1. How fast you are going (the speedometer)
  2. How far you can go (tank full of gas or almost empty)
  3. Alerts about pending breakage (weak battery, engine oil empty, engine needs checking)
  4. Alerts about what is going on (turn signal, headlamp indicator)

Running a business, just like driving a car, requires your full attention in real time. But unfortunately, a business does not come with a built-in dashboard the way cars do.  Either you have to build one yourself or have someone build it for you.

And just like a car dashboard that is connected at all times to the sensors that feed it with data, a Business Dashboard is connected to the data sources located either within your business or coming from outside sources.

The information that you receive from a Business Dashboard is actually pretty similar:

  1. How fast is the business growing? (is it growing in the right direction?)
  2. How long can the business last given the current situation?
  3. Are we paying too much to our vendors/suppliers?
  4. Are we serving our customers the best we could?
  5. And many, many more

But … believe it or not, as useful as knowing all of these is, that list merely covers the basic usage of a Business Dashboard. If we stop at this level, we miss the full potential of what the next generation of dashboards can do for us.

What is brewing?

Thanks to the ever-increasing popularity of game-changing web technologies such as jQuery, HTML5/CSS and the growing list of charting and/or visualization libraries, we enjoy the power of data visualization unlike any other computing era before.

The new way to develop dashboards allows us to produce dashboards that will rival web-applications in terms of information flow. Gone are the days of static, read-only dashboards. Say “Hello” to the new generation that takes interaction to a whole new level:

  • Built-in mapping: With the numerous mapping and geocoding APIs available for us to use, it is unthinkable for a dashboard to be without one. Most businesses could benefit from geographically mapped data. Imagine being able to visually see where your products are being purchased, or your technicians on their service routes, or your suppliers, so you can optimize material or component shipments.  And many more uses.
  • User Inputs: Ever seen a dashboard that is not read-only? If not, you are in good company. More and more executives, managers, and customer support personnel ask to be able to punch in data in real time. Why should they have to switch to another application to do that? We at nextCoder agree.
  • User-specific Business Rules: Let’s face it, a business owner or CEO has his or her own “rules” for determining whether the business is doing okay or floundering.  These “rules” are not for everyone to see, and for a very good reason: panic prevention (just kidding… a little bit). But the fact remains: if a dashboard cannot even contain user-specific rules, then we are shortchanging the users. Plain and simple.
  • External Data: A lot of smaller businesses assume that just because their data volume is not gigantic, they have no use for aggregated data analysis. This was true in the past, but not anymore. Today, there are volumes of data about any business that do not originate within the business itself, but rather in the vast social-media network. You could be surprised at what your customers broadcast about your product or service to their friends and family, good and bad. And for the sake of your business, monitoring it is a good course of action.

These are just the tip of the iceberg when it comes to what a Business Dashboard is capable of serving with current technology. Our goal is to deliver these features to our clients with each dashboard we build for them. In the next blog entries, we’ll have a peek at how to do just that.

Pentaho 5 CE Hands-on Review

Posted by admin on January 16, 2014  /   Posted in BI and Custom Development, Data Best Practices

One of the most exciting software releases toward the end of 2013 was Pentaho 5.0 CE (Community Edition), which was rolled out on November 18th, 2013. While the EE (Enterprise Edition) was released a couple of months prior, the CE version has always been my favorite, both to work with and especially for being part of the community, which is always full of new (and wonderful) ideas and actually has the brain power to realize them.  Truly one of the most interesting Open Source communities.

As a BI Consultant, I had several requests to review this new release, so without further ado, let’s take a look.

As usual, I get the zip files for each Pentaho BI Suite component from here. This is what my folder looks like when I was done downloading:

  • pad-ce-5.0.1-stable.zip – Pentaho Aggregation Designer (missing as of the time of this review, no idea where it went)
  • psw-ce-3.6.1.zip – Pentaho Schema Workbench
  • biserver-ce-5.0.1-stable.zip – Pentaho BI Server
  • pdi-ce-5.0.1-stable.zip – Pentaho Data Integration (Kettle and Spoon)
  • pme-ce-5.0.1-stable.zip – Pentaho Metadata Editor
  • prd-ce-5.0.1-stable.zip – Pentaho Report Designer

Unzipping any of these zip files will “install” the component. Simple as that.
I haven’t had the time to look at PAD or PME, so we’ll review those in the future. For now let’s start with PRD.

The new Pentaho Report Designer has a very convenient and useful item in the wizard that you see when you start the report-designer.sh (or .bat on Windows) script. It’s called “What’s New”.

It’s basically a report that we can preview, and it lists all the new features in this 5.0 release; very handy for reading about the improvements. What especially piqued my interest is that they seem to have improved the creation of interactive HTML reports, which can now serve links to other reports within Pentaho.  Maybe a new way to serve content that is somewhere between reports, dashboards, and wizard pages.

The popular Kettle (or Spoon, or PDI) increases its number of steps, including one that I have been waiting for: OpenERP Input and Output. Speaking of OpenERP, I need to contribute the custom OpenERP step that we developed last year back to the community.

I’m also eager to try out the MongoDB steps, as I have started using MongoDB in our projects.  I’ll have more to say about these two wonderful tools in upcoming articles; they are big enough to deserve their own reviews.

Pentaho BI Server

But the biggest changes are truly visible in the Pentaho BI Server itself. After unzipping biserver-ce-5.0.1-stable.zip, dive into the biserver-ce directory and issue ./start-pentaho.sh if you are on UNIX or start-pentaho.bat if you are on Windows.

By starting the BI Server from this location, the startup scripts already set the memory allocation and other environment parameters to more reasonable values than the defaults that come with Apache Tomcat.

After starting the server, wait for a while; or, if you are familiar with Tomcat’s logging features, on UNIX do:

tail -f biserver-ce/tomcat/logs/catalina.out

This will let you see whether the server started correctly or failed with errors. On Windows, use the Tomcat Start/Shutdown application to see the logs. When the log file stops scrolling, bring up a browser (on the same computer) and try to hit the Tomcat server by entering http://localhost:8080/pentaho, if you use the default settings. You should see:


Yes, that’s what our version of the Pentaho User Console (PUC) looks like after a couple of customization steps:

  • Change the login image:
    – user@server:~/pentaho5/biserver-ce/pentaho-solutions/system/common-ui/resources/themes/crystal/images$ mv ~/your-own-similarly-sized-image.jpg ./login-crystal-bg.jpeg
  • Change the Pentaho logo to the nextCoder logo:
    – user@server:~/pentaho5/biserver-ce/pentaho-solutions/system/common-ui/resources/themes/images$ mv ~/your_logo.png puc-login-logo.png
  • Change the wording of the login page:
    – user@server:~/pentaho5/biserver-ce$ vi tomcat/webapps/pentaho/jsp/PUCLogin.jsp
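As a precaution, it’s wise to keep a copy of each stock theme file before overwriting it, so the default Pentaho look can be restored later. Below is a minimal shell sketch of that backup-then-replace pattern; the function name and the scratch files are made up for demonstration, and a real run would use the theme paths shown above.

```shell
# Sketch: back up a theme file before overwriting it, so the stock
# Pentaho look can be restored later. Adjust paths to your install.
backup_and_replace() {
    target="$1"   # file shipped with Pentaho, e.g. puc-login-logo.png
    new="$2"      # your replacement image
    if [ -f "$target" ]; then
        cp "$target" "$target.orig"   # keep the original
    fi
    cp "$new" "$target"
}

# demonstration on throwaway files in a scratch directory
mkdir -p /tmp/puc-demo && cd /tmp/puc-demo
echo "stock logo" > puc-login-logo.png
echo "our logo"   > your_logo.png
backup_and_replace puc-login-logo.png your_logo.png
cat puc-login-logo.png        # now "our logo"
cat puc-login-logo.png.orig   # original preserved
```

The same pattern applies to login-crystal-bg.jpeg and PUCLogin.jsp; a Pentaho upgrade may overwrite these files, so keep your customizations somewhere version-controlled as well.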

Gone is the usual ‘joe’ user, replaced by ‘admin’ with the same default password, ‘password’. Use these to log in and you’ll be greeted by the Home screen:


Again, with some modifications, you can tailor the Home screen to suit your purposes. In this case the customization step is:

  • Change the content of Home
    – beruin@yamato:~/pentaho5$ vi ./biserver-ce/tomcat/webapps/pentaho/mantle/home/content/welcome/index.html

If you notice, gone are the compartmentalized panes of the old PUC, replaced by a much better-flowing (plenty of white space), minimalistic layout.

Another paradigm switch is the central navigation (it says ‘Home’ in the above screenshot). When you click on it, a dropdown displays the available modes.  The Home mode is what you see above; next is the Browse Files mode, which looks like this:


This is another departure: from the file-based pentaho-solutions repository to a JCR-based one. What is JCR? The Java Content Repository is a specification for a database-backed content (file) repository, implemented by, among others, the Apache Jackrabbit project, which is the one used here by Pentaho.

What does this all mean to users? A database-backed repository has its advantages in terms of better control of metadata and versioning of files, without sacrificing ease of use. But it also means that we have to use a plugin if we want to synchronize this repository with a file system so we can version-control our files on our own.  It remains to be seen whether this switch will bear fruit down the line.

One question for the Pentaho team: why can’t I select multiple files and perform actions on them?

Speaking of plugins, which are a source of productivity on the platform, the next mode we’ll talk about is the Marketplace mode:


In version 4.8 and before, we had to install plugins such as Saiku, CDE, CDA, CDF, etc. manually, either by using the ctool-install.sh script or by unzipping files into the right folders and hoping it would work.

The new Marketplace mode provides a more organized way to manage plugins and their versions.  Although you still have to restart the server manually after installing or upgrading these plugins, it is still miles ahead. More importantly, just a month or two after release we started to see plugins written by developers outside of Pentaho, which is wonderful and in line with the spirit of the community.

Next up is the Opened mode, which retains all of the files we are working on (whether editing or just viewing).  This mode is somewhat similar to the new Microsoft Office paradigm (starting with Office 2010).

The Scheduled mode is an improved user interface for scheduling ETL runs:


A newly introduced feature is the ability to define block-out times, within which scheduled ETLs will not run.  This is useful for scheduled downtime or maintenance of the host servers.

The last mode is the Administration -mode:


This is the answer to “Where is PAC?” The old Pentaho Administration Console is gone; it is now reborn as this mode. I can’t tell you how many times (with the previous version) I received raised eyebrows or dumbfounded looks when I had to explain that you have to run another server just to create a new user or assign roles. This is definitely a very welcome improvement!

Now, how about some real work? The plugins take center stage as Pentaho CE matures into a real platform. Old favorites like CDE:


CDE is improved, with the much more professional-looking “Crystal” theme as the default. You can still switch to the old “Onyx” theme if you like.

Another good tool, Saiku Analytics, has also thankfully returned:

The charting ability of Saiku Analytics has been improved tremendously. I almost couldn’t believe my eyes when I saw the various charts glide along, visualizing the data effortlessly.

A promising newcomer among the analytics tools, Pivot4J, is also available to install through the Marketplace mode:


Pivot4J has one thing that has been missing in all of the Pentaho analytic tools: the ability to render aggregates in the last row or column.  You have no idea how many times my clients have asked for this little feature.  Yes, business people love their totals; those help them make better decisions.  So good job, Pivot4J team!

Are there any negatives? Yes: the charting in Pivot4J is not intuitive to me. Take a look at the above screenshot; you see four columns. When you click the control that generates the bar-chart representation of the table, what would you expect? I expected one bar chart, with four bars each representing a column.  What did Pivot4J give me? Four bar charts. Why? And I don’t see any way to merge them or change them in any way.


In summary, I couldn’t be happier with this new 5.0 release of Pentaho CE. There are enough new features here to warrant companies considering an upgrade of their Data Warehouses.  The most exciting trend for me is the third-party plugins that are starting to become available through the Marketplace.  This can signal real growth in the quantity and quality of what is already one of the most useful BI suites on the market.

So to Mr. Pedro Alves and his team: big kudos, thank you, and good job. 2014 is looking like another stellar year for Open Source BI, starting with Pentaho 5.0 CE.

What You Should Expect from Your Data Warehouse

Posted by admin on October 01, 2013  /   Posted in BI and Custom Development, Business, Data Best Practices

When it comes to the benefits of having a Data Warehouse (DW from here on), most of our clients are already cognizant of its critical role as part of a business system.  What many still don’t realize is that there are good DWs and not-so-good ones.  Of course everyone is aware that both exist (good vs. not so useful), but not everyone takes the time to think about how to distinguish between the two.

A Good Data Warehouse Goes (Far) Beyond a Database

The number one question I get when I talk to anyone about the services we provide to our customers is: what is the difference between a Database and a Data Warehouse?  A lot of people associate a data warehouse with storing data (just like a physical warehouse), and that’s about it.

Truth is, a good DW serves the business in at least five areas in addition to hosting business data:

  1. Automatically applies business rules
  2. Automatically generates reports (both on-demand and periodic)
  3. Automatically notifies parties of data events of interest
  4. Allows data owners to look into their data in multiple dimensions
  5. Allows for data error detection, investigation, and correction

When you are building a DW, the above functions must be present in the planning, implementation, and testing phases.  Otherwise, you’d be left with basically just a database.

A Good Data Warehouse Grows into Maturity

In the next blog entry, I’ll discuss the different levels of DW maturity in detail. There is a lot of misconception out there regarding how to assess the reliability and usefulness of a DW. Most people expect a DW to be mature at the end of the initial implementation project. This is not realistic given the typical complexity of the metadata, data, and business rules surrounding it. The correct way to proceed with a DW-building project is to realize that a DW goes through certain phases that bring it along the path to maturity.

A Good Data Warehouse Is Redundant

An experienced BI consultant would build one or more copies of the DW that can be switched at any given time.  Not only does this provide a good development and testing environment for analytic outputs (reports, dashboards, ETLs), but it also gives you a standby copy to switch over to if the primary DW goes down.

A Good Data Warehouse Highlights Data Problems

Due to its proximity to the people who do the accounting and forecasting in the organization, a good DW is usually the first place where these people notice data inconsistencies.   A good way to take advantage of this characteristic is to put in place monitoring ETLs that run periodically, executing queries designed to check the consistency of the data.
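As an illustration of what such a monitoring ETL might do, here is a minimal shell sketch that compares a revenue total in a DW extract against the total reported by the source system and writes an alert on mismatch. The file names, format, and figures are all hypothetical; a real check would query the DW and the transactional system directly.

```shell
# Sketch of a periodic consistency check: compare the revenue total in
# a DW extract against the total from the source system, and flag any
# mismatch. Stand-in CSV files (id,amount) are generated here so the
# sketch is self-contained.
mkdir -p /tmp/dw-check && cd /tmp/dw-check
printf '1,100\n2,250\n3,75\n' > dw_revenue.csv
printf '1,100\n2,250\n3,80\n' > source_revenue.csv

# sum the amount column of each extract
dw_total=$(awk -F, '{s+=$2} END {print s}' dw_revenue.csv)
src_total=$(awk -F, '{s+=$2} END {print s}' source_revenue.csv)

if [ "$dw_total" -ne "$src_total" ]; then
    echo "MISMATCH: DW=$dw_total source=$src_total" > alert.txt
    # a real monitoring ETL would e-mail this alert
    # or write it to a monitoring table
fi
cat alert.txt   # MISMATCH: DW=425 source=430
```

Scheduled nightly, a handful of checks like this catch drift between the DW and its sources before the decision makers do.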

A Good Data Warehouse Reduces the Load on the Transactional System

A lot of the systems that maintain transactional data input from users are inundated with processes that shouldn’t belong there in the first place.  These processes typically check data, apply business rules, and monitor data validity, which takes resources away from what transactional systems should mainly be about: collecting data and preserving its integrity.

Now the follow-up question is: how do we know which parts of the transactional system should be offloaded to the DW?

A good rule of thumb is: whenever a process can be executed against the data in batch mode, that is an indication it should be taken off the transactional system and moved to the DW. And by the way, it is not taboo for a DW to pump data back into the transactional system when a valid use case can be made.

In Summary
There are many other ways to assess whether a DW is “good” or “not good enough,” but all of them revolve around the relationship between the DW and the existing transactional systems on one side of the coin and the decision-maker team on the other. A good DW should always bring clarity and bridge the communication barriers caused by the incongruent views of data held by different parts of the business. Hopefully this gets you started along the correct line of thinking as you embark on your DW projects.

ZK + TimelineJS + JQuery = Visualizing Business Data Along The Time Dimension : Part Two

Posted by admin on August 13, 2013  /   Posted in BI and Custom Development, Business, Web Development, ZK

by Cancan Gunadi

In our previous blog post, Visualizing Business Data Along The Time Dimension, we showed one way to visualize business data along the time dimension in a meaningful and engaging way.

In this post, we’ll show you the technical details behind the implementation. We used TimelineJS, an interactive web-based timeline tool, and integrated it into the system that we’re building for our client. The system itself is built using the ZK Framework.

First, in case you haven’t already, you can visit the actual live demo of this timeline on our server here (log on using the provided credentials).

We assume some familiarity with the ZK Framework and TimelineJS libraries; we recommend visiting their respective documentation.


The timeline area is contained inside a ZK iframe tag inside one of the ZK .zul files, like so:

<window id="mainWin" apply="${mainCtrl}">
    <iframe id="iframe" src="jsp/timeline.jsp" />
</window>

The iframe loads a JSP file which contains the TimelineJS script, shown below. Some details of the JSP and zul files have been removed for the sake of simplicity.

<%-- normal JSP imports here... --%>

   <!-- jQuery -->
   <script type="text/javascript" src="../timelinelib/jquery-min.js"></script>
   <!-- BEGIN TimelineJS -->
   <script type="text/javascript" src="../timelinejs/storyjs-embed.js"></script>

   <script type="text/javascript">
       $(document).ready(function() {
       <% if (docs != null && docs.size() > 0) { %>
           var dataObject = {
               "timeline": {
                   "type": "default",
                   "headline": "Progress Info",
                   "text": "<%= user.getFullName() %>",
                   <% if (startDate != null) { %>
                   "startDate": "<%= startDate %>",
                   <% } %>
                   "date": [
                   <% for (Document doc : docs) { %>
                       {
                           "startDate": "<%= doc.getDate() %>",
                           "headline": "<a class='timeline-doc-link' " +
                                           "doc_id='<%= doc.getId() %>' " +
                                           "href='#'><%= doc.getTitle() %></a>",
                           "text": "<%= doc.getText() %>"
                       },
                   <% } %>
                   ]
               }
           };

           // hand the data over to TimelineJS (storyjs-embed.js)
           createStoryJS({
               type:       'timeline',
               source:     dataObject,
               embed_id:   'timelineContainer',
               font:       'Arvo-PTSans',
               start_at_end: true
           });

           // div with id "timelineContainer" is the div that wraps
           // around the entire timeline area, defined in the <body>
           // section.
           $('div#timelineContainer').click(function(event) {
               var clickedElem =
                   document.elementFromPoint(event.pageX, event.pageY);
               if (clickedElem.nodeName == 'A'
                       && clickedElem.className == 'timeline-doc-link') {
                   // parent is [object Window] according to firebug
                   parent.zAu.send(
                       new parent.zk.Event(parent.zk.Widget.$('$mainWin'),
                           'onDocRequestedFromIframe',
                           clickedElem.getAttribute('doc_id')));
               }
               return event;
           });
       <% } else { // if no item to be shown ... %>
           $('div#timelineContainer').html(
               '<span style="font-size:75%;color:grey;">' +
               'No progress information submitted yet.</span>');
       <% } %>
       });
   </script>
   <!-- END TimelineJS -->

   <div id="timelineContainer"></div>

The bulk of the JSP is the TimelineJS script, which is generated dynamically, using the JSP scriptlet for-loop to insert the available items (or “documents” in this case) into the timeline.

Making the Timeline More Useful

To make the timeline more useful, hyperlinks were added to the timeline, which can be clicked to open new ZK Tabs containing the detail of the timeline item. To achieve this, the javascript code in the timeline script needs to be able to send an event to the enclosing ZK container so the ZK application can respond and open a new ZK Tab.

The <a> element is tagged with a CSS class “timeline-doc-link” and a custom attribute “doc_id” which contains the id of the document.

Next, jQuery code is used to add a click handler to the div element with the id “timelineContainer”, which is a wrapper around the timeline (the div is defined in the <body> of the HTML).

The click handler retrieves the element which received the click and checks whether it’s really the anchor element with the special CSS class.

The real key to enabling the interaction between the timeline and the rest of the application is a very useful ZK feature for sending events to a parent element. Using the “parent.zAu.send” method, we send an event (called “onDocRequestedFromIframe”) to the parent ZK component “mainWin” (refer to the ZUL snippet shown earlier). From there, the ZK main controller (Java code) attached to the main Window component can respond by opening a new ZK Tab to show the requested document.

The result is a nicely integrated timeline which can interact with the rest of the ZK application via ZK events, as shown in the following figure. Even though the timeline script resides within an iframe, this approach allows the timeline script to invoke Tabs, Menus, and other components within the containing ZK application.

Figure 1 – Seamless Integration of TimelineJS into ZK Application

In a Nutshell

We have shown one of the ways to visualize business data along the time dimension in a meaningful and engaging way, and how to implement it using the ZK Framework, TimelineJS, and a little bit of jQuery and JavaScript code. We have also shown how to enable seamless interaction between the TimelineJS script and the containing ZK application using ZK events.

We hope this post can be beneficial for anyone who may be looking to implement similar functionality.

Please also check out our main blog page where you can find useful information regarding Data Management, Business Intelligence, and more (click here).

Visualizing Business Data Along The Time Dimension

Posted by admin on June 18, 2013  /   Posted in BI and Custom Development, Business, Web Development, ZK

by Cancan Gunadi

From time to time, businesses are faced with the need to visualize their business data along the time dimension. It is very valuable for a business to have a way to visualize information such as milestones or key events in a meaningful way. Such a visualization answers questions like “When was the last Sales Milestone achieved?”, “How do I see the progress of XYZ?”, or “What was the last step completed by a training class participant, and when?”

One way we have found to be both effective and engaging is to use a “timeline”. A timeline can be described as a sequence of related events arranged in chronological order and displayed along a line (usually drawn left to right or top to bottom). Data can be aggregated through a nightly ETL job, and key events made available to be plotted on the timeline by the same job.
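As a concrete illustration, such a nightly aggregation job could be scheduled with cron; with Pentaho Data Integration (which we use elsewhere), the command-line job runner is kitchen.sh. The paths and job name below are placeholders, not taken from the actual system:

```
# Hypothetical crontab entry: run the timeline ETL job nightly at 2 AM.
0 2 * * * /opt/pentaho/pdi/kitchen.sh -file=/etl/jobs/timeline_events.kjb >> /var/log/etl/timeline.log 2>&1
```

The job would write the aggregated key events to a table that the timeline page reads when it renders.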


In our implementation, we used TimelineJS, an interactive web-based timeline tool, and integrated it into the system that we’re building for our client. The system itself is built using the ZK Framework. I will write more about the technical details of how this is done in my next blog post.

Here’s an example of what the timeline visualization looks like. Please visit the actual live demo on our server here (log on using the provided credentials).


In this mock-up example, we show the sales milestones on the timeline. The timeline is interactive: the user can slide the timeline bar (the bottom half of the timeline widget) with the mouse to the left or right, or even swipe with a finger on a touch screen. Clicking on each “flag/event” on the timeline brings up a related bar chart above the timeline bar. If the user needs more detail, he can click on the hyperlink next to the bar chart to open a new tab with all the details regarding the milestone (see my next blog post for interesting technical detail about the interaction between TimelineJS, the containing HTML inline frame, and the ZK Framework). The user can also navigate to the previous milestone by using the arrow button to the left of the bar chart.

In a Nutshell

We have shown one of the ways to visualize business data along the time dimension in a meaningful and engaging way. Using this timeline, businesses can quickly see when a key event occurred in relation to other/previous key events.

Stay tuned for my next blog post, where I will describe the technical detail on how to implement this timeline visualization using the TimelineJS, and ZK Framework.

Please also check out our main blog page, where you can find useful information regarding Data Management, Business Intelligence, and more (click here).
