
Wednesday, 03 July 2013 06:32

Model Led Development for Greenhouse Gas Reporting

Written by Nya Alison Murray

Business situation

 

As companies seek to comply with greenhouse gas emissions reporting regulations, they face a clear challenge: avoiding the yearly pain of intensive manual information gathering from multiple sources, including spreadsheets, databases, meters and statistical report output. 

A UML model can be used to semi-automate cloud hosted software services. It is possible to set up an automated reporting system once, so that the information required for CO2e emissions reporting by energy source is at hand throughout the year. 

Most of the software services depend on having a robust, tried and tested information model. The time saved by automated generation from the UML model is huge, making the development of the reporting capability a lot simpler and more cost-effective. 

View an HTML version of the UML Energy Reporting Domain Model


Figure 1: Energy Reporting Domain Model - top level

Technical situation

The cost of manual and semi-automated aggregation of greenhouse gas emissions data is high, because it requires a lot of human resources to process. 

This type of activity is error prone, and one reason for the lower-than-expected price of CO2e emissions is that markets dislike inaccuracy. (CO2e stands for CO2 equivalent: a calculation of the warming effect of the full suite of greenhouse gases (methane, nitrous oxide, etc.), standardized against CO2. CO2e is generally priced per metric tonne.) 
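As a quick worked example of the CO2e arithmetic (not part of the case study itself), the sketch below multiplies each gas's emissions by an assumed 100-year global warming potential (GWP). The GWP figures shown are the IPCC AR4 values; a given reporting regime may mandate different factors.

```python
# Illustrative CO2e calculation: each gas's mass is multiplied by its
# 100-year global warming potential (GWP) relative to CO2.
# GWP values below are IPCC AR4 figures; regulations may specify others.
GWP = {"co2": 1, "ch4": 25, "n2o": 298}

def co2e_tonnes(emissions_tonnes):
    """Convert per-gas emissions (in tonnes) to a single CO2-equivalent figure."""
    return sum(GWP[gas] * tonnes for gas, tonnes in emissions_tonnes.items())

# Example: 100 t CO2, 2 t methane, 0.5 t nitrous oxide
total = co2e_tonnes({"co2": 100, "ch4": 2, "n2o": 0.5})
print(total)  # 100 + 2*25 + 0.5*298 = 299.0 tonnes CO2e
```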

Solution

The setup and configuration of the data interfaces is always going to be the biggest problem to solve. A UML information model can ensure the desired results through a back-tracking problem-solving methodology. 

The Energy Reporting Model was developed from the specifications published for the Australian government's National Greenhouse Emissions Reporting and Energy Efficiency Opportunities reporting requirements. 

The following steps describe the building of a web reporting facility directly from the model, using the Sparx EA code generation facility for databases. 

1. Build a UML model of the required reports. 

2. Generate a data definition language (DDL) script directly from the model. 

3. Build the target database using the script and deploy it to a database instance on the cloud (I used Postgres database services hosted on the Amazon cloud). 

4. Analyse the current data collections and define the interfaces required to populate the reporting database from input data sources (the example uses spreadsheets and databases, and Postgres SQL to import the data). 

5. Develop the SQL to generate the data for the required reporting staging tables from the input sources. 

6. Use additional cloud hosted services; I used two: 

a. RunMyProcess hosted workflow/automation to build data collection user interfaces for manual input of some of the energy data, written to the Amazon cloud database. 

b. Zoho Reports provided the detailed reports, accessing the Postgres data from Amazon and updated using a Zoho scheduling capability. (Zoho has an excellent reporting capability, in my view the best of their hosted offerings, and I used the cloud hosted offering to build the final tables, graphs, and drill-downs.) 
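The database side of the middle steps can be sketched in a few lines. This is an illustrative stand-in only: the actual solution ran Sparx EA generated DDL against a Postgres instance on Amazon, whereas this sketch uses Python's built-in SQLite, hand-written DDL, and invented table and column names, purely to show the shape of the pipeline (create the schema, load staging data from input sources, roll it up into a reporting table).

```python
import sqlite3

# Stand-in for the Sparx EA generated DDL (the real script targets Postgres).
DDL = """
CREATE TABLE staging_energy_use (
    site TEXT, source TEXT, period TEXT, kwh REAL
);
CREATE TABLE report_energy_by_source (
    source TEXT, period TEXT, total_kwh REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)  # build the target database from the generated script

# Rows as they might arrive from spreadsheets or source databases.
rows = [
    ("Plant A", "electricity", "2013-Q1", 1200.0),
    ("Plant B", "electricity", "2013-Q1", 800.0),
    ("Plant A", "gas",         "2013-Q1", 300.0),
]
conn.executemany("INSERT INTO staging_energy_use VALUES (?, ?, ?, ?)", rows)

# The SQL that rolls staging data up into the reporting table.
conn.execute("""
    INSERT INTO report_energy_by_source
    SELECT source, period, SUM(kwh)
    FROM staging_energy_use
    GROUP BY source, period
""")
for row in conn.execute("SELECT * FROM report_energy_by_source ORDER BY source"):
    print(row)  # ('electricity', '2013-Q1', 2000.0) then ('gas', '2013-Q1', 300.0)
```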

 

Benefits

Experts have found that simply monitoring energy usage significantly reduces carbon footprint at very low cost compared with other methods, and consequently lowers the price organisations have to pay for emitting greenhouse gases.

In terms of lowering the cost of reporting, it makes sense to automate processes that have to be repeated annually to comply with legislation and reporting regulations. Payback for automation can be immediate, as the ICT costs are quite modest. 

In the next iteration of the GHG Reporting, I am building real-time access to metered data collections, accessed from the RunMyProcess cloud. Sparx EA generated code (e.g. Java, PHP or C++ as required) will form the basis for aggregating metered data into electricity time-of-use daily, weekly, monthly and annual summaries, updating the database on a predefined schedule for access-controlled web reporting. I am also designing a simple mobile app to provide graphical, location-based energy usage alerts. 
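The kind of aggregation described above (interval meter readings rolled up into periodic summaries) can be illustrated with a minimal sketch. The readings, timestamps and function name here are invented for illustration; the real solution would use Sparx EA generated code against the cloud database.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical half-hourly meter readings: (ISO timestamp, kWh) pairs.
readings = [
    ("2013-07-01T08:00", 1.2),
    ("2013-07-01T08:30", 1.5),
    ("2013-07-02T09:00", 0.9),
]

def daily_totals(readings):
    """Roll metered interval data up into per-day kWh totals."""
    totals = defaultdict(float)
    for ts, kwh in readings:
        day = datetime.fromisoformat(ts).date().isoformat()
        totals[day] += kwh
    return dict(totals)

print(daily_totals(readings))
# {'2013-07-01': 2.7, '2013-07-02': 0.9}
```

The same grouping approach extends to weekly, monthly and annual summaries by changing the key derived from the timestamp.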

Sparx EA 

The greenhouse gas reporting solution used Sparx Enterprise Architect to build the reporting model and to generate the DDL scripts that build the Postgres database. I was able to modify the model many times, then regenerate the databases automatically to reflect changes as I improved the model. This was a huge saving of time and effort, and is one of the strengths of Sparx EA for building solutions. 

Other Software and Services 

The Sparx EA scripts generated the Postgres databases with very little additional coding required. This was very useful, as Postgres has a strong cloud presence and is supported by all the other cloud hosted software services I needed to build the reports. I used Amazon as it was easier to set up than the HP offering, which is geared to large installations. The RunMyProcess and Cloudbees cloud hosted software offerings are easy to use, and of course offer JDBC connectors to Postgres. 

 

Conclusion

Model led development is a very successful method of building ICT reporting systems. Once the model is stable and the data is aggregated into a standard RDBMS, the same model can be used to generate starter code for building specific interfaces to a variety of data sources and formats, in any of the common coding languages supported by Sparx EA. I am looking forward to further automating the integration of energy data interfaces using cloud hosted software services. 

A worked example using the methodology outlined in this case study used Microsoft Excel spreadsheets and Postgres databases as input data sources, mapped to a Postgres Energy Reporting database scripted using EA’s code generation capabilities. 

 

Last modified on Friday, 23 August 2013 03:27

Nya Alison Murray

Trac-Car (ICT Architect)
 
Twenty-five years in the Information Technology industry: 15 years as a developer and designer, 10 years as a solutions, information, and enterprise architect. Trac-Car provides information architecture consultancy and develops enterprise models for information dissemination. Global Carbon Data is an offshoot of Trac-Car: a cloud hosted knowledge base curating climate change information using web search technology. Global Carbon Data provides model-based carbon emissions monitoring data for the public, and analyses carbon footprints for interested parties on a subscription basis. 

Website: www.trac-car.com