Nya Alison Murray

Trac-Car (ICT Architect)
Twenty-five years in the Information Technology industry: 15 years as a developer and designer, 10 years as a solutions, information and enterprise architect. Trac-Car provides information architecture consultancy and develops enterprise models for information dissemination. Global Carbon Data, an offshoot of Trac-Car, is a cloud hosted knowledge base curating climate change information using web search technology. It provides model-based carbon emissions monitoring data to the public, and analyses carbon footprints for interested parties on a subscription basis.

Journey to a Stable Climate


It is particularly important to measure and manage carbon emissions reduction from project activities. The major objective of the Global Carbon Data program is to establish and communicate accurate monitoring algorithms to quantify emissions reduction.  Carbon abatement projects can then qualify for carbon offsets/credits using these standard methodologies.

The program is to offer a centre of excellence for organisations that want to engage in testbeds for specific project types, facilitating the sharing of knowledge and technology for climate change and greenhouse gas emissions reduction. Data can be published to reach governments, NGOs and the corporate sector through a climate-change-specific, cloud hosted knowledge base.

The implementation of Global Carbon Data is being developed around a UML model that provides a data model for Energy and Carbon Emissions generated using MDA transformation from classes to data tables.
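As a sketch of how such a transformation works, the snippet below generates table DDL from a UML-like class description. This is an illustrative stand-in for Enterprise Architect's MDA transform, and the `EmissionReading` class and its attributes are invented for the example:

```python
# Illustrative class-to-table transformation in the spirit of MDA.
# The type map and naming conventions are assumptions for this sketch.

TYPE_MAP = {"int": "INTEGER", "float": "DOUBLE PRECISION", "str": "VARCHAR(255)"}

def class_to_ddl(name, attributes):
    """Generate a CREATE TABLE statement from a (name, [(attr, type)]) class."""
    cols = [f"  {name.lower()}_id INTEGER PRIMARY KEY"]
    cols += [f"  {attr} {TYPE_MAP[py_type]}" for attr, py_type in attributes]
    return f"CREATE TABLE {name} (\n" + ",\n".join(cols) + "\n);"

ddl = class_to_ddl("EmissionReading",
                   [("site", "str"), ("kwh", "float"), ("co2e_tonnes", "float")])
print(ddl)
```

In a real MDA transform the mapping rules (types, keys, associations) are driven by the model itself rather than hard-coded, but the principle is the same: classes become tables, attributes become columns.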

Providing requirements and scenario traceability using UML modelling technology means that resources and effort can be readily managed and monitored during the staged development phases of delivering information about energy and carbon emissions monitoring.

A model-led architecture approach provides for managing the hybrid private cloud deployment, using model-led metadata for both Enterprise Planning and Business Process Automation.


About Global Carbon Data


The program aims to advance climate change and carbon emissions research and development, knowledge and expertise in two main ways:

     To establish an easily accessible and relevant global knowledge base of research, reports and data, using software that can automatically classify information based on context, content and commonality of usage.

     To facilitate and monitor the establishment of pilot projects that collect accurate test data to be used as the basis for emissions reduction metrics. The aim is to facilitate the certification of emissions reduction methodologies, resulting in carbon credits/offsets able to be traded in global carbon markets.

Global Carbon Data online is to share and communicate structured data (carbon metrics, analytics and algorithms), as well as research, reports and analysis of project activities that effect a net reduction in greenhouse gas emissions. High-quality research and reporting provided by international organisations engaged in addressing climate change can be published through the site.

Being able to view the management of the project resources and scheduling directly from a task perspective is useful, particularly if the production of data is also linked directly to delivery of the capabilities that facilitate the business scenarios.

Data is becoming increasingly important in view of advances in cloud security, big data analytics and the proliferation of information gathered from IIoT devices.

When working with a development team spread across multiple locations, it is very useful to be able to share the entire architecture specification from a single model. This not only keeps the focus on delivery; it provides for ready peer review and the cross-pollination of knowledge and technologies.

Energy and Carbon Emissions


Figure 1: Energy and Carbon Emissions Monitoring Model

This approach carries through to the delivery of information to the platform subscribers. The focus on data delivery promotes information services not only for the system builders, but for the system users as well. Collaboration then becomes a simple matter of providing project data, expert knowledge and information in context, and making it readily accessible.

Cloud technology is available to facilitate knowledge sharing across organisations, national and regional boundaries. Connecting people, projects and research ensures the rapid transfer of data over mobile, public and private networks.


Curated Access to Information

The program aims to assist organisations seeking information about climate change by providing ready access to credible sources on greenhouse gas emissions reduction.

Subscribers can access published reports, research and data by project category or research topic, using simple search terms. The knowledge base is to provide highly focused, context-specific information via a specially developed climate change taxonomic search engine.

Taxonomic search can return research, reports, data and project analysis ranked by relevance, regardless of whether the terminology used is "precipitation" or "rainfall", "greenhouse gas emissions reduction" or "carbon abatement", "agriculture for hot dry climates" or "drought tolerant crops". By making the information context sensitive, online search results are far more focused than those of standard text search mechanisms.
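A minimal sketch of the idea follows, with invented synonym groups and documents; the real taxonomy and relevance ranking are of course far richer:

```python
# Toy synonym-expanded ("taxonomic") search. The synonym groups and
# documents below are illustrative examples, not the real taxonomy.

SYNONYMS = [
    {"precipitation", "rainfall"},
    {"greenhouse gas emissions reduction", "carbon abatement"},
]

def expand(term):
    """Return the term together with any taxonomic synonyms."""
    for group in SYNONYMS:
        if term in group:
            return group
    return {term}

def search(query_terms, documents):
    """Rank documents by how many expanded query terms they contain."""
    scored = []
    for doc in documents:
        text = doc.lower()
        score = sum(1 for t in query_terms for s in expand(t) if s in text)
        if score:
            scored.append((score, doc))
    return [doc for _, doc in sorted(scored, reverse=True)]

# A query for "rainfall" still finds a study titled with "precipitation".
results = search(["rainfall"], ["Precipitation trends 2010-2020",
                                "Soil carbon sequestration"])
print(results)
```

The point of the expansion step is exactly the behaviour described above: the query term and the document term need not match literally, only taxonomically.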

Consumers of information, such as government agencies and corporations seeking to reduce emissions, can use a guided process to look up relevant reports, data, research and current thinking for strategic responses to climate change.

The Global Carbon Data program is to publish a knowledge base of climate change and carbon emissions research and data held by leading organisations. Governments and companies wishing to access the data can do so on a fee-for-service basis.

The key to enabling a global carbon emissions knowledge base that shares structured and unstructured data, essential for monitoring emissions across national borders and regional boundaries, is of course a common logical data model. This is also the prime enabler for a stable global price on carbon.

Data for Carbon Markets

The program is to facilitate collaboration across organisations to establish relevant, repeatable emissions reduction methodologies for specific emissions reduction project types. Examples include:

     The collection of data from renewable energy micro-grid electricity generation, to which emissions factor algorithms can be applied to produce accurate estimates of greenhouse gas reductions from replacing existing fossil fuel types with renewable energy.

     The collection of data from land management emissions abatement activities, such as prevention of deforestation by forest edge cultivation, reafforestation of rainforest, restoration of mangroves on sub-tropical coastlines, and soil sequestration of carbon through agricultural innovation.
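For the first project type, the core calculation is simple in outline: apply an emission factor to the renewable generation that displaces fossil fuel generation. The factor below is a hypothetical placeholder, not a value from any accredited methodology:

```python
# Hedged sketch of a grid-displacement abatement estimate.
GRID_EMISSION_FACTOR = 0.8  # t CO2e avoided per MWh displaced (hypothetical)

def abatement_tonnes(renewable_mwh, factor=GRID_EMISSION_FACTOR):
    """Estimate t CO2e avoided by renewable generation displacing fossil fuel."""
    return renewable_mwh * factor

print(abatement_tonnes(1250))  # 1250 MWh of micro-grid generation, ~1000 t CO2e
```

Real methodologies replace the single constant with measured, fuel- and region-specific factors, which is precisely why accurate pilot data matters.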

The data is to be collected, analysed and made available online as required to authorised and authenticated parties using secure networks. Data collected from project activities can be certified by carbon offset standards bodies to earn carbon credits.

By collecting accurate data from a wide range of project activities, the intention is to help facilitate a stable global carbon price, essentially by improving the accuracy of emissions reduction data supplied to the burgeoning global carbon markets.

Value-added services for the program include the application of accurate, appropriate greenhouse gas emissions reduction algorithms to project activities, and online access to analysis, infographics, and time series and location-based data analysis.



Monday, 30 September 2013 06:27

Global Energy Exchange - Data Without Borders




For harmonisation of data, models are absolutely essential, enabling technology that is more automated, more adaptable to change, flexible and future-proof. A model-led implementation of energy transmission data is an effective way to ensure a standard approach to energy monitoring and marketing, and Sparx Systems Enterprise Architect hosts the Utilities Common Information Model.

Generating messages about electricity supply and energy markets, using a combination of code and data generation from an enterprise model linked to the Utilities CIM, an OMG DDS implementation, and standard geospatial models, is an efficient way to deliver large volumes of energy data messages about supply and markets across networks with cloud technology.
Interoperability is, after all, the holy grail of Smart Grid.
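The data-centric messaging style referred to here, as standardised by OMG DDS, can be caricatured in a few lines: publishers write samples to named topics in a shared data space, and subscribers receive them. This is a toy in-process stand-in for illustration, not the DDS API:

```python
# Toy "global data space": topic-keyed publish/subscribe in one process.
from collections import defaultdict

class DataSpace:
    """A miniature data space keyed by topic name (toy DDS stand-in)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, sample):
        # Deliver the sample to every subscriber of this topic.
        for callback in self._subscribers[topic]:
            callback(sample)

bus = DataSpace()
received = []
bus.subscribe("GridFrequency", received.append)
bus.publish("GridFrequency", {"region": "EU-West", "hz": 49.98})
print(received)  # [{'region': 'EU-West', 'hz': 49.98}]
```

DDS adds what the toy omits: typed topics generated from a model, quality-of-service contracts, and interoperable wire protocols across distributed nodes, which is what makes it relevant to Smart Grid data volumes.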

The deployment of real-time Smart Grid operations can provide information flow about energy supply and demand amongst end-users, electricity generators, and the electricity grid. Most countries are upgrading their transmission and distribution networks to be ‘Smart Grids’, with data from SCADA, OMS and EMS systems, and increasingly from the next generation Wide Area Measurement Systems (WAMS). This represents a unique opportunity to provide real-time geospatially connected data for a wide range of metrics and calculations from power generation, transmission and distribution, including energy and carbon trading, associated greenhouse gas emissions, and energy efficiency savings.

New technology initiatives can be very costly, with long lead times, and significant lag times for implementing change. Smart Grid costs can be significantly reduced, and flexibility can be significantly increased by sharing technology deployments.

As the energy industry landscape is transformed, there is an urgent requirement to standardise data formats for exchanging information on electricity and transport consumption and markets. There is no reason that there cannot be global automation of energy data exchange.
The Utilities CIM is an abstract UML model that provides coverage for data elements in the Utilities industry: objective, informational and enumerative. It can be extended to cover Wide Area Measurement and synchrophasor metrics.

Today's ICT technology is increasingly automated from UML models, yet connections are lacking between models of physical elements, such as the CIM energy management and electricity supply domain, and common ICT technology, such as messaging and analytics. This gap means that significant cost and implementation complexity must be incurred for further Smart Grid automation, with web workflows, data integration, event processing, data-centric messaging and so on.

To decrease risk, there has to be an increase in the accuracy of human, machine, and financial resource estimates, for an ICT enabled Smart Grid. A standard interoperability metamodel can provide a global context for information delivery. Exchange of energy management and market data is key to managing energy supply efficiencies across network operator regional boundaries.

Energy data exchange can be effected by current, best practice ICT technology, built from a platform independent metamodel, to bridge the gap between energy metrics, real-time decisioning and energy market data flows.

A common model can provide the bedrock for the algorithms and definitions required to synchronise real-time metrics. Data models (such as the Utilities CIM), can provide the reference metadata for interface mappings in a real time geospatial context.
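The idea of a common model as reference metadata for interface mappings can be sketched as follows. Each operator's local field names map onto one shared vocabulary; the operator names and field names here are illustrative assumptions, not actual CIM class attributes:

```python
# Sketch: a shared model as the reference vocabulary for interface mappings.
# Field names below are invented for illustration, not real CIM attributes.

OPERATOR_MAPPINGS = {
    "operator_a": {"mw_out": "ActivePower", "lat": "Latitude", "lon": "Longitude"},
    "operator_b": {"active_pwr": "ActivePower", "y": "Latitude", "x": "Longitude"},
}

def to_common(operator, record):
    """Translate an operator-local record into the shared vocabulary."""
    mapping = OPERATOR_MAPPINGS[operator]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = to_common("operator_a", {"mw_out": 512.0, "lat": -37.8, "lon": 144.9})
b = to_common("operator_b", {"active_pwr": 512.0, "y": -37.8, "x": 144.9})
assert a == b  # two local formats, one comparable common record
```

The design point is that each operator maintains one mapping to the common model, rather than pairwise mappings to every other operator.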

Model-led deployment of technology to provide energy data and information technology workflows can deliver supply and demand efficiencies for both energy and carbon trading, across time zones and national boundaries, by reducing the costs of integration through harmonised data elements.

Synchronisation of data definitions, through adherence to a specific model can optimise technology services by simplifying the number and scope of systems interfaces.
For each network operator, interoperability of data can be facilitated by translation into common formats, using web interfaces and workflows for automated mapping.

Geospatial information, and cross referencing of data and standards, plays a fundamental part in addressing semantic information flows. Location can provide a key to referencing metadata. A location centric repository can provide a register for network assets, as well as data structures, transformation algorithms and messaging subscriptions.

Connecting energy management and market models with geospatial models, in an information delivery context, enables operational and business intelligence from Smart Grid network devices to be mapped geographically.

Energy geospatial overlays provide an opportunity for utilisation of CEN/ISO/OGC standards, and emerging geospatial practice, further enabling data exchange among relevant energy stakeholder groups, able to share network and market intelligence without delay. Correctly implemented, operational intelligence, geographically mapped, can become widely accessible, simplifying and clarifying the processes of energy and carbon trading across borders, for current players, as well as the increasing numbers of renewable energy generators.

In addition to well-publicised areas such as energy efficiency, grid health and demand management, one aspect of the energy industry transformation that cannot be overlooked is the potential to collect accurate data for monitoring the world's CO2 emissions from electricity consumption. Improvements can be made in the quality of the energy data to which carbon emission monitoring algorithms are applied.
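One such monitoring algorithm, in outline, multiplies generation by fuel-specific emission factors. The factors below are rough illustrative values, not authoritative figures:

```python
# Sketch: CO2 from a generation mix, using per-fuel emission factors.
# Factors are rough illustrative values (t CO2 per MWh), not official data.
EMISSION_FACTORS = {"coal": 1.0, "gas": 0.5, "wind": 0.0, "solar": 0.0}

def grid_co2_tonnes(generation_mwh):
    """Total CO2 for a reporting interval, given MWh by generator type."""
    return sum(EMISSION_FACTORS[fuel] * mwh for fuel, mwh in generation_mwh.items())

interval = {"coal": 300.0, "gas": 200.0, "wind": 150.0, "solar": 50.0}
print(grid_co2_tonnes(interval))  # 300*1.0 + 200*0.5 = 400.0 t CO2
```

With real-time Smart Grid data, the same calculation can run per interval and per region, which is what makes accurate generation-mix data so valuable for emissions monitoring.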

Energy supply and demand efficiency must accommodate distributed, renewable energy generation - not only from new renewable energy generation plants, but also industry and households. This means that real-time supply and demand data access will be a critical factor to enable energy efficiency by transmission and distribution network operators. To meet the threat of a world energy crisis, operational processes have to be fully automated, based on real-time energy information made accessible across national boundaries to meet the information requirements of all stakeholder groups.



Regional, national and international energy markets are challenged by the inclusion of large scale renewable energy sources into the grid. Accelerating consumer supply and demand also stresses the ability of the current infrastructure to meet future energy needs. Accuracy can be improved by utilising Smart Grid operational data collected in real time, and these new challenges have a temporal urgency, in view of UN IPCC climate change reports.

The best way to proceed is to start with standards that are already in place, then extend them to provide a common terminology, not only for energy efficiency, but also energy and carbon trading markets. This represents a real paradigm shift in current thinking and practice.
Smart Grid interoperability cannot be defined simply by a data model and web services. It has to be demonstrated in the context of a cross-border energy exchange of high speed, real-time metrics.
The breadth of application of synchrophasor technology extends not only to network stability monitoring and demand management, but also to post-event analysis. Smart computing devices attached to ICT networks can provide pre-processing of energy transmission data not only for energy management processing systems, but also for energy market technology systems.

Fossil fuel and renewable energy source metrics are important variables in carbon trading, as well as pricing in energy markets. Geospatial analysis of energy utilisation, pricing, and CO2e emissions reduction insight can be made accessible from post event analysis of real-time data.

This information is of interest, not only to energy industry stakeholders and governments, but also to industry and households, as everybody has to participate in energy efficiency, supply and demand, and carbon emissions reduction, to achieve climate change mitigation.
Integrating energy information with geospatial infrastructure means being able to integrate geographically diverse information from both electricity and ICT networks, efficiently and successfully. This is only possible if the industry has a common parlance for automated data distribution, and workflow-synchronised, authorised collaboration and co-operation for sharing energy intelligence across organisational and national boundaries.

Smart Grid and ICT operations platforms can provide cost-effective energy efficiency, and energy pricing data, in a geospatial context, as well as carbon emissions reduction metrics. In fact this is essential to meeting the new situation of wide area energy transmission, consumer energy generation, and CO2 reduction targets.

Time is running out for addressing global energy financial and environmental costs. To proceed, it is critical that there is no Tower of Babel. Now is the time to decide on common standards and terminologies. And even more importantly, now is the time to recognise that an efficient and effective technology for an automated energy data exchange has to be an international collaboration.

The core ICT technology required to mobilise a cost-effective Smart Grid is a common, semantics-based integration of energy data with complex event and web content delivery technology, providing data-centric, high speed, distributed, real-time information access. Agreement on semantics cannot be the province of standards bodies alone. It is too complex for theoretical approaches to produce a completely useful set of semantics. Semantics have to be developed by applying best practice technology in a pilot phase.
Standards-based Smart Grid/ICT can be agreed upon by Energy and ICT stakeholders to develop a common energy model exchange technology approach, in view of the EU's "An Energy Policy for Europe", the North Sea SuperGrid, and the fact that industrial energy utilisation currently accounts for the majority of all global greenhouse gas emissions.

There are any number of approaches to providing ICT solutions to integrate Smart Grid information with geospatial and organisation data for operational, business and market purposes. If these solutions are not synchronised, there can be no effective information exchange across regional boundaries.

An efficient and cost effective solution has to discover, test and automate a common exchange model as the basis for generation of Smart Grid events triggering post event analysis workflows and transactions for an interconnected energy market.

Historically, excessive ICT costs have been incurred by information gathering from semantically incompatible data.

The characteristics of technology effecting a common exchange of data based on common models have to be:
1. Able to connect different technologies across Smart Grid and ICT in real time;
2. Able to be easily deployed on current technology integration platforms, networks and infrastructure clouds;
3. Able to use, and translate to and from, common vocabularies and protocols;
4. Able to access and utilise standard geospatial data overlays and infrastructure.



According to the European Union’s “An energy policy for Europe" legislation [1], energy accounts for 80% of all greenhouse gas emissions in the EU, one of the acknowledged factors contributing to the development of the current energy policy.

The amendment to Regulation (EC) No 1228/2003, on conditions for access to the network for cross-border exchanges in electricity, is changing the nature of energy transmission. Part of European energy policy is the role of ICT (Information and Communication Technologies) in facilitating the transition to an energy-efficient, low-carbon economy.

The US Department of Energy's endorsement of the "National Transmission Grid Study", part of which is "Ensuring the Timely Introduction of Advanced Technologies" [2], promotes the modernisation of America's electricity infrastructure, one of the Department's top priorities.

The NIST (National Institute of Standards and Technology) US Recovery Act information has targeted funding in the area of energy, environment and climate change. Named sub-topics include research on measurement technologies to accelerate the deployment of Smart Grid to the US electric power system, and research to develop advanced measurement capabilities to monitor greenhouse gas emissions [3]. Innovations in energy storage technologies [4], the European North Sea renewable SuperGrid [5], and the European Commission energy target of "20% renewable energy by 2020" [6] have dramatically increased the significance to transmission network operators of Smart Grid and associated ICT technologies for including renewable energy generation on a large scale.

To accommodate these developments, it is essential to deploy Smart Grid operational event data, geospatially connected by common semantics and standards-based, interoperable ICT technologies.

Standards, Protocols and Terminologies

The work of CEN/TC 287 (European Committee for Standardization: Technical Committee 287) [7] has ensured interoperability between the ISO [8], OGC [9] and INSPIRE [10] geospatial data and frameworks. The ICT industry not-for-profit consortium, the Object Management Group, has developed an approach known as Model Driven Architecture (MDA) [11], supported by the Unified Modeling Language (UML) specification [12], to help automate and integrate information across diverse formats, applications and technology platforms.

The Telecommunications Shared Information and Data model (SID) [13] is an industry model that has been constructed with deployment in mind. The Utilities CIM, incorporating the IEC standards 61970 and 61968 [14] is a UML 2.1 Platform Independent Model to describe the current state of Energy Generation, Transmission and Distribution - “Unlike a protocol standard, the common information model (CIM) is an abstract model that may be used as a reference, a category scheme of labels (data names) and applied meanings of data definitions, database design, and a definition of the structure and vocabulary of message schemas. The Utilities CIM also includes a set of services for exchanging data called the Generic Interface Definition (GID).” The GID specifies how to exchange data, while the RDF CIM is an XML version of the model, useful for system interaction. Both are designed to facilitate information exchange between Smart Grid and ICT technologies.
The OMG DAF (Data Access Facility) Specification[15], designed to address Utility Management Systems, mobilises the RDF version of the CIM.
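To make the RDF CIM mention concrete, the fragment below shows how a CIM-style RDF/XML snippet can be read with standard tooling. The snippet is deliberately simplified and invented; real CIM RDF uses the full IEC schema and namespaces:

```python
# Reading a simplified, invented CIM-style RDF/XML fragment.
import xml.etree.ElementTree as ET

RDF = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                  xmlns:cim="http://iec.ch/TC57/CIM#">
  <cim:Substation rdf:ID="_sub1">
    <cim:IdentifiedObject.name>North Substation</cim:IdentifiedObject.name>
  </cim:Substation>
</rdf:RDF>"""

root = ET.fromstring(RDF)
CIM = "{http://iec.ch/TC57/CIM#}"  # namespace prefix in ElementTree notation

names = [sub.findtext(f"{CIM}IdentifiedObject.name")
         for sub in root.findall(f"{CIM}Substation")]
print(names)  # ['North Substation']
```

This is why the RDF serialisation is described as "useful for system interaction": model elements become machine-readable records that any XML/RDF-capable system can consume.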

The OMG DDS (Data Distribution Service) and DDSI (Data Distribution Service Interoperability) specifications [16] provide for a distributed global data space for high speed messaging and semantic interoperability over a wired network protocol. A geospatial context for Energy Smart Grid data is essential to take advantage of technology advances such as WAMS: "In Europe, the SmartGrids initiative is developing a roadmap to deploy the electricity networks of the future, and WAMS is one of the key technologies considered. In the U.S., the Electric Power Research Institute (EPRI) runs an initiative aiming to provide a technical foundation to enable massive deployment of such concepts (Intelligrid)" [17]

The European Commission geospatial standard INSPIRE [18], provides not only geospatial data and infrastructure, but also a community geoportal is currently being developed, to provide real-time geospatial context.

What is required is a simple standard way to deploy the energy models such as the Utilities CIM in a geographically distributed Long Term Evolution ICT network context, with automatic semantic translation, on a technology platform capable of high speed complex event processing, to meet the current legislative, operational, business and market challenges for Trans European Energy Networks. This interoperability standard has to apply to the performance of the integration technology, as well as the common data service domains, keys and topics for energy management and market data.



New ICT paradigms have emerged in recent years, including “cloud” infrastructure grids, automated semantic data, web federated and low latency real-time messaging services, smarter wireless devices, next generation IP networking, federated web content delivery and complex event processing.

The context over the next five years is rapid development and growth in Energy Smart Grid technology, renewable power sources, energy trading, and the linking of real-time grid data into the carbon trading market. These radical transformations will be accelerated by consumer participation in energy supply and demand, and new legislative requirements for energy efficiency, carbon and energy markets.

What is required to meet this growth is a semantic core technology that supports the current and emerging standards of high speed event technology automation, is able to connect with a huge diversity of new and existing Smart Grid and ICT technology and, very importantly, supports the existing and emerging ICT standards relevant to Energy Generation, Transmission and Distribution. At this stage, there is a type of modeling technology that supports this architecture: UML-integrated Model Driven Generation (MDG) Platform Independent Modeling, supporting standards in architecture, business process automation, systems engineering, event processing and data distribution. However, there are few players in this space, and even fewer supporting the particular requirements of an Energy Model Exchange technology.

UML is the de facto standard used by most technology suppliers to deploy business workflow, web information access, real-time complex event handling, and data and message integration on a distributed technology grid, governed by current and emerging Energy and ICT standards and models.



Energy exchange model technology requires complete UML/MDG automation into a high speed event context. Data exchange has to encompass and automate not only energy data elements, but also the new energy efficiency, energy trading and carbon trading semantics that are only now being identified. A metamodel has to be a Smart Grid/ICT deployable exchange model that is more than just the canonical model of energy elements that, for example, the Utilities CIM provides.

The technology has to provide for ICT data distribution and event handling services metadata, as well as additional semantics for energy trading, efficiency, and CO2 emissions. This metamodel has to be able to logically connect all elements from Smart Grid and ICT deployment in a real-time network context.

This marks the real challenge for interoperability of existing utilities ICT systems, e.g. load management, transmission and distribution, metering and billing applications, with new energy intelligence, including demand management, energy efficiency and trading markets.
The timeframe to meet EU energy policy and legislative directives about renewable and sustainable energy, energy efficiency, security of supply, and technology and innovation, is quite short.



Efficient, geospatially aware energy management requires model technology automation of high speed real-time event handling in a multi-party semantic communication framework in the context of distributed, allocate-to-order real-time infrastructure performance. An energy model exchange technology has to go to the next stage of automation evolution. It has to presume a multi-way teleportal communications paradigm shift, with all parties as energy suppliers, and all parties as energy consumers, to facilitate energy and carbon markets.

The energy Smart Grid also has to meet many of the ICT challenges of IP V6 Long Term Evolution networks, requiring an advanced flexibility and growth capability, able to connect and upgrade smart telemetry device interactions, capable of adding new functionality without redevelopment.
What is required to meet these challenges is a distributed, model-led, semantic integration capability based on a deployable, common, geospatially aware exchange model that connects the elements of the CIM, databases, web services and messages. These characteristics have to be put to work in the energy Smart Grid context of complex operational event processing and business integration services, deployed on scalable network and server infrastructure. In addition, information has to be accessed and exchanged by different stakeholder groups, organisations and businesses.



Interface, communications and data standards and infrastructure already exist, developed by various global standards organisations; an energy model exchange technology can therefore readily be achieved with a collaborative, concerted effort by Energy and ICT stakeholder organisations.

In terms of finding a modeling technology capable of meeting the trans Europe energy challenge, Sparx Systems technology is a leader in the support for open technology standards and semantics, particularly those standards that would enable a high speed Energy Model Exchange, with existing support for the Utilities CIM, as well as OMG MDA and DDS.



Standardisation of geospatial information and infrastructure is a significant advance for the cause of interoperability in Energy Smart Grid/ICT. It is important for energy artefacts to be included as standard overlays within the INSPIRE, ISO and OGC initiatives. The next step is to ensure that geospatial data is integrated via the common semantic data exchange outlined above. Geospatial integration raises the subject of distributed data processing, able to provide global geospatial context with local maps and information.



An energy model exchange technology, developed collaboratively, amongst stakeholders, is a cost effective way to deploy an evolving Energy Smart Grid.

Most importantly, an extension to the Utilities CIM could provide standard interfaces to real-time energy data, providing cross-domain operational and business metrics that currently do not exist. This would allow carbon emissions monitoring algorithms, for the estimation of greenhouse gas emissions avoided by electricity generation and efficiency projects, to be shared across the industry for emissions trading schemes, consumer pricing incentives and so on.

Last, but not least, data inaccuracy puts downward pressure on price in carbon markets. Current estimation techniques and algorithms may be improved by gathering sample data from electricity network pilots. Data can be aggregated to provide emissions from energy generation by generator type on a near real-time basis. Energy market data can also benefit, by virtue of the fact that emissions trading data produced by energy exchange technology can be extended to provide automated transaction updates across geographic and national boundaries in real time.

Once a real-time trading market on greenhouse gas futures from electricity transmission has been established, it is a fairly short path to collecting data for transport fuels, and then industrial processes, given the work done by the Intergovernmental Panel on Climate Change to develop accurate emissions calculations algorithms.
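The IPCC-style calculation pattern for fuels is, in outline, activity data multiplied by an emission factor. The diesel figures below are approximate illustrative values, not official factors:

```python
# Sketch of a fuel combustion estimate: activity data x emission factor.
def combustion_co2_tonnes(fuel_kl, energy_gj_per_kl, factor_kg_co2_per_gj):
    """CO2 from combusting fuel_kl kilolitres of a fuel (kg converted to tonnes)."""
    return fuel_kl * energy_gj_per_kl * factor_kg_co2_per_gj / 1000.0

# Roughly 38.6 GJ/kL energy content and 70 kg CO2/GJ for diesel (approximate
# illustrative values only).
print(round(combustion_co2_tonnes(10, 38.6, 70.0), 2))
```

Because the same pattern covers electricity, transport fuels and industrial processes, extending a data platform from one sector to the next is largely a matter of sourcing the activity data and the certified factors.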

A pilot project for electricity transmission and market data from renewable energy sources is an obvious starting point.



[1] EU Legislation, "An Energy Policy for Europe".
[2] US "National Transmission Grid Study", presented by Energy Secretary Spencer Abraham to the President of the USA, May 2002.
[3] US NIST (National Institute of Standards and Technology) website, Recovery Act.
[4] Jon R. Loma, "The Challenge for Green Energy: How to Store Excess Electricity", Business & Innovation Energy Science & Technology North America, 13 Jul. 2009.
[5] Press release reported by The Guardian, on the Germany, France, Belgium, Netherlands, Luxembourg, Denmark, Sweden, Ireland and UK governments, 3 Jan. 2010. http://www.guardian.co.uk/environment/2010/jan/03/european-unitesrenewable-
[6] European Commission Energy, Energy Policy.
[7] CEN, European Committee for Standardization.
[8] International Organization for Standardization, ISO 19115, Geographic Information.
[9] The Open Geospatial Consortium, Inc. http://www.opengeospatial.org/
[10] Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007, establishing an Infrastructure for Spatial Information in the European Community (INSPIRE). http://inspire.jrc.ec.europa.eu/
[11] Object Management Group, Model Driven Architecture. http://www.omg.org/mda/
[12] Object Management Group, Unified Modeling Language (UML) specification.
[13] TM Forum Information Framework (SID).
[14] Utilities Common Information Model (CIM) User Group.
[15] Utility Management System (UMS) Data Access Facility Specification.
[16] Object Management Group, Data Distribution Service and Data Distribution Service Interoperability specifications. http://www.omg.org/spec/DDS/ http://www.omg.org/spec/DDSI/
[17] Albert Leirbukt, Ernst Scholtz, Sergiu Paduraru, "Taming the Electric Grid - Continuous Improvement of Wide-Area Monitoring for Enhanced Grid Stability".
[18] European Commission INSPIRE Geoportal.
[19] National Rural Electric Cooperative Association, "MultiSpeak".

Thursday, 26 September 2013 00:24

Energy & Greenhouse Gas Reporting in the Cloud

Trac-Car has built a reporting solution in Sparx Enterprise Architect for monitoring greenhouse gas emissions under Australia's National Greenhouse and Energy Reporting scheme.

The solution has a UML model at its core, which has been used to generate a database that connects to data from electricity, transport and other industrial activities as it becomes available from meters, databases and spreadsheets.

With workflow and technology connectors, a global cloud solution can facilitate low-cost, location-based energy and greenhouse gas monitoring, while organisations have immediate access to their carbon footprint.

The model is freely available for download.

A worked demo for Eureka Works, Australia Corp, a manufacturing facility with a transport fleet, is available for viewing.

Business situation


As companies seek to comply with greenhouse gas emissions reporting regulations, there is a clear challenge: avoiding the yearly pain of intensive manual information gathering from multiple sources, including spreadsheets, databases, meters and the output of statistical reports.

A UML model can be used to semi-automate cloud-hosted software services. It is possible to set up an automated reporting system once, so that the information required for CO2e emissions by energy source is at hand throughout the year.

Most of the software services depend on having a robust, tried-and-tested information model, and the time saved by automated generation from the UML model is huge, making the development of the reporting capability a lot simpler and more cost-effective.

View an HTML version of the UML Energy Reporting Domain Model

Energy GHG Reporting Domain Model


Figure 1: Energy Reporting Domain Model - top level

Technical situation

The cost of manual and semi-automated aggregation of greenhouse gas emissions data is high, because it requires a lot of human resources to process.

This type of activity is error prone, and one of the reasons for the lower-than-expected price of CO2e emissions is that markets dislike inaccuracy. (CO2e stands for CO2 equivalent: a calculation that standardises the warming effect of the full suite of greenhouse gases, such as methane and nitrous oxide, relative to CO2. CO2e is generally priced per metric tonne.)
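For readers unfamiliar with the CO2e conversion just described, a minimal sketch: each gas is weighted by its global warming potential (GWP) relative to CO2. The GWP values below are approximate 100-year figures used purely for illustration; a real reporting system would take them from the applicable regulatory methodology.

```python
# CO2e conversion sketch: each gas is weighted by its 100-year global
# warming potential (GWP) relative to CO2. These GWP values are
# approximate and illustrative only -- regulatory methodologies publish
# the authoritative figures.
GWP_100 = {"co2": 1.0, "ch4": 28.0, "n2o": 265.0}

def co2e_tonnes(emissions_by_gas):
    """Convert {gas: tonnes emitted} into total tonnes CO2e."""
    return sum(tonnes * GWP_100[gas]
               for gas, tonnes in emissions_by_gas.items())

# 100 t CO2, 2 t methane, 0.1 t nitrous oxide:
total = co2e_tonnes({"co2": 100.0, "ch4": 2.0, "n2o": 0.1})
print(total)
# 100*1 + 2*28 + 0.1*265 = 182.5 tonnes CO2e
```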


The setup and configuration of the data interfaces is always going to be the biggest problem to solve. A UML information model can ensure the desired results through a back-tracking problem solving methodology. 

The Energy Reporting Model was developed from the specifications published for the Australian government's National Greenhouse and Energy Reporting scheme and the Energy Efficiency Opportunities reporting requirements.

The following steps describe building a web reporting facility directly from the model, using the Sparx EA code-generation facility for databases.

1. Build a UML model of the required reports.

2. Generate a data definition language (DDL) script for the target database directly from the model.

3. Build the target database using the script and deploy it to a database instance on the cloud (I used Postgres database services hosted on the Amazon cloud).

4. Analyse the current data collections and define the interfaces required to populate the reporting database from the input data sources (the example uses spreadsheets and databases, and Postgres SQL to import the data).

5. Develop the SQL to generate the data for the required reporting staging tables from the input sources.

6. I then used two additional cloud-hosted services:

a. RunMyProcess hosted workflow/automation to build data-collection user interfaces for manual input of some of the energy data, written to the Amazon cloud database.

b. Zoho Reports provided the detailed reports, accessing the Postgres data from Amazon and updated using a Zoho scheduling capability. (Zoho has an excellent reporting capability, in my view the best of their hosted offerings, and I used it to build the final tables, graphs and drill-downs.)
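Steps 2 to 5 can be condensed into a small sketch. The code below uses Python's built-in sqlite3 as a stand-in for the cloud-hosted Postgres instance, and the table and column names are hypothetical; in the real solution the schema is generated from the UML model.

```python
# Condensed sketch of steps 2-5: run a generated DDL script, import
# input rows, then populate a reporting staging table with SQL.
# sqlite3 stands in for the cloud-hosted Postgres instance; all table
# and column names here are hypothetical.
import sqlite3

# Steps 2/3: a DDL script (of the kind Sparx EA generates) builds the db.
ddl = """
CREATE TABLE energy_usage (
    facility TEXT,
    fuel_type TEXT,
    usage_gj REAL
);
CREATE TABLE emissions_report (
    facility TEXT,
    fuel_type TEXT,
    total_gj REAL
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# Step 4: input rows as they might arrive from spreadsheets or databases.
rows = [
    ("Eureka Works", "electricity", 120.0),
    ("Eureka Works", "diesel", 40.0),
    ("Eureka Works", "electricity", 30.0),
]
conn.executemany("INSERT INTO energy_usage VALUES (?, ?, ?)", rows)

# Step 5: SQL that populates the reporting staging table from the inputs.
conn.execute("""
    INSERT INTO emissions_report (facility, fuel_type, total_gj)
    SELECT facility, fuel_type, SUM(usage_gj)
    FROM energy_usage
    GROUP BY facility, fuel_type
""")
for row in conn.execute(
        "SELECT fuel_type, total_gj FROM emissions_report ORDER BY fuel_type"):
    print(row)
# ('diesel', 40.0) then ('electricity', 150.0)
```

Against Postgres the flow is identical, with the connection and SQL dialect swapped accordingly.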



Practitioners have found that simply monitoring energy usage significantly reduces carbon footprint for a very low cost compared with other methods, and consequently lowers the price organisations have to pay for emitting greenhouse gases.

It makes sense, in terms of lowering the cost of reporting, to automate processes that must be repeated annually to comply with legislation and reporting regulations. Payback for automation can be immediate, as the ICT costs are quite modest.

In the next iteration of the GHG Reporting solution, I am building real-time access to metered data collections, accessed from the RunMyProcess cloud. Sparx EA generated code (e.g. Java, PHP or C++ as required) serves as a basis for aggregating metered data into electricity time-of-use daily, weekly, monthly and annual summaries that update the database to a predefined schedule for secure web reporting. I am also designing a simple mobile app to provide graphical, location-based energy usage alerts.
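The time-of-use aggregation described above might look something like the following sketch, written in Python rather than the generated Java/PHP/C++; the band boundaries and the reading format are assumptions for illustration.

```python
# Sketch of aggregating metered readings into daily time-of-use
# summaries (peak vs off-peak). Band boundaries and the (timestamp, kWh)
# reading format are illustrative assumptions.
from collections import defaultdict
from datetime import datetime

def tou_band(hour):
    # Assumed bands: 07:00-22:59 is peak, everything else off-peak.
    return "peak" if 7 <= hour < 23 else "off_peak"

def daily_tou_summary(readings):
    """readings: iterable of (ISO timestamp, kWh) -> {(date, band): kWh}."""
    totals = defaultdict(float)
    for ts, kwh in readings:
        t = datetime.fromisoformat(ts)
        totals[(t.date().isoformat(), tou_band(t.hour))] += kwh
    return dict(totals)

readings = [
    ("2013-09-26T02:30:00", 1.0),  # off-peak
    ("2013-09-26T09:00:00", 3.5),  # peak
    ("2013-09-26T18:15:00", 2.5),  # peak
]
print(daily_tou_summary(readings))
# {('2013-09-26', 'off_peak'): 1.0, ('2013-09-26', 'peak'): 6.0}
```

Weekly, monthly and annual summaries follow the same pattern with a coarser grouping key.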

Sparx EA 

The greenhouse gas reporting solution used Sparx Enterprise Architect to build the reporting model and to generate the DDL scripts that build the Postgres database. I was able to modify the model many times, then regenerate the databases automatically to reflect changes as I improved the model. This was a huge saving of time and effort, and is one of the strengths of Sparx EA for building solutions.

Other Software and Services 

The Sparx EA scripts generated the Postgres databases with very little additional coding required. This was very useful, as Postgres has a strong cloud presence and is supported by all the other cloud-hosted software services I needed to build the reports. I used Amazon as it was easier to set up than the HP offering, which is geared to large installations. The RunMyProcess and Cloudbees cloud-hosted software offerings are easy to use, and of course offer JDBC connectors to Postgres.



Model-led development is a very successful method of building ICT reporting systems. Once the model is stable, and data is aggregated into a standard RDBMS, the same model can be used to generate the starter code for building specific interfaces from a variety of data sources and formats, in any of the common coding languages supported by Sparx EA. I am looking forward to further automating the integration of energy data interfaces using cloud-hosted software services.

A worked example using the methodology outlined in this case study used Microsoft Excel spreadsheets and Postgres databases as input data sources, mapped to a Postgres Energy Reporting database scripted using EA’s code generation capabilities. 


Wednesday, 23 January 2013 05:24

The Value of an Enterprise Information Model


An enterprise architecture reduces the cost of operations through reuse of standard technology, application, data and network components.

However, silos of information in specialist corporate IT teams are not making it through to a central knowledge base, meaning that function and data are repeated in a string of parallel universes that are not congruent, with synergies that remain largely unexplored.

Enterprise architectures are often seen as not addressing the coalface reality of reuse and accessibility of data and services.

An Enterprise Information Model can address the gap between business and technology owners, practicality and principle, concept and delivery.

Deploying from an Enterprise Model

Metadata links most of the common Enterprise Planning and Business Process Automation domains:

Customers are linked to products, enterprise asset management is referenced geospatially, and of course organisation data provides the basis for linking web interfaces to identity and access management systems. 

Workflows start or are started by scheduled and triggered events, and web content/assets can be linked and reused by multiple workflows.

An Enterprise Model can be deployed from a set of domain classes.  Relevant metadata databases can be directly generated, and runtime objects can also be generated from the same Platform Independent Model in a specific technology, for example .Net, Java, XML.  Code stubs can be deployed by common development IDEs to complete the build of runtime systems.
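As a toy illustration of model-to-platform generation, the sketch below mechanically transforms a platform-independent class description into SQL DDL. The type-mapping rules are simplified assumptions and do not reproduce Sparx EA's actual MDA transforms.

```python
# Toy model-to-DDL transform: a platform-independent class description
# is mapped to a platform-specific artefact (SQL DDL). The type mapping
# is a simplified assumption, not Sparx EA's actual MDA transformation.
TYPE_MAP = {"String": "TEXT", "Integer": "INTEGER", "Decimal": "NUMERIC"}

def class_to_ddl(name, attributes):
    """attributes: list of (attribute_name, uml_type) pairs."""
    cols = ",\n".join(
        f"    {attr} {TYPE_MAP[uml_type]}" for attr, uml_type in attributes
    )
    return f"CREATE TABLE {name.lower()} (\n{cols}\n);"

ddl = class_to_ddl("Customer", [("name", "String"), ("account_id", "Integer")])
print(ddl)
```

The same class description could feed other transforms (Java classes, XML schemas) from the one Platform Independent Model, which is the point being made above.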

This is an enormous saving in architecture, design and development processes, leaving people free to invest in newer technologies and innovation, rather than having to iteratively process, develop and deploy. 

From concept to code is getting ever closer.  With the advent of quantum computing, the recognition of algorithmic patterns that have quantum mechanical analogs is key to moving ICT to the next level.


Models and Modelling

Models are the ideal way to commence the process of pattern recognition for ICT deployment.

Although every organization has a unique set of processes, and a unique language for describing those processes, in essence there is an enormous degree of commonality. While the subject areas involved in a business are published as conceptual models, cohesion outside of the project is rarely considered.

An opportunity is missed, as subject area models can be readily mapped to standard business domains. 

Accessibility of information complies with all the ‘nice to have’ enterprise principles covering information architecture, SOA and reuse of existing capability.

An Enterprise Information Model serves as a natural method of documentation for business domain functionality at an enterprise level.

While design principles are useful, they are no substitute for a coherent model.

An Enterprise Information Model can be used to support design and development.

For business users, a model supports the basic concepts, and provides a common language with which to communicate with designers and developers. At the operational level, a model can be built in such a way as to support decision making, by communicating performance information (Business Activity Monitoring) to business owners.

An EIM has to be clearly stated and understandable to the system owners, builders and users.


Enterprise Information Management

A centrally managed model can be version controlled.

Business Domain UML Models can be deployed as a metadata database to enable cohesion across business areas, by deployment as an integration data hub as well as a publication mechanism for data services. Well constructed, a Business Domain Model can generate databases and engineer code that provides a starting point for the implementation of data, XML and web services, implementation classes and procedures for deployment (allied with the principles of MDA).

Business Process Models (in BPMN or UML) and User Interaction Models can be mapped to the business domains, providing not only a shared knowledge base, but also a ready conversation between business analysts and developers.

Enterprise data can not only be explored and managed to avoid duplication; an Enterprise Metadata Model can also facilitate integration. Data can be shared (except where exempted by security policy) through metadata management. Mapping physical data to a metadata database provides a publication mechanism and the basis for messaging, both data-centric and middleware style, using RDBMS SQL, SPARQL for the semantic web, or XQuery for XML data collections.
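A metadata catalogue of this kind can be sketched very simply: logical entities are mapped to the physical sources that publish them, together with the access mechanism. All names below are hypothetical.

```python
# Minimal sketch of a metadata catalogue mapping physical data sources
# to logical model entities, as a basis for publication and messaging.
# Every entity name, source path and URL here is hypothetical.
catalogue = [
    {"logical_entity": "EnergyUsage",
     "physical_source": "postgres://ops/energy_usage", "access": "SQL"},
    {"logical_entity": "EnergyUsage",
     "physical_source": "http://example.org/energy.rdf", "access": "SPARQL"},
    {"logical_entity": "Vehicle",
     "physical_source": "fleet.xml", "access": "XQuery"},
]

def sources_for(entity):
    """All physical sources publishing a given logical entity."""
    return [m["physical_source"] for m in catalogue
            if m["logical_entity"] == entity]

print(sources_for("EnergyUsage"))
```

A query like this is what lets one logical entity be served from relational, semantic-web and XML back ends at the same time.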

This promotes both federated and consolidated data integration, with a clear separation from application services and processes, facilitating optimum database infrastructure performance. It also facilitates the development of service taxonomies for ESB, SOA and other integration platforms.

Enterprise Architecture Process

Architecture governance is as important as program governance. An enterprise architecture process is as essential as a project management process, providing the structure, timeline and delivery framework to meet release target dates.

Using an Enterprise Information Model as the knowledge repository for a team-based approach to ICT project delivery allows for a clear delineation of responsibility, in the context of technology delivery prerequisites and dependencies.

This means an evidence based approach for project estimation based on architecture, design and development project deliverables, traceable to business requirements and business priorities.

It also means that business and technology specialists can share a perspective,  collaborate on the implementation of ICT systems, on time and on budget,  with a good appreciation of the whole project, and the part that every role plays in the delivery.

View a worked example of an Enterprise Information Model


Author: Nya A Murray, Chief Architect, Trac-Car.

Date: 23rd January 2010

Website: http://www.trac-car.com



 1  Executive Summary


Corporations and governments around the world are starting to collect information from telemetry devices, in the fields of


1. Water supply

2. Transport

3. Electricity

4. Manufactured goods

5. Agriculture


Valuable information can be produced from telemetry data, about resource utilization and associated carbon emissions in all of these fields.


Climate change action is an imperative. Telemetry data is a critical input into providing accurate global environment information, and the carbon emissions associated with individual developments.


 1.1 Emissions Trading


Emissions Trading Schemes, as set out in the Kyoto Protocol, rely on market mechanisms of cap and trade to regulate carbon emissions. Different mechanisms are being used in different parts of the world, probably the most developed being the European Union ETS. In addition, the Kyoto Protocol allows for two types of initiatives that developed countries can undertake in developing countries: the Clean Development Mechanism (CDM, projects approved by the UNFCCC) and Joint Implementation (JI) projects. One Emissions Trading Unit (ETU) is equivalent to a 1 tonne reduction of CO2 emissions.


There are a few emerging problems with cap and trade schemes, such as:

  • Estimation techniques carry significant uncertainty.
  • Many emitters and industries are not covered by the schemes.
  • Carbon leakage: a net increase in carbon emissions because of different pricing and regulation between countries, of which carbon polluters take advantage.


There are no guarantees that these schemes will be effective in the actual measurement and reduction of carbon emissions into the earth’s atmosphere because, practically speaking, it is impossible to fully regulate a carbon market within the timescales required to be sure of limiting temperature rises to an acceptable range.


With the lesson of the failure of self-regulation by financial markets still fresh, it may well prove too risky to allow market forces to regulate carbon emission reduction.


A better strategy for carbon emissions governance is to selectively apply a regime of carbon emissions monitoring, particularly in the energy and transport industries, encouraging engagement and participation by all sectors of the corporate community. This may serve to actively promote a popular culture of carbon emissions reduction, by offering carbon emission reduction discounts.


 1.2 Real-time monitoring and carbon discounts


By monitoring events that emit CO2, such as power consumption and transport, at an industry and household level, carbon reduction pricing incentives may help ensure there is enough popular support for emissions limits to be effective in limiting temperature rises.


Carbon taxes, discounts, and cap and trade schemes will all work more effectively if end users are involved in the monitoring mechanisms. Some of the obvious methods are:

  1. Monitoring energy utilization in real time and rewarding use of renewable energy with discounts.
  2. Road pricing based on distance driven, with discounts for greener cars, particularly electric vehicles (where emissions would be accounted for by energy supply utilization).
  3. Estimation and calculation of the carbon emissions used to produce goods traded or sold, and new buildings and industrial developments.


This information could be published, and used as the basis for estimating carbon taxes. 


The success of monitoring global carbon emissions reduction depends on a mechanism for ensuring that knowledge, experience and expertise in carbon emissions monitoring is shared, not only locally, but nationally and internationally. 


 1.3 The Role of Logical Models


To achieve this, it is important that a standard approach is developed to the collection of carbon monitoring data.


A modeled,  knowledge-managed approach to the integration of real-time data on which carbon emissions can be calculated or estimated is key.  Data could be collected from distributed sources, supplied by collaborative stakeholders based in multiple locations.


A common information model for key industries, such as Energy and Transport, is essential to ensuring that data can be aggregated from communities, regions, states, nations, and internationally.


Information analysis of telemetry and other metric data could take advantage of low cost information technology advances in dynamic software, hardware and network infrastructure.


And these carbon emission data collections could be shared on an industry, company and even household basis, via web portals. 


 1.4 Integrate with existing technology


A particularly effective place to start would be power utilization, as this is where consumers could be given a choice of power source. Suppliers could offer discounts for use of renewable energy, creating incentives for moving away from fossil fuel power generation. Discounts and consumption levels could be viewed by customers over web portals.


This would provide economic incentives for encouraging the production of renewable energy grids, such as the  current initiative for a clean energy super grid by Europe's North Sea countries (able to store power, effectively forming a giant renewable power station). 


Telemetry devices are monitoring electricity consumption, transport mileage and water resource utilization, to name a few. The quantity of information to be collected will expand exponentially over the next few years.


A knowledge-managed approach, based around standard industry models, can be implemented on a centralized basis. Improvements in carbon emission methodologies could be communicated across the usual national and corporate boundaries, being made available by federated web-content management.


 1.5 Carbon Emissions Monitoring Model


As well as agreements such as Kyoto, information technology has to play a vital role to ensure that information is easily accessible and able to be aggregated across diverse industries and infrastructure.


A core logical model that is deployable in a data integration technology, able to map current technology deployment patterns to information analysis of carbon emissions is a key component of international industry collaboration for publishing and sharing of information about carbon emissions.


As knowledge and experience on carbon emissions monitoring increase,  optimal access to aggregated data in near real time will become possible, enabling immediate access to resource utilization and associated carbon emissions information, on demand.


It is essential to employ a common information technology model to map with current industry models, to collect data in real time.  Without a core logical model, federating web content from different suppliers, governments, and consumers becomes too expensive and too difficult. 


It cannot be said often enough that a Carbon Emissions Monitoring Model is a key participation component for effective short-term deployment of systems to monitor, calculate and price carbon emissions.


 1.6 Model-Based Deployment


Time is running out on climate change action. It is absolutely essential, in the short term, to begin the process of providing accurate metrics and statistics on carbon emissions, to enable assessment of targets, which must be lowered over time to meet accelerating risk.


Device technology is advancing rapidly.  Telemetry devices can be remotely configured, and new information captured and processed, as it becomes available and useful. It is very important to make information access easy.


Data can be sent over any network, e.g. electricity supply, WiFi, UMTS, internet, corporate WAN/LAN, and aggregated centrally to provide dashboard-style reporting. These technology capabilities can be met by leading technology suppliers today.


Technology suppliers have to be capable of distributed, change-enabled, monitored, dynamic technology services, based on core logical models that can map to standard industry models. 


Real-time cumulative costs, carbon emissions and resource accounting information could be displayed to registered users through a portal.


Data that is mapped to the Carbon Emissions Monitoring Model could be aggregated across regions, industries and other groupings to provide, for example, daily carbon emissions from power and transport.


This addresses the real challenge for a Carbon Emissions Trading scheme, ‘carbon leakage’ (the increase in CO2 emissions in some countries as a result of decreases in others).


Accurate power and transport real-time monitoring data can be aggregated, and extrapolated  to provide carbon emissions statistics.


A Carbon Emissions Monitoring Model would be a starting point to ensure that telemetry information, which can provide accurate emissions data, can be linked with other carbon metrics readily and rapidly.

Carbon Emissions Sub Domain 

 1.7 International Information Exchange


It is timely to envisage international exchanges of information for carbon trading, energy pricing, and inputs into climate change models, made securely and readily available to all user types, corporate, government and individual consumers.


A short-sighted approach to the technology applied to emissions trading schemes will be a huge inhibitor to accurate carbon emissions reduction monitoring.  Near enough is not good enough. The risks of inaccurate monitoring associated with inaccurate estimation methodologies are too high, in the context of the uncertainties of temperature rises and climate change that accompanies increasing levels of greenhouse gases in the atmosphere.


Global collaboration must be built into the monitoring mechanisms from inception.


Carbon monitoring initiatives should be designed to utilize a best-of-breed technology approach, with the emphasis on carbon emissions measurement to replace estimation methodologies.


It is clear that the current emissions trading schemes cannot be relied upon to accurately estimate or forecast the temperature rises that produce climate change and the extreme weather events associated with rising greenhouse gas emission levels.


 1.8 Utilization of Existing Technology


There are a number of technology options for gathering information from telemetry data collected in real time from telemetry devices, to be aggregated with estimated carbon emissions to provide a fuller picture of carbon emissions reduction.

Whatever technology is deployed in collecting data, it is the data integration that is critical to the success of providing emission information.


For data integration to be cost-effective and able to use any technology, the most important factor is to ensure that industry models can be mapped to a deployable logical model for the collection of real-time emissions event data.


Once the sample size is large enough, the real-time data collections could be used to extrapolate carbon emissions in climate change models.



 2  Opportunity to Real-Time Monitor CO2 Emissions


As people realize the importance of monitoring the world's resources, many new initiatives from governments, corporations and the community are emerging, with varying-quality or no carbon emission estimates.


There is a pressing requirement to provide accurate measurements and calculations, which would be facilitated by a systematic approach to information delivery.


Around the world, provision and planning has to be made for basic telemetry metrics, covering personal and industrial water utilization, energy usage by source, transport, agriculture production, and other carbon emissions.


 2.1 Technology Already Exists


A collaborative effort by national and international stakeholders could provide a universal capability to aggregate and present accurate carbon emission statistics in near real time, using existing technology components.


The knowledge of how to build a telemetry integration platform out of existing technology could be shared with industry stakeholders.


Information management is currently a critical problem in large organizations. Information has to be accessed and shared across organization and technology boundaries.


Information sharing is critical for the monitoring and management of environmental change.


Carbon emissions monitoring is something that will probably become mandatory through international treaties and agreements within the next couple of years.


Current broadband speeds provide a unique opportunity to facilitate a methodical approach to carbon emissions reduction monitoring, by real-time aggregation of data.


With proper forward planning and technology models, it is possible to model and organize the analysis and aggregation of telemetry data, from many different sources, to begin to understand the real carbon cost of resource utilization, including electricity, water, transport, agriculture and manufacture, to name the major sources of carbon emissions.


 2.2 Federate Information


A federated technology approach, that is, one using currently existing technology components from stakeholder organizations integrated by a central federating capability, could provide a relatively low-cost carbon emissions data analysis and statistics capability.


Information access to just-in-time data will become essential to facilitate international climate change targets and measures to reduce carbon emissions.


A well-planned catalogue of information access requirements, mapped to a central logical carbon emissions model, will provide the basis for information integration technology for local, regional and national governments. This technology can be collaborative and shared across organization boundaries based on user profiles.


 2.3 Central Logical Model


A central logical model is essential to ensure that the common problems of current technology integration deployments are bypassed (such as large cost overruns, failures due to poor architecture, and systematic unresponsiveness to changing requirements). A central model allows for efficient interfaces between unlike systems, and facilitates technology services across organization boundaries.


Forward resource planning, and appropriate business and technical models, have to be identified and determined in advance for an effective solution to sharing information.


A best practices approach would work to ensure that information access services would be governed by measurable Service Level Agreements, and that these metrics would be subject to governance, tracked to supplier delivery.


 2.4 Service Level Agreements


An example of business services to which SLAs apply could be:


  1. Provide telemetry data transmissions over a network every 60 seconds
  2. Provide web portal access to energy usage and associated carbon emissions data
  3. Aggregate emissions reduction data daily by local area, by region and by country
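The first SLA above, a telemetry transmission at least every 60 seconds, is straightforward to check mechanically. A minimal sketch, assuming arrival timestamps expressed in seconds:

```python
# Sketch of checking the 60-second telemetry SLA: flag any interval
# between consecutive transmissions that exceeds the agreed maximum.
# Timestamps as plain seconds are an assumption for illustration.
def sla_gaps(arrival_times, max_interval_s=60):
    """Return the (previous, current) timestamp pairs that breached the SLA."""
    breaches = []
    for prev, curr in zip(arrival_times, arrival_times[1:]):
        if curr - prev > max_interval_s:
            breaches.append((prev, curr))
    return breaches

arrivals = [0, 55, 110, 200, 255]  # one 90-second gap between 110 and 200
print(sla_gaps(arrivals))
# [(110, 200)]
```

Counting such breaches per supplier over a reporting period gives the measurable SLA metric that the governance process can track.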


The federation and knowledge management capability of telemetry integration could be provided as a collaborative project, to optimize delivery efficiency and cost.


Power and transport monitoring components, for example, could be supplied internally or outsourced, based on service level agreements, and shared across industry sectors.


A model-based approach, if implemented correctly, is flexible. For example, when a new energy supplier is added to an energy grid, telemetry data could be automatically updated to reflect the new supply source, and the associated requirements for increased processing capacity and data storage can be automatically flagged to the network and infrastructure technology services.

But this holds only if the model is the basis for the technology design and deployment, as retrofitting models is much less efficient and much more expensive.



 3  Recommendation with Risk Evaluation


The risks for the climate stability of the planet are huge.  Effective governance of large carbon emitters, and cap and trade schemes for carbon pollution will be compromised by carbon leakage in the short to medium term.


It is clearly never too early to initiate a technology platform to improve the accuracy of carbon emission reduction monitoring.


An incremental distributed technology approach is a cost effective way to deliver information that will become increasingly accurate because of a collaborative knowledge base, eventually able to be shared internationally.


Real-time telemetry information can be aggregated with carbon calculation algorithms to provide more accurate information than existing methodologies.


Involvement of the whole community in carbon emissions reduction monitoring is a smart move that will help the large corporate carbon emitters to self-regulate, and reduce carbon leakage through the pressure of public opinion.


Collaborative industry efforts to monitor the sources of carbon emissions in real time are required to limit global temperature rises.


The technology approach, if best practice, will enable knowledge sharing and co-operative technology development.


Using existing technology as components of information integration is clearly cost-effective.



 4  Use of Sparx Enterprise Architect for Technology Models


The most important criteria for building core logical models for carbon emissions monitoring are:

  1. Ability to use UML, the most widely used modelling language
  2. Ability to model from concept to code, i.e.

a)      Business vision, concepts and requirements

b)      Business processes, workflow and role definition

c)      Technology service identification

d)     Technology implementation and deployment

  3. Ability to auto-generate

a)      Deployable logical metadata from information architecture

b)      Multiple language source code, e.g. Java, C#

c)      Multiple class transformations, e.g. XML, WSDL.
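To illustrate the kind of class-to-table transformation meant here, the sketch below turns a UML-like class description into SQL DDL. The type mappings, class name and attributes are illustrative assumptions; Enterprise Architect's own MDA transforms are template-driven and far more capable.

```python
# Illustrative logical-to-physical type mapping for an MDA-style transform.
TYPE_MAP = {"int": "INTEGER", "string": "VARCHAR(255)", "float": "DOUBLE PRECISION"}

def class_to_ddl(class_name, attributes):
    """Transform a (name, type) attribute list into a CREATE TABLE statement."""
    cols = ",\n  ".join(f"{name} {TYPE_MAP[t]}" for name, t in attributes)
    return f"CREATE TABLE {class_name} (\n  {cols}\n);"

ddl = class_to_ddl(
    "EmissionReading",
    [("reading_id", "int"), ("source", "string"), ("mwh", "float")],
)
print(ddl)
```

The same class description could drive generators for Java, C# or XML schema, which is what makes a single model usable across organization boundaries.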


These capabilities are key to providing information across organization boundaries, because they provide a common communications framework which can be versioned and change managed for collaborative projects, enabling synchronous deployment across multiple stakeholders on shared infrastructure.


To cover the scope of highly complex technology modeling, domain models provide a catalogue for technology deployment patterns.


A useful categorization is to distinguish common technology infrastructure from specialist applications:


  1. Technology Application Domains
  2. Technology Platform Domains


Doing so provides a starting point for the reuse of technology patterns, and also provides a coherent theme for technology integration.


Figures 2 and 3 illustrate examples of these domains.


Sparx Enterprise Architect is a technology of choice for the particular circumstances of collaboration on Model Driven Architecture (http://www.omg.org/mda/) amongst business and technology stakeholders from multiple organizations. It is readily available, easy to deploy, cost-effective and, compared with other products, easy to use.


The example figures used in this paper are extracts from the Telemetry Services UML Model (http://www.trac-car.com.au/Telemetry Services Model Version 4.0 HTML/index.htm).

Example Application Domain Model 

Figure 2:  Example Application Domain Model

Example Technology Platform Domain Model 

Figure 3: Example Technology Platform Domain Model



 5  Implementation Overview and Accountabilities


The implementation of a flexible, change-enabled telemetry integration capability is undertaken as a planning process based on a central set of models, themselves subject to change management and governance.


A programme plan could be developed as a result of a collaborative feasibility study, with the organization accepting responsibility for co-ordination of stakeholder effort. This process would be expected to take 6-12 weeks, depending on the scale and complexity of the technology capability involved, and the level of commitment available from stakeholders.


An initial proof-of-concept could be deployed, building selected services from the model to an agreed plan. Subsequent deployment would be incremental, as resources become available.


A business and technology governance core group would be required to ensure adherence to agreed standards and processes.


A good place to start might be a proof-of-concept based on telemetry metrics of power transmission, by energy source, by power supplier.
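Such a metric amounts to a group-by aggregation over telemetry readings. A sketch, with invented supplier names and field names:

```python
from collections import defaultdict

# Illustrative telemetry readings; field names are assumptions.
readings = [
    {"supplier": "GridCo", "source": "coal", "mwh": 40.0},
    {"supplier": "GridCo", "source": "wind", "mwh": 10.0},
    {"supplier": "PowerCo", "source": "coal", "mwh": 25.0},
]

# Roll transmitted power up by (supplier, energy source).
totals = defaultdict(float)
for r in readings:
    totals[(r["supplier"], r["source"])] += r["mwh"]

print(totals[("GridCo", "coal")])  # 40.0
```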


This would involve definition of Service Level Agreements for the required service performance and other metrics, and selection and evaluation of suppliers against those criteria.



Full technology deployment could be achieved with the advice of appropriate business and technology subject matter experts.


Accountabilities are largely predicated on having a strong and viable governance mechanism, administered by a core steering group, facilitated by specialist knowledge, and automated by event processing, data integration and web federation technologies.


The following table outlines the basic steps and accountabilities required to deploy a carbon emissions monitoring integration capability.


Human and financial resources could be distributed across participant organizations.




Step | Accountabilities | Time Estimate (days)
-----|------------------|---------------------
Feasibility Study | Telemetry Integration Model implementers; Organisation stakeholders; Technology suppliers |
Strategic Programme Plan | Organisation stakeholders; Programme planners |
Pilot Telemetry Integration Platform | Internal and external technology integration suppliers; Organisation stakeholders; Programme planners |
Full deployment of Telemetry Integration Capability | Internal and external technology integration suppliers; Organisation stakeholders; Programme planners |


 6  Technology Deployment


The best practice approach to building information integration with current technology can be categorized as distributed incremental services, connected by a central federation capability for integration of data, using existing technology.


By implementing data integration and information access against a centralized model, with SLAs and governance for technology suppliers, a reliable set of technology services could deliver data just-in-time for rating, billing, and calculation of discounts for reduced carbon emissions.
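As an illustration of the discount calculation step, the sketch below rebates emissions reduced below an agreed baseline. The rule and the figures are hypothetical, not a methodology from any scheme.

```python
def carbon_discount(baseline_t, actual_t, rate_per_tonne):
    """Hypothetical rating rule: a rebate proportional to verified
    emissions reduced below an agreed baseline (never negative)."""
    reduced = max(baseline_t - actual_t, 0.0)
    return reduced * rate_per_tonne

# 150 tonnes under baseline at an assumed 12.0 per tonne.
print(carbon_discount(baseline_t=1000.0, actual_t=850.0, rate_per_tonne=12.0))  # 1800.0
```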


By sharing technology knowledge and practice, infrastructure clouds and grids could be provisioned to provide the processing power to deliver information in real-time.


 6.1       Timescales


This approach could be implemented by incremental delivery of services. A meaningful output from a proof-of-concept can be delivered within 3 months. A core information federation capability could be delivered within a further 6 months. After one year, distributed information access services would be available to aggregate telemetry information from energy suppliers and to provide a catalogue of carbon emissions model algorithms by industry. Subsequent service delivery would be incremental and relatively resource efficient.


By engaging in a collaborative information access process, essential feedback could be delivered across stakeholder communities, with group notifications based on user profile, promoting knowledge sharing for timely problem solving.


Costs would be contained by stakeholders supplying components from already existing technology.


 6.2 Solution Architecture Assessment


This solution is an emerging incremental approach, designed to address the shortcomings of earlier technology deployments. It is also comparatively low cost, as costs could be shared amongst stakeholder organizations.


It allows for best practice technology components, built by many suppliers, to be linked together into a cohesive whole, based on specific technology services subject to binding supplier service level contracts.


The core of this technology is the federation capability, to bring together the distributed services and data sources.



The basic premise is to utilize standard technology approaches, planning and building incrementally within a methodology which seeks to optimize available technology components, linked by standard interfaces based on coherent information models.


The technology services can be knowledge managed and tracked to business requirements by way of Service Level Agreements based on metrics formed from practical experience.
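One way such SLA metrics could be checked automatically is sketched below. The service name, thresholds and metric fields are invented examples, not values from any actual agreement.

```python
from dataclasses import dataclass

@dataclass
class ServiceLevelAgreement:
    """Illustrative SLA record; thresholds would come from practical experience."""
    service: str
    max_latency_ms: float
    min_availability: float  # fraction, e.g. 0.999

def sla_breaches(sla, measured_latency_ms, measured_availability):
    """Compare measured metrics against the SLA; return the breached metrics."""
    breaches = []
    if measured_latency_ms > sla.max_latency_ms:
        breaches.append("latency")
    if measured_availability < sla.min_availability:
        breaches.append("availability")
    return breaches

sla = ServiceLevelAgreement("telemetry-feed", max_latency_ms=200, min_availability=0.999)
print(sla_breaches(sla, measured_latency_ms=250, measured_availability=0.9995))  # ['latency']
```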


 6.3 Information Management with a Model


Information management of services, interfaces and performance is effected by central models, which also provide knowledge management of collaborative development, tracked through workflows accessed by users with appropriate access privileges.


Collaborative program management and good communication work well with this approach; both are facilitated by a model-based approach to implementation. A model-based approach also enables and facilitates technology change, and interface mappings to a logical model can take advantage of technology that already works well.


Technology service suppliers can be selected more readily on the basis of technology evaluation criteria documented as Service Level Agreements, which reflect performance requirements and other metrics for the type of business service involved.


Secure web access to a knowledge base can be automated by business and technical user profiles mapped to the central model.
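A simple sketch of profile-driven access follows, mapping user profiles to the model packages they may read. The profile and package names are invented for illustration.

```python
# Hypothetical mapping of user profiles to readable model packages.
PROFILE_ACCESS = {
    "business": {"Requirements", "Business Processes"},
    "technical": {"Requirements", "Technology Services", "Deployment"},
}

def can_read(profile, package):
    """Return True if the profile is permitted to read the model package."""
    return package in PROFILE_ACCESS.get(profile, set())

print(can_read("business", "Deployment"))   # False
print(can_read("technical", "Deployment"))  # True
```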


Business requirements can be mapped to technology services to provide traceability and accountability, and can be readily knowledge-managed by model catalogue category.
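Such a mapping can be sketched as a traceability index held alongside the model. The requirement and service identifiers below are illustrative, not taken from the Telemetry Services model.

```python
# Illustrative requirement-to-service traceability records.
trace = {
    "REQ-001 monitor emissions in real time": ["Telemetry Aggregation Service"],
    "REQ-002 calculate carbon offsets": ["Carbon Calculation Service", "Rating Service"],
}

# Reverse index: which requirements does a given service satisfy?
by_service = {}
for req, services in trace.items():
    for s in services:
        by_service.setdefault(s, []).append(req)

print(by_service["Rating Service"])  # ['REQ-002 calculate carbon offsets']
```

Held in the central model, both directions of the mapping stay consistent as requirements and services change.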