Nya Alison Murray

Trac-Car (ICT Architect)
Twenty-five years in the information technology industry: 15 years as a developer and designer, 10 years as a solutions, information and enterprise architect. Trac-Car provides information architecture consultancy and develops enterprise models for information dissemination. Global Carbon Data, an offshoot of Trac-Car, is a cloud-hosted knowledge base that curates climate change information using web search technology. It provides model-based carbon emissions monitoring data to the public, and analyses carbon footprints for interested parties on a subscription basis.
Tuesday, 11 September 2018 11:51

Data Identity Security


We live in an information paradigm. The standard model of physics is being challenged by a data-centric, probabilistic view of the universe. Our deterministic material reality turns out to be a subjective human view of events when seen in the light of advances in quantum theory, fractals and chaos mathematics.

Automated industrial and personal computing applications are now pervasive in the fabric of human society, in both the developed and developing world, and they are all built of blocks of data. We have to evolve our concept of data to recognize that information technology is now a virtualization of information services and of networked devices of all kinds. It consists merely of bits of information, independent of the wires, chips and electronic components that over the next decade will probably be replaced by quantum computers of a very different physical design.

Recent heightened public awareness of data privacy, triggered by electronic security failures, is an opportunity to redefine the view that data is merely an electronic representation of information. Security breaches have raised the alarm; consequently we must track the data lifecycle more effectively. The simplest solution may be to manage data holistically, from inception to end-of-life.

There is a clear requirement to standardise and categorise data so that we can continue to evolve our technology to meet the challenges of global information dissemination for the exchange of scientific, humanitarian and world trade data. Advanced UML models and modelling technology are already being used for structured data terminologies. An identity management domain incorporating blockchain as a class model demonstrates that a platform-independent model can easily extend any of the industry common information models to implement nexus identity management.

Data has a very specific meaning and lifecycle in terms of creation, context, transformation and transportation of its representation from one location to another. This entails residency, ownership and, above all, accessibility. Over the past decades, legal frameworks around data custody, intellectual property, management responsibility and data-as-an-asset have developed as part of an evolving set of data practices.

Yet we do not have a complete view of the data lifecycle. Today there is a patchwork of individual standards of governance, usage policies and structural definitions across international jurisdictions for what is now arguably the most valuable asset on the planet. At the same time, the misuse of data is one of the most common crimes in every society. The key to an evolution of global data residency may be a revolutionary international approach to role-based data access control throughout the data lifecycle.

It seems possible that data itself requires an identity: a standard tagging of data elements with category, ownership, authorship, purpose, security classification, and permitted jurisdiction and residency, that can be formed, encrypted, distributed, stored and updated with a secure practice in place of user and application credentials. While this may seem like a complete shift, in reality it may not be so difficult with co-operation and collaboration amongst interested parties.
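One way to picture such a data identity tag is as structured metadata bound to the content it describes by a cryptographic fingerprint. The sketch below uses hypothetical field names (no such standard exists yet) and Python's standard library only:

```python
import hashlib
import json

# Illustrative sketch of a "data identity" tag: the metadata fields proposed
# above, serialized canonically and fingerprinted so that tampering with the
# tag, or a mismatch with the data it describes, becomes detectable.
# Field names are assumptions for illustration, not a published standard.

def make_data_identity(payload: bytes, *, category: str, owner: str,
                       author: str, purpose: str, classification: str,
                       jurisdiction: str) -> dict:
    tag = {
        "category": category,
        "owner": owner,
        "author": author,
        "purpose": purpose,
        "classification": classification,
        "jurisdiction": jurisdiction,
        # Bind the tag to the data it describes via a content hash.
        "content_sha256": hashlib.sha256(payload).hexdigest(),
    }
    # Fingerprint the tag itself over a canonical serialization.
    canonical = json.dumps(tag, sort_keys=True).encode()
    tag["tag_sha256"] = hashlib.sha256(canonical).hexdigest()
    return tag

def verify_data_identity(payload: bytes, tag: dict) -> bool:
    body = {k: v for k, v in tag.items() if k != "tag_sha256"}
    canonical = json.dumps(body, sort_keys=True).encode()
    return (tag["content_sha256"] == hashlib.sha256(payload).hexdigest()
            and tag["tag_sha256"] == hashlib.sha256(canonical).hexdigest())
```

A real standard would add encryption and a registration authority; the point here is only that the tag travels with the data and either survives intact or is provably corrupted.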

There is a proliferation of fraudulent use of data to the benefit of a few and the detriment of most people. The question must be asked, do we require a consolidated global standard for data lifecycle, with a transparent public audit trail of role-based data access, perhaps a blockchain approach to logging all data change transactions?


Timeline in Brief

As information technology went global in the 1980s, there were many initiatives to standardize data across industries such as telecommunications, manufacturing and health, some more successful than others. Currently there are few precisely agreed definitions of common terminologies and concepts for most fields of technology, science and industry, and a large scope for misinterpretation.

With the advent of Web 2.0 at the end of the 1990s, the focus shifted to user-generated content, and with the consequent proliferation of applications for everything by everyone, the momentum for standardization was lost, resulting in the increased heterogeneity, diversity and disparity of data terminologies we experience today. While search capabilities have improved, and context is now an important element of data definition, the sheer volume of information collected by individuals, NGOs, corporations and governments means the signal is being lost in the white noise, for both structured and unstructured data elements and collections.

The security of data is falling and failing. It is true that, given sufficient motivation, financial or political, a workaround can be found for the current generation of security vulnerabilities to protect essential data. But the efficacy of new security measures is ephemeral, effective only until research by well-funded state-sponsored actors and criminal organizations develops a new exploit. This is because we have not addressed the fundamental problem: the basic network and application protocols were designed without security in mind.

The success of fraudulent misuse of data has led to the current situation: a proliferation of security tools and technologies marketed as the answer to data protection, developed in response to security breaches and vulnerabilities. In the words of Symantec, 'We are only one step ahead of the hackers.'

Identity and Access Management

The biggest weakness, the exploit vector that bypasses data security, is identity fraud: counterfeiting identity credentials or access tokens. As all systems of data accessibility depend on access privileges that can be compromised by persistent, carefully planned and patient interception attacks, the race against global fraud is in danger of becoming a lost cause. The success of these attacks is due not only to poor implementation of security standards, but also to the information protocols themselves, developed ad hoc and inherited from an electronic age when networking and hardware were essentially deployed in a hub-and-spoke configuration. Today interconnectivity is decentralized, with potentially global communications across partner organizations. The current generation of security technology is no match for attackers whose funding rivals what large corporations and governments spend on cyber defence.

Identity and Access Management still depends largely on good working practices by responsible people establishing, maintaining and securing access privileges, making use of the excellent advances in cryptography. Given current work practices, with people working on and off site over network connections that are more or less secure, this is hardly sufficient. By analogy, pilot error is still the largest cause of air safety violations, and the same is true of identity management. Human failings aside, the corrupting power of the extraordinary profits from identity fraud is a considerable factor.

Security professionals acknowledge that there is currently no foolproof method of preventing security breaches, and that new variations of old attack methods are constantly surfacing. Even 'Zero Trust' measures, such as virtual variations on the physical isolation of servers, can be compromised over time by capturing identity details, understanding authorization mechanisms and spoofing the authentication credentials of people, roles and applications. At the network layer, forms of single packet inspection to identify communications are innovative and successful within bounds. However, these methods are only as secure as the systems that collect the encapsulated identity data, providing a single point of failure. Once authorization (identity verification) has taken place, there is plenty of evidence of systems being compromised by exploits such as 'golden tickets' and 'golden tokens', giving intruders administrator privileges for network and identity tokens.

If privileged access to highly sensitive or classified data is the basis for data security, violation of trust is bound to increase, as the incentives for malpractice grow.  There are huge profits involved. Global political uncertainties and provocations, aligned with growing international tensions can only lead to increases in attacks on essential infrastructure.

Journey to Data Privacy

All industries hold personal data, and for those to which the European GDPR regulation applies, legal protection is required. The European Commission defines personal data as any information that relates to an identified or identifiable living individual. Different pieces of information which, collected together, can lead to the identification of a person also constitute personal data. This includes technology information such as IP addresses and device identifiers.

GDPR is applicable to organizations processing personal data in the EU, or personal data relating to individuals in the EU, whether the organization itself is inside or outside the EU. Non-compliant organizations may find it more difficult to do business in Europe. GDPR became law in 2016, and on 25 May 2018 its stringent penalties for non-compliance came into effect. There is a wide range of personally identifiable information, including personal demographic, employment, financial, healthcare and social data, that must now be adequately protected under European law.

A better way to provide data assurance and governance, rather than closing a vulnerability after the fact, may well be to develop a data security protocol that is secure by design from the outset, with a focus on protecting the data itself. Can we adopt a data identity standard that mandates practices, protocols and methods of non-repudiation focussed on the stored data representation? Currently the focus is on user applications, the Internet Protocol, and the various integration methods and protocols of network connectivity.

A new data identity protocol could address the entirety of the data lifecycle, including creation, acquisition, encryption, storage and disposal of the data set and its component elements. Standard cryptographic algorithms applied at the data source, distributed to a network of identity providers for non-repudiation, may be a cost-effective improvement in data protection. The current situation is a cycle of proliferating information security tools applied at every stage of application access, integration, network connection and data transportation. These tools and techniques have largely been developed in view of attack vectors that have already been exploited. The cost of securing data has increased dramatically over the past decade.
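One way to picture "applied at the data source, distributed to a network of identity providers" is a quorum of countersignatures over the content hash. The sketch below uses symmetric HMACs for brevity; a real protocol would use asymmetric signatures and registered provider identities, and the provider names here are hypothetical:

```python
import hashlib
import hmac

# Hypothetical sketch: cryptography applied at the data source, countersigned
# by several independent identity providers. Non-repudiation is modelled as a
# quorum of HMAC countersignatures over the content hash.

def source_digest(payload: bytes) -> bytes:
    """Hash computed at the point of data creation."""
    return hashlib.sha256(payload).digest()

def countersign(provider_key: bytes, digest: bytes) -> bytes:
    """An identity provider's countersignature over the source digest."""
    return hmac.new(provider_key, digest, hashlib.sha256).digest()

def verify_quorum(payload: bytes, signatures: dict, provider_keys: dict,
                  quorum: int) -> bool:
    """Accept the data only if enough independent providers vouch for it."""
    digest = source_digest(payload)
    valid = sum(
        1 for name, sig in signatures.items()
        if name in provider_keys
        and hmac.compare_digest(sig, countersign(provider_keys[name], digest))
    )
    return valid >= quorum
```

The design point is that no single provider is a point of failure: corrupting one provider's records leaves the quorum check intact.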

Data Identity Standard

It is time to rethink the paradigm of sensitive, classified data and provide a distributed security context for the data itself, independent of the facilitating technology services. One innovation may be to give collected information an identity: a type of signature that records the registration, authorship, usage, persistence, access, update and disposal of data sets. This accompanying metadata could remain with the data throughout its lifecycle, protected by a distributed chain of transactions that requires the consensus of the network ownership for changes not only to the data, but to the metadata as well.

The technology is readily available, provided it is implemented and deployed as a well-designed public cloud collection and storage mechanism, with a careful use of the currently available set of security mechanisms, cryptography and key management, audited by logging and monitoring services that would be extremely difficult to corrupt, and virtually impossible if corroborated across more than one public cloud audit trail.
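Corroboration across more than one audit trail can be as simple as comparing independently maintained hash chains over the same event stream. This is a hypothetical sketch (real cloud audit services have their own formats and APIs, which are not modelled here):

```python
import hashlib

# Sketch of corroborating audit trails held by two independent operators:
# each keeps a running hash chain over the same sequence of audit events,
# so corruption of one trail shows up as a divergence from the other.

def chain_events(events) -> str:
    """Fold a sequence of audit event strings into a single chain head hash."""
    head = b""
    for event in events:
        head = hashlib.sha256(head + event.encode()).digest()
    return head.hex()

def trails_corroborate(trail_a, trail_b) -> bool:
    """Two trails agree only if every event, in order, is identical."""
    return chain_events(trail_a) == chain_events(trail_b)
```

An attacker would have to alter both providers' trails identically and simultaneously, which is the "virtually impossible" property the paragraph above relies on.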

Industry-specific terminologies have been developed over the past decades; the telecommunications, health and energy industries have developed common models. The Telecommunications Information Framework (SID) provides a reference model and common vocabulary for all the information required to deploy network operations for fixed-line and mobile services. In electric power transmission and distribution, the Common Information Model (CIM) is a standard developed by electric power utilities to allow application software to exchange information about energy networks. Network power and telecommunications data elements are highly sensitive from the point of view of the security of public operations, and as such are obvious targets for disruption by hostile actors. OpenEHR is a specification that describes the management, storage, retrieval and exchange of electronic health records. Patient records contain some of the most important personal data to be protected from security vulnerabilities.

Starting with well-structured industry terminologies, data modelling standards groups could develop recommendations on the classification of data elements, to which a consistent protocol for securing data identity could be applied. This could comprise a range of standard measures including cryptography, content validation and a blockchain of independent identity providers for non-repudiation, with an audit service backed by logging and monitoring capabilities replicated across public cloud providers.

Data Identity Technology

The most effective way to secure information is a combination of physical security, best-practice cryptography and multi-pass verification of identity credentials. Currently standards such as OAuth 2.0 and OpenID Connect are applied to end users and applications for authorization and authentication. There is no real co-ordination of authentication across the network, transport and application layers, meaning that data integrity is only as good as the weakest security measure in the chain of protocols across networking endpoints, the internet (TCP/IP) and applications (e.g. HTTP). End-to-end security is currently not secure by design, but rather the result of security patched onto an older paradigm of applications and data running on physical hardware and local area networks.
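The HS256-style signing behind such bearer tokens can be illustrated with a stdlib-only sketch. This is not a compliant JWT implementation, and the claim fields are assumptions for illustration; it shows only why the claims are exactly as trustworthy as the secret key and every layer that handles it:

```python
import base64
import hashlib
import hmac
import json

# Stripped-down illustration of HMAC-signed bearer tokens of the kind used
# by OAuth 2.0 / OpenID Connect deployments (JWT with an HS256-style MAC).

def b64url(data: bytes) -> str:
    """URL-safe base64 without padding, as used in token encodings."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(claims: dict, key: bytes) -> str:
    body = b64url(json.dumps(claims, sort_keys=True).encode())
    sig = b64url(hmac.new(key, body.encode(), hashlib.sha256).digest())
    return f"{body}.{sig}"

def verify_token(token: str, key: bytes) -> bool:
    body, sig = token.rsplit(".", 1)
    expected = b64url(hmac.new(key, body.encode(), hashlib.sha256).digest())
    # Constant-time comparison avoids leaking the signature via timing.
    return hmac.compare_digest(sig, expected)
```

Anyone who obtains the key can mint arbitrary claims, which is precisely the 'golden token' class of exploit described earlier.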

Blockchain was originally developed as a protocol to timestamp transactions for non-repudiation, and was adapted as the basis of the bitcoin currency in 2008. A blockchain is an append-only set of blocks, each linked to its predecessors by cryptographic hashes and stored as a distributed ledger. By 2018, private (permissioned) blockchains had been adapted to a variety of business uses. Once recorded, the data in any given block cannot be altered retroactively without altering all subsequent blocks, which requires consensus from the members of the chain.
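The hash-linking just described can be sketched in a few lines (illustrative only; real ledgers add consensus, signatures and networking on top of this structure):

```python
import hashlib
import json

# Minimal sketch of a hash-linked chain: each block commits to its
# predecessor's hash, so altering one block invalidates every later block.

def make_block(prev_hash: str, data: str, timestamp: float) -> dict:
    block = {"prev_hash": prev_hash, "data": data, "timestamp": timestamp}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain) -> bool:
    for i, block in enumerate(chain):
        # Re-derive each block's hash from its contents.
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        # Check the link back to the previous block.
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True
```

Retroactive alteration therefore means recomputing every subsequent block, which in a distributed ledger is visible to, and rejected by, the other members.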

Because blockchains use cryptographic algorithms they are compute-intensive, making them suitable for low-volume transactions such as the storage of sensitive or classified data. Any data change transaction would require informed consent from the blockchain members. The distributed data store comprises all the databases maintained by the member network group, meaning there is no centralized distribution point for the sharing and replication of data. A data set could be stored as a distributed record of transactions, broadcast simultaneously to all participants in the network, making misuse of stored data much more difficult.

Identity Blockchain

Figure 1: Blockchain Identity Security Logical View

Such an initiative might be the only way to minimise the proliferation of network security monitoring, from devices over virtual wide area networks to the connections between data centers and public clouds. The catalogue of prevention and detection tools accompanying application security, from mobile application registration and login to integration and application servers, distributed databases and third-party services, could be rationalized. An organization-wide security review could address the problem that all of these measures currently have known flaws and weaknesses, with new vulnerabilities exposed even as existing threat vectors are addressed. The attack surface is simply too large for security assurance to be real.

In Summary

The current paradigm of data in the wild, protected by a patchwork of technology services, some secure, some inherently insecure, has no real future in addressing the global security of data. Security is only as strong as the weakest link in the chain of application and network measures used to protect information. The global regulatory environment is rich in process and poor in compliance, and therefore in security effectiveness.

Data has gone global, yet the definitions of use and abuse of information differ completely from one society to the next. Personal data derived from web behaviours is misused by machine learning algorithms in every industry, including not only marketing, but also finance, social security, defence, civil administration and national security.

Initiatives such as GDPR in the European context and PCI-DSS in the finance industry are a good start, although as yet we have not found an effective method of addressing the root of data misuse: networking and application technologies are inherently vulnerable, and when linked together, even more so. The standards for accountability for data, while worthy, are not working in practice.

To continue to evolve the current generation of technology, a different paradigm is required to resolve this problem, as the level of financial misappropriation and vulnerability of essential infrastructure continues to grow.

All data originates from people in various roles: creator, author, publisher, distributor, manager, buyer, seller or end user. People engaged in data collection are as diverse as members of the public, small business operators, and employees and consultants in public and private organizations. A multitude of technology applications is proliferating around access to data, in the form of identity management and federation, and authorization and authentication of data at rest and in motion. Personal, sensitive and classified data persists and proliferates across networks and databases, with cryptography of varying strength, and all too often in plain text.

While we have many partial solutions to the problem of global data security, residency and accessibility, most technologies have known or potential security vulnerabilities, and when linked together into an end-to-end business technology solution they are insecure by nature. This situation can only intensify given the accelerating trend to network data across on-premises data center infrastructure and private and public clouds, using identity federation, while data is increasingly stored internationally.

Tuesday, 25 July 2017 05:36

Global Carbon Market Model


No matter whether the choice is a carbon tax or emissions trading scheme, a global price on carbon is the best chance for the international community to address climate change successfully to limit global warming to 2 degrees.


Trac-Car is releasing its Global Carbon Market UML Domain Model for Sparx Systems Enterprise Architect Version 13.  It can be freely downloaded, adapted and used.


It is intended for an automated cloud deployment of carbon credits from energy efficiency and renewable energy replacement of fossil fuels for electricity and transport.


Carbon Market Model


A Carbon Credits Market can be based on CO2 emissions reduction algorithms applied to real-time metrics from the national electricity grid. An international carbon market based on accurate metrics can stimulate new investment in emissions reduction activities.


UNFCCC COP21 in Paris in December 2015 adopted the replacement for the Kyoto Protocol to address climate change. Countries agreed on emission reduction strategies that are likely to trigger national carbon pricing regimes.


A major reason for the current low and wildly fluctuating price on carbon, and therefore the low incentive to invest in carbon emission reduction technologies like renewable energy, is the lack of accuracy in estimation of CO2 savings. Markets dislike uncertainty.


A strategic approach can target the largest source of emissions, electricity and transport. Vehicle manufacturers are increasingly investing in electric vehicles, and this further emphasizes the role of the electricity grid in reducing emissions. A country with the market structure and regulatory framework in place is required to provide the initiative to kick start an international carbon market. An accurate emissions reduction measurement regime is key to engaging investors in carbon credits.


A carbon market comprising carbon credits from replacing fossil fuel electricity generation with renewable energy is a compelling place to start, as CO2 emissions reduction can be accurately measured based on real-time metrics from power transmission grids.


A global market can be established, based on a market mechanism, in the first instance applied to CO2 emissions reduction metrics from the replacement of fossil fuel generated electricity with renewable energy. Subsequently other emissions reductions activities can be rated by a rating mechanism indexed to the emissions abatement from fossil fuel replacement at the electricity grid.


Real-time carbon credits, assigned for proven, demonstrable emissions replacement activities, provide the practical context for stimulating investor confidence, as well as a 'gold' standard for all CO2e abatement activities registered with an international standard ratings agency for real greenhouse gas emissions reduction from Earth's atmosphere.


International Carbon Market



The objective is to exchange energy that plays havoc with the climate system and the carbon cycle for energy cycles that do not pour greenhouse gases into the atmosphere.


A simple and effective public policy may be to mandate that national electricity transmission authorities provide a 5-to-10-year plan to reduce fossil fuel electricity generation to zero. However, policy takes time to implement, and in parallel a market mechanism can provide for early adopters of new baseload renewable energy.


As this is liable to be a measure that can only be implemented with an accompanying compensatory mechanism, it is proposed that a reasonable initial price for trading in carbon credits be offered by engaging an international market. With a decent price on carbon, exchanging renewable energy for energy produced from fossil fuels provides a reasonable commercial framework in which the electricity industry can undertake transformation.


In Principle


The principle behind the market is that in many parts of the world there is a regulatory requirement (as well as voluntary activity) to engage in emissions reduction. This means there are buyers for carbon credits that can be used to offset emissions.


Demonstrated emissions reduction activities can provide sellers with an outlet for carbon credits. Being assured of a value in carbon credits, there is an incentive to invest in low carbon technologies to earn the carbon credits that can be offered to international buyers.


Providing a framework around a carbon price in which emissions reduction regimes, including China, India, the US and Europe, can engage can supply the missing piece of the international carbon price jigsaw puzzle. A regulator certifying the accuracy of carbon credits can provide confidence in the carbon market, attract international trade, and provide a strong basis for a stable national carbon price.


In Practice


A prime target for market engagement is the replacement of fossil fuel generators with renewable energy generators. To defray the start-up costs, carbon credits backed by a stable carbon price are an attractive proposition: there is an incentive to invest because there is a return on the investment.

The more participants in a carbon credits market, the better the return for investors in carbon emissions reduction activities. As the market gets going, more activities can qualify for carbon credits, attracting more players into the market.


There is now a sufficient number of potential buyers globally, and sufficient interest in improving the market mechanisms, to set up an international operation. What is required is an accurate measurement system to warrant high quality carbon credits, based on certified, guaranteed emissions reduction activities.

This accuracy is what is currently missing from the carbon credits market, and this is reflected in the fluctuating price. There is a strong latent demand, and an accurate regime for certifying carbon credits can entice participants to invest in the carbon credits market.


Currently these calculations are a manual process performed by governments reporting to the UN. A Carbon Credits Market replacing this process with metrics based on information provided directly by electricity transmission authorities in near real-time can be a success.


Authorities already collect this information as part of electricity pricing, so the technology is already in place to measure the energy generated on a daily basis. Information and Communications Technology can be used to provision data intelligence.


A Carbon Credits Market pilot can be funded by venture capital. Once the pilot proves successful and companies sign up as market participants, the market can be capitalized by a public offering. A Carbon Credits Market would then facilitate participation by both buyers and sellers of carbon credits.

To expand market operations, a Carbon Credits Market can use new investment and profits to eventually include all registered, accurately measured approved CO2 emissions reduction activities endorsed by the UNFCCC and ratified by the IPCC.


A Carbon Credits Market can seek endorsement from the UNFCCC to engage with participants in the successor to the Kyoto Protocol signed in Paris at UNFCCC COP21 in December 2015. The Carbon Credits Market operations can provide the technology, emissions monitoring and data insight to participating parties, issuing one carbon credit certificate for every metric ton of demonstrated CO2 emissions reduction.

The objective is to provide a high standard for carbon credits, approved by the Intergovernmental Panel on Climate Change (IPCC), in a global market that brings carbon credit buyers and sellers together. Carbon credits registered with the Carbon Credits Market (CCM) can be traded internationally, with the carbon price regulated by supply and demand.


The market is expected to operate on a commission basis, in addition to membership fees and charges. Once the market is established, an initial public offering can be made to potential investors so that shares in the CCM can be publicly traded.


While the initial metrics can be established with the replacement of fossil fuels for electricity, by engaging with the IPCC, a set of governance mechanisms can be established. Assessments can approve carbon reducing activities based on standard metrics, using expertise in emissions monitoring from specialist organizations, universities and research institutes.


Re-afforestation projects in South America and South-East Asia, crop replacement soil regeneration, industrial waste, and ocean plastic disposal could all be targeted as measurable emissions reduction activities qualifying for the Carbon Credits Market. 


This market can provide the missing link between government regulation and the lack of certainty for a private sector that wants to develop clean technology, currently lacking in investment confidence.

An accurate carbon credits market can bridge the transition gap as the world changes the way it conducts business from placing no value on natural resources and the biosphere, to a new paradigm of incorporating the thinking of sustainable development into corporate sector investment.


A program engaging appropriate stakeholders can demonstrate the demand for a market for carbon credits, and establish a base price. The waters can be tested and interest gauged from potential market participants and investors, in parallel with a pilot program to set up the market and the carbon credits trading mechanism.


The market is to operate by publishing and offering carbon credits initially based on real-time data collected from the electricity grid. Bids can be entered for metric tons of CO2e reduction from renewable energy replacing fossil fuels. The lifetime of an emissions reduction activity can be established in order to determine the period of eligibility over which to calculate the carbon credits. For example, replacement of fossil fuel energy with renewable energy may be deemed valid for five to ten years from commencement, depending on factors such as the cost of the technology and the payback period for the investment.
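The eligibility-window idea can be sketched as a simple accrual calculation. The window length and the linear accrual model are assumptions for illustration, not a published methodology:

```python
from datetime import date

# Illustrative sketch: carbon credits accrue only while an emissions
# reduction activity is inside its eligibility window (e.g. 5-10 years
# from commencement, as suggested above).

def credits_for_period(annual_tco2e: float, commenced: date, asof: date,
                       eligibility_years: int = 5) -> float:
    """Credits earned so far, capped at the end of the eligibility window."""
    end = date(commenced.year + eligibility_years, commenced.month,
               commenced.day)
    effective_end = min(asof, end)
    if effective_end <= commenced:
        return 0.0
    years = (effective_end - commenced).days / 365.25
    return annual_tco2e * years
```

A real methodology would also account for degradation of the asset and revisions to the baseline it displaces.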



A Global Carbon Credits Market can provide a standard carbon pricing mechanism. This would enable the corporate sector to develop a low carbon economy, independent of domestic and regional carbon market bias. The market can be developed on distributed cloud technology using a UML Global Carbon Market Model.  The objective is to increase the accuracy of carbon metrics, and provide an international standard for monitoring emissions reduction.


Initially, data about power transmission by generator type can be collected from Transmission Network Operators, both in real-time, and from forecast demand schedules. Data can be collected from scheduled transmissions, as well as from smart grid network devices to supply near real-time metrics of supply of renewable energy generation replacing what was formerly supplied by fossil fuel energy generation.


The metric tons of CO2e reduction can be calculated using standard IPCC algorithms, and the emissions reduction, translated into carbon credits, can be offered to buyers on the open market.
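As a hypothetical illustration of such a calculation (the emissions factors below are placeholders for the sketch, not published IPCC values; one credit is taken as one metric ton of CO2e):

```python
# Assumed grid emissions factors, tCO2e per MWh generated, for illustration.
EMISSIONS_FACTOR_TCO2E_PER_MWH = {
    "coal": 0.9,
    "gas": 0.4,
}

def credits_from_displacement(renewable_mwh: float,
                              displaced_fuel: str) -> float:
    """Carbon credits earned when renewable generation displaces a
    fossil fuel, at one credit per metric ton of CO2e avoided."""
    factor = EMISSIONS_FACTOR_TCO2E_PER_MWH[displaced_fuel]
    return renewable_mwh * factor
```

In the scheme described here, the MWh figure would come from the transmission operator's real-time metrics and the factor from the IPCC-endorsed methodology for the displaced plant.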


A pilot for the carbon credits market can be set up in parallel with a replacement renewable energy generation capability engaging a regional Transmission Network Service Provider to replace fossil fuel electricity generation supply with a local renewable energy plant.


Liaison with the IPCC can provide the basic algorithms to apply to the data gathered from the TNSP. Based on the energy metrics, business rules can be applied to automate the emissions calculation algorithms.


A national electricity regulator can be engaged to monitor and manage the regulation of the electricity metrics from the electricity grid.


Specialists with knowledge of IPCC algorithms and emissions factors can be engaged to determine the rules for carbon pricing for the Carbon Credits Market. Automated computing systems can be established to apply the business rules for carbon credits prior to offering them online to buyers.


As well as earning carbon credits for replacing fossil fuels with renewable energy, carbon credits can be bought by companies obliged to demonstrate emissions reduction at the going carbon price.  This approach would attract customers with all types of emissions regulations, both taxes and emissions trading.


Data gathering mechanisms can provide a sufficient level of accuracy for market spot pricing, and the accuracy can only increase as electricity monitoring and demand forecasts improve. This provides an enormous downstream opportunity to apply carbon emissions monitoring algorithms to demand and consumption data, providing carbon pricing discounts on the spot to wholesale and retail buyers of electricity. This approach is an organic way to encourage companies with an innovative approach to carbon emissions reduction.


In parallel, there is a market incentive to develop accurate monitoring mechanisms for other carbon emissions reducing activities that would be fostered by the presence of a Carbon Credits Market.

The establishment of a pilot program to set up the market mechanisms, and engage in a proof-of-concept with the electricity industry and the UNFCCC can be conducted over a year, including evaluation of the outcomes and promotion to investors, and market participants.


Market operations can then be fully deployed, with appropriate improvements based on the experiences of set up and deployment of the technology, and engaging with stakeholders.


The initial carbon trading stakeholders are the current electricity generators and transmission authorities. They already have real-time metrics in place for demand management; the electricity market is a zero-sum game in which supply has to meet demand, and algorithms can be applied to these metrics.



Investment is required to start the ball rolling, and private and institutional investors are to be approached to provide the initial finance.


The policy mechanism has to be determined, and policy specialists engaged to form the policy and lobby for support and adoption through the channels for public policy legislation.


In parallel, market expertise has to be engaged to establish a clearing mechanism for trades in carbon credits.  The processes themselves can be set up from scratch using specialist knowledge of market operations and information technology.


Cloud based Information and Communications Technology (ICT) can supply a visual interface on a public website monitoring the replacement of fossil fuel electricity, and automatically calculating and displaying carbon credits from replacing fossil fuel energy with renewable energy. This provides some interesting opportunities for engagement with new parties interested in reducing emissions, as well as providing public interest information.


At the outset, electricity market regulatory bodies can be engaged to promote the market to the electricity industry, and this, allied with a reasonable initial price, can ensure the ongoing stability of the market.

Electricity authorities can accurately provide provable metrics of electricity generated from renewables to replace electricity generated from fossil fuels. This certainty and the accuracy of the data is the fillip required to stimulate investor interest.


The UNFCCC and the IPCC have already developed algorithms for calculating CO2 emissions reduction per tonne. Other verified emissions reduction activities can be registered for carbon credits, and monitoring mechanisms developed, so that all major sources of carbon credits can eventually be traded in a global carbon marketplace.


The challenge is to replace fossil fuel electricity with renewable energy. Political and economic barriers can be overcome by legislation. Electricity networks can roadmap decommissioning of fossil fuel generators. It simply requires political will.


CO2e Emissions Reduction

The United States is the second largest emitter on the planet, topped only by China. US statistics are readily available, and they are cited here as an indication of the potential for global emissions reduction:

In 2014, 4,093 billion kilowatt-hours of electricity were consumed in the United States. About 67% of the electricity generated came from fossil fuels (coal, natural gas, and petroleum).


Assuming an average attrition rate of fossil fuel generators from the electricity grid of 5% per annum, savings after 5 years would be of the order of 520 million tonnes of CO2e emissions. These figures can be extrapolated into a global context.
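The arithmetic behind a figure of this order can be sketched as follows; the blended fossil-mix emission factor of 0.76 tCO2e/MWh is an assumption introduced for illustration, not a figure from the source.

```python
# Hedged back-of-envelope check of the ~520 Mt figure. The average
# fossil-mix emission factor (0.76 tCO2e/MWh) is an assumed blended
# coal/gas/petroleum value, not an official statistic.
us_consumption_twh = 4093          # 2014 US electricity consumption
fossil_share = 0.67                # share generated from fossil fuels
attrition_per_year = 0.05          # fossil generators retired per annum
years = 5
factor_t_per_mwh = 0.76            # assumed average fossil-mix factor

fossil_mwh = us_consumption_twh * 1e6 * fossil_share
# After 5 years of 5% attrition, ~25% of fossil output is replaced;
# the annual emissions saving at that point is roughly:
annual_saving_mt = (fossil_mwh * attrition_per_year * years
                    * factor_t_per_mwh / 1e6)
print(round(annual_saving_mt))     # prints 521, i.e. ≈ 520 Mt CO2e/year
```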


Of course the basic assumption is that there is a strong initial carbon price set by an international regulator, and that market players do not sabotage a reserve price.


A second factor in assuring emissions reduction is public opinion. Ongoing public campaigns put pressure on global governments to make fossil fuel electricity politically unpalatable.


Assuming participation by electricity stakeholders, an international carbon market is viable. The costs of renewable generators can be defrayed by a stable carbon price attractive to investment by electricity market stakeholders.


Other emissions reduction activities would augment the level of emissions reductions over the same period.


Key benefits

Electricity stakeholder expectations can be met by providing a financial incentive to develop baseload renewable energy, which in turn stimulates the investment in renewable energy and storage technology.

A Carbon Credits Market technology provides for the new carbon pricing regimes being mandated in every part of the globe.


Lower electricity costs result from increasing demand for renewable energy generators, which in turn reduces technology costs and stimulates improvements.


The market can provide further economic stimulus in all technologies that are deemed to reduce the planet's carbon footprint.


Market redeemable carbon credits can provide a competitive pricing regime for energy consumers, as well as a robust market for investors who want to finance low carbon technologies.



Short term, a pilot project can be delivered within one year. By year five, a significant number of regional electricity grids can reasonably be expected to participate in a Carbon Credits Market, given that strong global support for an international carbon pricing regime is expected to emerge from the new UN Kyoto Protocol replacement agreement to be signed in Paris in December 2015.


By year three, the ratification of governance mechanisms to address the other major sources of CO2 emissions, transport and deforestation, can be standardized. Over subsequent years, activities from agriculture, ocean regeneration, and plastic replacement can be included as standard measured ways to reduce emissions.


In 20 years' time, 100% of energy has to be renewable, with technologies in place to stabilize the carbon cycle at manageable levels, to have any chance of averting critical loss and damage, large-scale disasters, and catastrophic climate change that would put the world on a footing of survival management. Otherwise, reduction in energy consumption would come by attrition, as social breakdown of infrastructure is bound to result.

By 2100 we have to have contained temperature rises to around 2 degrees.



IPCC Fifth Assessment Report


Changes in the pattern on Earth’s temperature rises - NASA


Carbon reservoir - World Ocean Review


What if we burn all the fossil fuels? - Stanford


Energy Model Exchange Technology for Energy and Carbon Market Efficiency – Trac-Car






Journey to a Stable Climate


It is particularly important to measure and manage carbon emissions reduction from project activities. The major objective of the Global Carbon Data program is to establish and communicate accurate monitoring algorithms to quantify emissions reduction.  Carbon abatement projects can then qualify for carbon offsets/credits using these standard methodologies.

The program is to offer a centre of excellence for those organisations who want to engage in testbeds for specific project types, facilitating knowledge and technology sharing of climate change and greenhouse gas emissions reduction. Data can be published to reach governments, NGOs and the corporate sector through a climate change specific cloud hosted knowledge base.     

The implementation of Global Carbon Data is being developed around a UML model that provides a data model for Energy and Carbon Emissions, with data tables generated from classes using MDA transformation.

Providing for requirements and scenario traceability using UML modelling technology means that resources and efforts can be readily managed and monitored during the staged development phases of delivering information about energy and carbon emissions monitoring.

An architecture model-led approach provides for managing the hybrid private cloud deployment, using model-led metadata for both Enterprise Planning and Business Process Automation.


About Global Carbon Data


The program aims to enable climate change and carbon emissions research and development, knowledge and expertise in two main ways.

     To establish an easily accessible and relevant global knowledge base of research, reports and data, using software that is able to auto-classify information based on context, content and commonality of usage.

     To facilitate and monitor data from the establishment of pilot projects, to collect accurate test data to be used as the basis for emissions reduction metrics. The aim is to facilitate the certification of emissions reduction methodologies, resulting in carbon credits/offsets able to be traded in global carbon markets.

Global Carbon Data online is to share and communicate structured data (carbon metrics, analytics and algorithms) as well as research, reports, and analysis of project activities that effect a net reduction in greenhouse gas emissions. High quality research and reporting provided by international organisations engaged in addressing climate change can be published through the site.

Being able to view the management of the project resources and scheduling directly from a task perspective is useful, particularly if the production of data is also linked directly to delivery of the capabilities that facilitate the business scenarios.

Data is becoming increasingly important, in view of advances in cloud security, big data analytics and the proliferation of information gathered from IIOT devices.

To work with a development team that is located in different locations, it is very useful to be able to share the entire architecture specification from a single model. This not only facilitates the focus on delivery, it provides for ready peer review and the cross pollination of knowledge and technologies.

Energy and Carbon Emissions


Figure 1: Energy and Carbon Emissions Monitoring Model

This approach reflects through to the delivery of the information to the platform subscribers.  The focus on data delivery promotes information services delivery not only to the system builders, but the system users as well. Collaboration is a simple matter of providing project data, expert knowledge and information in context, and making it readily accessible.

Cloud technology is available to facilitate knowledge sharing across organisations, national and regional boundaries. Connecting people, projects and research ensures the rapid transfer of data over mobile, public and private networks.


Curated Access to Information

The program aims to facilitate organisations seeking information about climate change by providing ready access to information about greenhouse gas emissions reduction from credible sources.

Subscribers can access published reports, research and data by project category or research topic, using simple search terms. The knowledge base is to provide highly focused, context-specific information via a specially developed climate change taxonomic search engine.

Taxonomic search can return research, reports, data and project analysis, ranked by relevance, regardless of whether the terminology used is precipitation or rainfall, greenhouse gas emissions reduction or carbon abatement, agriculture for hot dry climates or drought tolerant crops. By making the information context sensitive, the online search results are far more focused than standard text search mechanisms.
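A minimal sketch of how such synonym-aware matching could work, with an invented micro-taxonomy and document set; a real taxonomic search engine would of course use a far richer concept hierarchy and ranking function.

```python
# Hedged sketch of synonym-aware ("taxonomic") search: query terms are
# normalised to canonical concepts before matching, so "rainfall" and
# "precipitation" hit the same documents. Taxonomy and documents are
# invented for illustration.
TAXONOMY = {
    "rainfall": "precipitation",
    "precipitation": "precipitation",
    "carbon abatement": "emissions reduction",
    "greenhouse gas emissions reduction": "emissions reduction",
    "drought tolerant crops": "arid agriculture",
    "agriculture for hot dry climate": "arid agriculture",
}

DOCUMENTS = {
    "rainfall-trends-report": {"precipitation", "emissions reduction"},
    "crop-resilience-study": {"arid agriculture"},
}

def taxonomic_search(query_terms):
    concepts = {TAXONOMY.get(t, t) for t in query_terms}
    # rank documents by the number of matching canonical concepts
    hits = {doc: len(tags & concepts) for doc, tags in DOCUMENTS.items()}
    return sorted((d for d, n in hits.items() if n), key=lambda d: -hits[d])

results = taxonomic_search(["rainfall", "carbon abatement"])
```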

Consumers of information, such as government agencies and corporations seeking to reduce emissions can use a guided process to look up relevant reports, data, research, and current thinking for strategic responses to climate change.

The Global Carbon Data program is to publish a knowledge base of climate change and carbon emissions research and data held by leading organisations. Governments and companies wishing to access the data can do so on a fee for service basis.

The key to enabling a global carbon emissions knowledge base of structured and unstructured data sharing, essential to monitor emissions across national borders and regional boundaries, is of course a common logical data model. This is also the prime enabler for a global stable price on carbon.

Data for Carbon Markets

The program is to facilitate collaboration across organisations to establish relevant, repeatable emissions reduction methodologies for specific emissions reduction project types. Examples include:

     The collection of data from renewable energy micro grid electricity generation, to which emissions factor algorithms can be applied to produce accurate estimates of greenhouse gas reductions from replacing existing fossil fuel types with renewable energy.

     The collection of data from land management emissions abatement activities such as prevention of deforestation by forest edge cultivation, reafforestation of rainforest, restoration of mangroves in sub-tropical coastline, and soil sequestration of carbon from agriculture innovation.

The data is to be collected, analysed and made available online as required to authorised and authenticated parties using secure networks. Data collected from project activities can be certified by carbon offset standards bodies to earn carbon credits.

By collecting accurate data from a wide range of project activities, the intention is to help facilitate a stable global carbon price, essentially by improving the level of accuracy of emissions reduction data to the burgeoning global carbon markets.

Value added services for the program include the application of accurate, appropriate greenhouse gas emissions reduction algorithms to project activities, and online access to analysis, infographics, and time-series and location-based data analysis.



Monday, 30 September 2013 06:27

Global Energy Exchange - Data Without Borders




For harmonisation of data, models are essential, enabling more automated, change-ready, flexible and future-proof technology. A model led implementation of energy transmission data is an effective way to ensure a standard approach to energy monitoring and marketing, and Sparx Systems Enterprise Architect hosts the Utilities Common Information Model.

Generating messages about electricity supply and energy markets, using a combination of code and data generation from an enterprise model linked to the Utilities CIM, an OMG DDS implementation, and standard geospatial models, is an efficient way to deliver large volumes of energy data about supply and markets across networks, with cloud technology.
Interoperability is, after all, the holy grail of Smart Grid.
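A toy illustration of model-to-message generation: a dataclass stands in for a class generated from the enterprise model, and serialises directly to a JSON message. The class and field names are illustrative, not actual Utilities CIM elements.

```python
import json
from dataclasses import dataclass, asdict

# Hedged sketch of model-to-message generation: this dataclass stands
# in for a class generated from a UML model. The names are illustrative
# inventions, not taken from the actual Utilities CIM.
@dataclass
class TransmissionReading:
    mrid: str            # master resource identifier, CIM-style key
    timestamp: str       # ISO 8601 reading time
    megawatts: float     # measured transmission power
    generator_type: str  # e.g. "wind", "coal", "natural_gas"

def to_message(reading: TransmissionReading) -> str:
    """Serialise a model instance into a JSON message for the wire."""
    return json.dumps(asdict(reading))

msg = to_message(TransmissionReading("gen-042", "2018-09-11T11:51:00Z",
                                     310.5, "wind"))
```

In an MDA pipeline, the dataclass itself would be generated from the UML model rather than hand-written, so model changes flow through to the message format automatically.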

The deployment of real-time Smart Grid operations can provide information flow about energy supply and demand amongst end-users, electricity generators, and the electricity grid. Most countries are upgrading their transmission and distribution networks to be ‘Smart Grids’, with data from SCADA, OMS and EMS systems, and increasingly from the next generation Wide Area Measurement Systems (WAMS). This represents a unique opportunity to provide real-time geospatially connected data for a wide range of metrics and calculations from power generation, transmission and distribution, including energy and carbon trading, associated greenhouse gas emissions, and energy efficiency savings.

New technology initiatives can be very costly, with long lead times, and significant lag times for implementing change. Smart Grid costs can be significantly reduced, and flexibility can be significantly increased by sharing technology deployments.

As the energy industry landscape is transformed, there is an urgent requirement to standardise the format of data exchanged for electricity and transport consumption and markets. There is no reason that there cannot be global automation of energy data exchange.
The Utilities CIM is an abstract UML model that provides coverage for data elements in the Utilities industry, objective, informational, and enumerative. It can be extended to cover Wide Area Management, and synchrophasor metrics.

Today's ICT technology is increasingly automated from UML models. There is, however, a gap between models of physical elements, such as the CIM energy management and electricity supply domain, and common ICT technology, such as messaging and analytics. This gap means that significant cost and implementation complexity must be incurred for further Smart Grid automation, with web workflows, data integration, event processing, data-centric messaging and so on.

To decrease risk, there has to be an increase in the accuracy of human, machine, and financial resource estimates, for an ICT enabled Smart Grid. A standard interoperability metamodel can provide a global context for information delivery. Exchange of energy management and market data is key to managing energy supply efficiencies across network operator regional boundaries.

Energy data exchange can be effected by current, best practice ICT technology, built from a platform independent metamodel, to bridge the gap between energy metrics, real-time decisioning, and energy market data flows.

A common model can provide the bedrock for the algorithms and definitions required to synchronise real-time metrics. Data models (such as the Utilities CIM), can provide the reference metadata for interface mappings in a real time geospatial context.

Model-led deployment of technology to provide energy data and information technology workflows, can deliver supply and demand efficiencies for both energy and carbon trading, across time zones and national boundaries, by reducing the costs of integration, because of harmonised data elements.

Synchronisation of data definitions, through adherence to a specific model can optimise technology services by simplifying the number and scope of systems interfaces.
For each network operator, interoperability of data can be facilitated by translation into common formats from web interfaces and workflows for automated mapping.
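Such automated mapping can be sketched as a per-operator field translation onto shared, CIM-style element names; the operator names and mappings below are invented for illustration.

```python
# Hedged sketch of per-operator translation into a common format: each
# network operator's field names are mapped onto shared (CIM-style)
# element names. The operators and mappings are invented.
OPERATOR_MAPPINGS = {
    "operator_a": {"MW_OUT": "activePower", "TS": "timestamp"},
    "operator_b": {"power_mw": "activePower", "time": "timestamp"},
}

def to_common(operator: str, record: dict) -> dict:
    """Translate one operator-specific record into the common format,
    dropping fields with no mapping."""
    mapping = OPERATOR_MAPPINGS[operator]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

# two differently-shaped operator records normalise to the same form:
a = to_common("operator_a", {"MW_OUT": 120.0, "TS": "12:00"})
b = to_common("operator_b", {"power_mw": 120.0, "time": "12:00"})
```

With the mappings held as data rather than code, adding a new operator is a configuration change, which is the point of the model-led approach.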

Geospatial information, and cross referencing of data and standards, plays a fundamental part in addressing semantic information flows. Location can provide a key to referencing metadata. A location centric repository can provide a register for network assets, as well as data structures, transformation algorithms and messaging subscriptions.

To connect energy management and market models and geospatial models, in an information delivery context, enables operational and business intelligence from Smart Grid network devices to be mapped geographically.

Energy geospatial overlays provide an opportunity for utilisation of CEN/ISO/OGC standards, and emerging geospatial practice, further enabling data exchange among relevant energy stakeholder groups, able to share network and market intelligence without delay. Correctly implemented, operational intelligence, geographically mapped, can become widely accessible, simplifying and clarifying the processes of energy and carbon trading across borders, for current players, as well as the increasing numbers of renewable energy generators.

In addition to well publicised areas such as energy efficiency, grid health, and demand management, one aspect of the energy industry transformation that cannot be overlooked, is the potential to collect accurate data for monitoring the world's CO2 emissions from electricity consumption. Improvements can be made in the quality of the energy data to which carbon emission monitoring algorithms are applied.

Energy supply and demand efficiency must accommodate distributed, renewable energy generation - not only from new renewable energy generation plants, but also industry and households. This means that real-time supply and demand data access will be a critical factor to enable energy efficiency by transmission and distribution network operators. To meet the threat of a world energy crisis, operational processes have to be fully automated, based on real-time energy information made accessible across national boundaries to meet the information requirements of all stakeholder groups.



Regional, national, and international energy markets are challenged by the inclusion of large scale renewable energy sources into the grid. Accelerating consumer supply and demand also stress the ability of the current infrastructure to meet future energy needs. Accuracy can be improved by utilising Smart Grid operational data collected in real-time, as these new challenges have a temporal urgency, in view of UN IPCC climate change reports.

The best way to proceed is to start with standards that are already in place, then extend them to provide a common terminology, not only for energy efficiency, but also energy and carbon trading markets. This represents a real paradigm shift in current thinking and practice.
Smart Grid interoperability cannot be defined simply by a data model and web services. It has to be demonstrated in the context of a cross-border energy exchange of high speed, real-time metrics.
The breadth of application of synchrophasor technology can be not only for network stability monitoring and demand management, but also for post event analysis. Smart computing devices attached to ICT networks can provide pre-processing of energy transmission data not only to energy management processing systems, but also to energy market technology systems.

Fossil fuel and renewable energy source metrics are important variables in carbon trading, as well as pricing in energy markets. Geospatial analysis of energy utilisation, pricing, and CO2e emissions reduction insight can be made accessible from post event analysis of real-time data.

This information is of interest, not only to energy industry stakeholders and governments, but also to industry and households, as everybody has to participate in energy efficiency, supply and demand, and carbon emissions reduction, to achieve climate change mitigation.
To integrate energy information, with geospatial infrastructure, means being able to integrate geographically diverse information from both electricity and ICT networks, efficiently and successfully. This is only possible if the industry has a common parlance for automated data distribution, and workflow synchronised, and authorised collaboration and co-operation for sharing energy intelligence across organisation and national boundaries.

Smart Grid and ICT operations platforms can provide cost-effective energy efficiency, and energy pricing data, in a geospatial context, as well as carbon emissions reduction metrics. In fact this is essential to meeting the new situation of wide area energy transmission, consumer energy generation, and CO2 reduction targets.

Time is running out for addressing global energy financial and environmental costs. To proceed, it is critical that there is no tower of Babel. Now is the time to decide on common standards and terminologies. And even more importantly, now is the time to recognise that an efficient and effective technology for an automated energy data exchange has to be an international collaboration.

The core ICT technology required to mobilise a cost-effective Smart Grid is a common, semantics-based energy data integration of complex event and web content delivery technology, with data-centric, high-speed, distributed real-time information access. Agreement on semantics cannot be the province of standards bodies only; it is too complex for theoretical approaches to produce a completely useful set of semantics. Semantics have to be developed by application of best practice technology in a pilot phase.
Standards-based Smart Grid/ICT can be agreed upon by Energy and ICT stakeholders to develop a common energy model exchange technology approach. This is all the more pressing in view of the EU’s “An Energy Policy for Europe”, the North Sea SuperGrid, and the fact that industrial energy utilisation currently accounts for the majority of all global greenhouse gas emissions.

There are any number of approaches to providing ICT solutions to integrate Smart Grid information with geospatial and organisation data for operational, business and market purposes. If these solutions are not synchronised, there can be no effective information exchange across regional boundaries.

An efficient and cost effective solution has to discover, test and automate a common exchange model as the basis for generation of Smart Grid events triggering post event analysis workflows and transactions for an interconnected energy market.

Historically, excessive ICT costs have been incurred by information gathering from semantically incompatible data.

The characteristics of technology effecting a common exchange of data based on common models have to be:
1. Able to connect different technologies across Smart Grid and ICT in real time
2. Able to be easily deployed on current technology integration platforms, networks and infrastructure clouds
3. Able to use and translate to and from common vocabularies and protocols
4. Able to access and utilise standard geospatial data overlays and infrastructure



According to the European Union’s “An energy policy for Europe" legislation [1], energy accounts for 80% of all greenhouse gas emissions in the EU, one of the acknowledged factors contributing to the development of the current energy policy.

The amendment to regulation (EC) No 1228/2003, on conditions for access to the network for cross-border exchanges in electricity, is changing the nature of Energy Transmission. Part of European energy policy is the role of ICT (Information and Communication Technologies) in facilitating the transition to an energy-efficient, low-carbon economy.

The US Department of Energy’s website endorsement of the “National Transmission Grid Study”, part of which is “Ensuring the Timely Introduction of Advanced Technologies” [2], promotes the modernization of America's electricity infrastructure, one of the Department's top priorities.

The NIST (National Institute of Standards and Technology) website's US Recovery Act information targets funding in the area of energy, environment and climate change. Named sub-topics include research on measurement technologies to accelerate the deployment of Smart Grid to the US electric power system, and research to develop advanced measurement capabilities to monitor greenhouse gas emissions [3]. Innovations in energy storage technologies [4], the European North Sea renewable SuperGrid [5], and the European Commission energy target of ‘20% renewable energy by 2020’ [6] have dramatically increased the significance to transmission network operators of Smart Grid and associated ICT technologies, to include renewable energy generation on a large scale.

To accommodate these developments, it is essential to deploy Smart Grid operational event data, geospatially connected by common semantics, with standards-based and interoperable ICT technologies.

Standards, Protocols and Terminologies

The work of the CEN/TC (European Committee for Standardization: Technical Committees) number 287 [7] has ensured that there is interoperability between the ISO [8], OGC [9] and INSPIRE [10] geospatial data and frameworks. The ICT industry not-for-profit consortium, the Object Management Group, has developed an approach known as Model Driven Architecture (MDA) [11], supported by the Unified Modeling Language (UML) specification [12], to help automate and integrate information across diverse formats, applications and technology platforms.

The Telecommunications Shared Information and Data model (SID) [13] is an industry model that has been constructed with deployment in mind. The Utilities CIM, incorporating the IEC standards 61970 and 61968 [14] is a UML 2.1 Platform Independent Model to describe the current state of Energy Generation, Transmission and Distribution - “Unlike a protocol standard, the common information model (CIM) is an abstract model that may be used as a reference, a category scheme of labels (data names) and applied meanings of data definitions, database design, and a definition of the structure and vocabulary of message schemas. The Utilities CIM also includes a set of services for exchanging data called the Generic Interface Definition (GID).” The GID specifies how to exchange data, while the RDF CIM is an XML version of the model, useful for system interaction. Both are designed to facilitate information exchange between Smart Grid and ICT technologies.
The OMG DAF (Data Access Facility) Specification[15], designed to address Utility Management Systems, mobilises the RDF version of the CIM.

The OMG DDS (Data Distribution Service) and DDSI (Data Distribution Service Interoperability) Specifications [16] provide for a distributed global data space for high speed messaging and semantic interoperability over a wired network protocol. A geospatial context for Energy Smart Grid data is essential to take advantage of technology advances such as WAMS: “In Europe, the SmartGrids initiative is developing a roadmap to deploy the electricity networks of the future, and WAMS is one of the key technologies considered. In the U.S., the Electric Power Research Institute (EPRI) runs an initiative aiming to provide a technical foundation to enable massive deployment of such concepts (Intelligrid)” [17]
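The DDS “global data space” idea can be illustrated with a toy in-memory topic/key store; a real DDS implementation adds QoS policies, discovery and the DDSI wire protocol, none of which are modelled here, and the topic name below is an invented example.

```python
from collections import defaultdict

# Hedged sketch of the DDS "global data space" idea: publishers write
# keyed samples to a topic, and subscribers to that topic receive them.
# This in-memory toy only illustrates the topic/key data model.
class DataSpace:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> callbacks
        self.latest = {}                       # (topic, key) -> sample

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, key, sample):
        self.latest[(topic, key)] = sample     # keyed instance state
        for cb in self.subscribers[topic]:
            cb(key, sample)

space = DataSpace()
received = []
space.subscribe("SynchrophasorReadings",
                lambda key, sample: received.append((key, sample)))
space.publish("SynchrophasorReadings", "bus-7", {"frequency_hz": 49.98})
```

The keyed-instance state is what lets late-joining readers see the last known value per grid element, which matters for wide-area measurement scenarios.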

The European Commission geospatial standard INSPIRE [18] provides not only geospatial data and infrastructure; a community geoportal is also currently being developed to provide real-time geospatial context.

What is required is a simple standard way to deploy the energy models such as the Utilities CIM in a geographically distributed Long Term Evolution ICT network context, with automatic semantic translation, on a technology platform capable of high speed complex event processing, to meet the current legislative, operational, business and market challenges for Trans European Energy Networks. This interoperability standard has to apply to the performance of the integration technology, as well as the common data service domains, keys and topics for energy management and market data.



New ICT paradigms have emerged in recent years, including “cloud” infrastructure grids, automated semantic data, web federated and low latency real-time messaging services, smarter wireless devices, next generation IP networking, federated web content delivery and complex event processing.

The context over the next five years is rapid development and growth in Energy Smart Grid technology, renewable power sources, energy trading, and the linking of real-time grid data into the carbon trading market. These radical transformations will be accelerated by consumer participation in energy supply and demand, and new legislative requirements for energy efficiency, carbon and energy markets.

What is required to meet this growth, is a semantic core technology that supports the current and emerging standards of high speed event technology automation, able to connect with a huge diversity of new and existing Smart Grid and ICT technology, and very importantly, supports the existing and emerging ICT standards relevant to Energy Generation, Transmission and Distribution. At this stage, there is a type of modeling technology that supports this architecture, and that is UML integrated Model Driven Generation (MDG) Platform Independent Modeling, supporting standards in architecture, business process automation, and systems engineering, event processing and data distribution. However there are few players in this space, and even fewer supporting the particular requirements for an Energy Model Exchange technology.

UML is the de facto standard used by most technology suppliers to deploy business workflow, web information access, real-time complex event handling, and data and message integration on a distributed technology grid, governed by current and emerging Energy and ICT standards and models.



Energy exchange model technology requires complete UML/MDG automation in a high-speed event context. Data exchange has to encompass and automate not only energy data elements, but also the new energy efficiency, energy trading and carbon trading semantics that are only now being identified. The metamodel has to be a Smart Grid/ICT deployable exchange model that is more than just a canonical model of energy elements such as, for example, the Utilities CIM provides.

The technology has to provide for ICT data distribution and event handling services metadata, as well as additional semantics for energy trading, efficiency, and CO2 emissions. This metamodel has to be able to logically connect all elements from Smart Grid and ICT deployment in a real-time network context.

This marks the real challenge for interoperability of existing utilities ICT systems, e.g. load management, transmission and distribution, metering and billing applications, with new energy intelligence, including demand management, energy efficiency and trading markets.
The timeframe to meet EU energy policy and legislative directives, covering renewable and sustainable energy, energy efficiency, security of supply, and technology and innovation, is quite short.



Efficient, geospatially aware energy management requires model technology that automates high-speed real-time event handling in a multi-party semantic communication framework, in the context of distributed, allocate-to-order real-time infrastructure performance. An energy model exchange technology has to go to the next stage of automation evolution: it has to presume a multi-way communications paradigm shift, with all parties acting both as energy suppliers and as energy consumers, to facilitate energy and carbon markets.

The energy Smart Grid also has to meet many of the ICT challenges of IPv6 Long Term Evolution networks, requiring advanced flexibility and growth capability: it must connect and upgrade smart telemetry device interactions, and add new functionality without redevelopment.
What is required to meet these challenges is a distributed, model-led, semantic integration capability based on a deployable, common, geospatially aware exchange model that connects the elements of the CIM, databases, web services and messages. These characteristics have to be put to work in the energy Smart Grid context of complex operational event processing and business integration services, deployed on scalable network and server infrastructure. In addition, information has to be accessed and exchanged by different stakeholder groups, organisations and businesses.



Interface, communications and data standards and infrastructure have already been developed by various global standards organisations, so an energy model exchange technology can readily be achieved with a collaborative, concerted effort by Energy and ICT stakeholder organizations.

In terms of finding a modeling technology capable of meeting the trans-European energy challenge, Sparx Systems technology is a leader in support for open technology standards and semantics, particularly those that would enable a high-speed Energy Model Exchange, with existing support for the Utilities CIM as well as OMG MDA and DDS.



Standardisation of geospatial information and infrastructure is a significant advance for the cause of interoperability in Energy Smart Grid/ICT. It is important for energy artefacts to be included as standard overlays within the INSPIRE, ISO and OGC initiatives. The next step is to ensure that geospatial data is integrated via the common semantic data exchange outlined above. Geospatial integration raises the subject of distributed data processing, able to provide global geospatial context with local maps and information.



An energy model exchange technology, developed collaboratively, amongst stakeholders, is a cost effective way to deploy an evolving Energy Smart Grid.

Most importantly, an extension to the Utilities CIM could provide standard interfaces to real-time energy data, providing cross-domain operational and business metrics that do not currently exist. This would allow carbon emissions monitoring algorithms, which estimate the greenhouse gas emissions avoided by electricity generation and efficiency projects, to be shared across the industry for emissions trading schemes, consumer pricing incentives, and so on.

Last, but not least, data inaccuracy puts downward pressure on price in carbon markets. Current estimation techniques and algorithms may be improved by gathering sample data from electricity network pilots. Data can be aggregated to provide emissions from energy generation, by generator type, on a near real-time basis. Energy market data can also benefit, because emissions trading data produced by energy exchange technology can be extended to provide automated transaction updates across geographic and national boundaries in real time.

Once a real-time trading market on greenhouse gas futures from electricity transmission has been established, it is a fairly short path to collecting data for transport fuels, and then industrial processes, given the work done by the Intergovernmental Panel on Climate Change to develop accurate emissions calculations algorithms.

A pilot project for electricity transmission and market data from renewable energy sources is an obvious starting point.




Thursday, 26 September 2013 00:24

Energy & Greenhouse Gas Reporting in the Cloud

Trac-Car has built a reporting solution in Sparx Enterprise Architect for monitoring greenhouse gas emissions under Australia's National Greenhouse Emissions Reporting scheme.

The solution has a UML model at its core, which has been used to generate a database that connects to data from electricity, transport and other industrial activities as it becomes available from meters, databases and spreadsheets.

With workflow and technology connectors, a global cloud solution can facilitate low-cost, location-based energy and greenhouse gas monitoring, while organisations have immediate access to their carbon footprint.

The model is freely available for download.

A worked demo for Eureka Works, Australia Corp, a manufacturing facility with a transport fleet, is available for viewing.

Business situation


As companies seek to comply with greenhouse gas emissions reporting regulations, there is a clear challenge: avoiding the yearly pain of intensive manual information gathering from multiple sources, including spreadsheets, databases, meters and output from statistical reports.

A UML model can be used to semi-automate cloud hosted software services. It is possible to set up an automated reporting system once, so that the information required for CO2e emissions by energy source is at hand throughout the year.

Most of the software services depend on having a robust, tried and tested information model. The time saved by automated generation from the UML model is huge, making the development of the reporting capability much simpler and more cost effective.

View an HTML version of the UML Energy Reporting Domain Model

Energy GHG Reporting Domain Model


Figure 1: Energy Reporting Domain Model - top level

Technical situation

The cost of manual and semi-automated aggregation of greenhouse gas emissions data is high, because it requires a lot of human effort to process.

This type of activity is error prone, and one reason for the lower than expected price of CO2e emissions is that markets dislike inaccuracy. (CO2e stands for CO2 equivalent: a calculation that standardizes the warming effect of the full suite of greenhouse gases, such as methane and nitrous oxide, as an equivalent quantity of CO2. CO2e value is generally measured as a price per metric tonne.)
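
The CO2e arithmetic can be sketched in a few lines. This is a minimal illustration using the IPCC AR4 100-year global warming potential factors; the example quantities are invented, and a given reporting scheme may mandate different factor vintages.

```python
# CO2e conversion: each gas's mass is weighted by its global warming
# potential (GWP) relative to CO2, then summed into a single figure.
# GWP values here are the IPCC AR4 100-year factors.
GWP_100YR = {"co2": 1.0, "ch4": 25.0, "n2o": 298.0}

def co2e_tonnes(emissions_by_gas):
    """Convert a dict of {gas: tonnes emitted} into tonnes of CO2e."""
    return sum(GWP_100YR[gas] * tonnes for gas, tonnes in emissions_by_gas.items())

# Illustrative: 100 t CO2, 2 t methane, 0.1 t nitrous oxide
total = co2e_tonnes({"co2": 100.0, "ch4": 2.0, "n2o": 0.1})
print(round(total, 1))  # 100 + 2*25 + 0.1*298 = 179.8 t CO2e
```

Priced per metric tonne, that single CO2e figure is what enters the market calculation.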


Setting up and configuring the data interfaces is always going to be the biggest problem to solve. A UML information model can ensure the desired results through a back-tracking problem-solving methodology.

The Energy Reporting Model was developed from the specifications published for the Australian government's National Greenhouse Emissions Reporting and Energy Efficiency Opportunities reporting requirements.

The following steps describe building a web reporting facility directly from the model, using the Sparx EA code generation facility for databases.

1. Build a UML model of the required reports.

2. Generate database definition language (DDL) scripts directly from the model.

3. Build the target database using the script and deploy it to a cloud database instance (I used Postgres database services hosted on the Amazon cloud).

4. Analyse the current data collections and define the interfaces required to populate the reporting database from input data sources (the example uses spreadsheets and databases, with Postgres SQL to import the data).

5. Develop the SQL to generate the data for the required reporting staging tables from the input sources.

6. I then used two additional cloud hosted services:

a. RunMyProcess hosted workflow/automation to build data collection user interfaces for manual input of some of the energy data, written to the Amazon cloud database.

b. Zoho Reports provided the detailed reports, accessing the Postgres data from Amazon and updated using a Zoho scheduling capability. (Zoho has an excellent reporting capability, in my view the best of their hosted offerings, and I used the cloud hosted offering to build the final tables, graphs and drill-downs.)
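
The interface work in steps 4 and 5 can be sketched in outline. This is not the project code: the staging table, its columns and the sample spreadsheet are hypothetical, and a real deployment would use the Postgres import tooling and parameterised SQL rather than generated literal INSERT statements.

```python
import csv
import io

# Hypothetical staging table, matching a generated DDL script:
#   CREATE TABLE energy_staging (site TEXT, period TEXT, kwh NUMERIC);
# Map spreadsheet rows (exported as CSV) onto INSERT statements for it.
SPREADSHEET = """site,month,electricity_kwh
Eureka Works,2013-01,120000
Eureka Works,2013-02,98000
"""

def staging_inserts(csv_text):
    """Translate each spreadsheet row into an INSERT for the staging table."""
    rows = csv.DictReader(io.StringIO(csv_text))
    stmts = []
    for r in rows:
        stmts.append(
            "INSERT INTO energy_staging (site, period, kwh) "
            f"VALUES ('{r['site']}', '{r['month']}', {float(r['electricity_kwh'])});"
        )
    return stmts

for stmt in staging_inserts(SPREADSHEET):
    print(stmt)
```

The point of the exercise is that once the staging schema is generated from the model, each new input source only needs a small mapping like this one.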



Experts have found that simply monitoring energy usage significantly reduces carbon footprint at very low cost compared with other methods, and consequently lowers the price organisations have to pay for emitting greenhouse gases.

It makes sense to automate processes that must be repeated annually to comply with legislation and reporting regulations, in order to lower the cost of reporting. Payback for automation can be immediate, as the ICT costs are quite modest.

In the next iteration of the GHG Reporting, I am building real-time access to metered data collections from the RunMyProcess cloud, using Sparx EA generated code (e.g. Java, PHP or C++ as required) as a basis for aggregating metered data into time-of-use daily, weekly, monthly and annual summaries that update the database on a predefined schedule for security-controlled web reporting. I am also designing a simple mobile app to provide graphical, location-based energy usage alerts.
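
The time-of-use roll-up at the heart of that aggregation is straightforward. The sketch below uses invented half-hourly readings and plain Python; the generated Java/PHP/C++ code would follow the same shape.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical half-hourly meter readings: (ISO timestamp, kWh in interval).
readings = [
    ("2013-09-01T00:00", 1.2),
    ("2013-09-01T00:30", 1.1),
    ("2013-09-02T00:00", 1.4),
]

def daily_totals(readings):
    """Roll interval meter readings up into daily kWh summaries."""
    totals = defaultdict(float)
    for ts, kwh in readings:
        day = datetime.fromisoformat(ts).date().isoformat()
        totals[day] += kwh
    return dict(totals)

print(daily_totals(readings))
```

Weekly, monthly and annual summaries are the same fold with a coarser bucket key, which is why a single generated aggregation class can serve all four schedules.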

Sparx EA 

The greenhouse gas reporting solution used Sparx Enterprise Architect to build the reporting model and to generate the DDL scripts that create the Postgres database. I was able to modify the model many times, then regenerate the databases automatically to reflect changes as I improved the model. This was a huge saving of time and effort, and is one of the strengths of Sparx EA for building solutions.

Other Software and Services 

The Sparx EA scripts generated the Postgres databases with very little additional coding required. This was very useful, as Postgres has a strong cloud presence and is supported by all the other cloud hosted software services I needed to build the reports. I used Amazon as it was easier to set up than the HP offering, which is geared to large installations. The RunMyProcess and Cloudbees cloud hosted software offerings are easy to use, and of course offer JDBC connectors to Postgres.



Model-led development is a very successful method of building ICT reporting systems. Once the model is stable and data is aggregated into a standard RDBMS, the same model can be used to generate the starter code for building specific interfaces from a variety of data sources and formats, in any of the common coding languages supported by Sparx EA. I am looking forward to further automating the integration of energy data interfaces using cloud hosted software services.

A worked example using the methodology outlined in this case study used Microsoft Excel spreadsheets and Postgres databases as input data sources, mapped to a Postgres Energy Reporting database scripted using EA’s code generation capabilities. 


Wednesday, 23 January 2013 05:24

The Value of an Enterprise Information Model


An enterprise architecture reduces the cost of operations through reuse of standard technology, application, data and network components.

However, silos of information in specialist corporate IT teams are not making it through to a central knowledge base, meaning that function and data are repeated in a string of parallel universes that are not congruent, and whose synergies remain largely unexplored.

Enterprise architectures are often seen as not addressing the coalface reality of reuse and accessibility of data and services.

An Enterprise Information Model can address the gap between business and technology owners, practicality and principle, concept and delivery.

Deploying from an Enterprise Model

Metadata links most of the common Enterprise Planning and Business Process Automation domains:

Customers are linked to products, enterprise asset management is referenced geospatially, and of course organisation data provides the basis for linking web interfaces to identity and access management systems. 

Workflows start or are started by scheduled and triggered events, and web content/assets can be linked and reused by multiple workflows.

An Enterprise Model can be deployed from a set of domain classes.  Relevant metadata databases can be directly generated, and runtime objects can also be generated from the same Platform Independent Model in a specific technology, for example .Net, Java, XML.  Code stubs can be deployed by common development IDEs to complete the build of runtime systems.
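
The deployment path from a Platform Independent Model can be illustrated with a toy example. Sparx EA's own MDG transforms are far richer; the domain classes, type mapping and generated DDL below are invented purely to show the shape of the transformation.

```python
# A toy Platform Independent Model: domain classes described as plain
# data, transformed into one concrete target (SQL DDL). The same source
# model could equally drive Java class or XML schema generation.
MODEL = {
    "Customer": {"id": "int", "name": "string"},
    "Product":  {"id": "int", "price": "decimal"},
}

# Platform-specific type mapping for the SQL target.
SQL_TYPES = {"int": "INTEGER", "string": "VARCHAR(255)", "decimal": "NUMERIC(12,2)"}

def to_ddl(model):
    """Generate one CREATE TABLE statement per domain class."""
    stmts = []
    for cls, attrs in model.items():
        cols = ", ".join(f"{attr} {SQL_TYPES[t]}" for attr, t in attrs.items())
        stmts.append(f"CREATE TABLE {cls.lower()} ({cols});")
    return stmts

print("\n".join(to_ddl(MODEL)))
```

Swapping `SQL_TYPES` for a Java or XML type map turns the same model into code stubs for those targets, which is the saving the paragraph above describes.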

This is an enormous saving in architecture, design and development processes, leaving people free to invest in newer technologies and innovation, rather than having to iteratively process, develop and deploy. 

From concept to code is getting ever closer.  With the advent of quantum computing, the recognition of algorithmic patterns that have quantum mechanical analogs is key to moving ICT to the next level.


Models and Modelling

Models are the ideal way to commence the process of pattern recognition for ICT deployment.

Although every organization has a unique set of processes, and a unique language for describing them, in essence there is an enormous degree of commonality. While the subject areas involved in a business are published as conceptual models, cohesion outside of the project is rarely considered.

An opportunity is missed, as subject area models can be readily mapped to standard business domains. 

Accessibility of information complies with all the ‘nice to have’  enterprise principles covering  information architecture, SOA and reuse of existing capability.

An Enterprise Information Model serves as a natural method of documentation for business domain functionality at an enterprise level.

While design principles are useful, they are no substitute for a coherent model.

An Enterprise Information Model can be used to support design and development.

For business users, a model supports the basic concepts and provides a common language with which to communicate with designers and developers. At the operational level, a model can be built in such a way as to support decision making, by communicating performance (Business Activity Monitoring) to business owners.

An EIM has to be clearly stated and understandable to the system owners, builders and users.


Enterprise Information Management

A centrally managed model  can be version controlled.  

Business Domain UML Models can be deployed as a metadata database to enable cohesion across business areas, by deployment as an integration data hub as well as a publication mechanism for data services. Well-constructed, a Business Domain Model can generate databases, and engineer code that provides a starting point for implementing data, XML and web services, implementation classes and procedures for deployment (allied with the principles of MDA).

Business Process Models (in BPMN or UML) and User Interaction Models can be mapped to the business domains, providing not only a shared knowledge base, but also a ready conversation between business analysts and developers.

Enterprise data can not only be explored and managed to avoid duplication, an Enterprise Metadata Model can facilitate integration.  Data can be shared (except where exempted by security policy) through metadata management.  Mapping physical data to a metadata database provides a publication mechanism and the basis for messaging, both data centric and middleware style, using RDBMS SQL, SPARQL for the semantic web or XQuery for XML data collections.

This promotes both federated and consolidated data integration, with a clear separation from application services and processes, facilitating optimum database infrastructure performance. It also facilitates the development of service taxonomies for ESB, SOA and other integration platforms.

Enterprise Architecture Process

Architecture governance is as important as program governance. An enterprise architecture process is as essential as a project management process, providing the structure, timeline and delivery framework to meet release target dates.

Using an Enterprise Information Model as the knowledge repository for a team-based approach to ICT project delivery allows a clear delineation of responsibility, in the context of technology delivery prerequisites and dependencies.

This means an evidence based approach for project estimation based on architecture, design and development project deliverables, traceable to business requirements and business priorities.

It also means that business and technology specialists can share a perspective,  collaborate on the implementation of ICT systems, on time and on budget,  with a good appreciation of the whole project, and the part that every role plays in the delivery.

View a worked example of an Enterprise Information Model


Author: Nya A Murray, Chief Architect, Trac-Car.

Date: 23rd January 2010

Website: http://www.trac-car.com



 1  Executive Summary


Corporations and governments around the world are starting to collect information from telemetry devices, in the fields of


1. Water supply

2. Transport

3. Electricity

4. Manufactured goods

5. Agriculture


Valuable information can be produced from telemetry data, about resource utilization and associated carbon emissions in all of these fields.


Climate change is definitely an imperative. Telemetry data is a critical input for accurate global environment information, and for the carbon emissions associated with individual developments.


 1.1 Emissions Trading


Emissions Trading Schemes, as set out in the Kyoto Protocol, rely on market mechanisms of cap and trade to regulate carbon emissions. Different mechanisms are being used in different parts of the world, probably the most developed being the European Union ETS. In addition, the Kyoto Protocol allows for two types of initiatives that developed countries can undertake in developing countries: the Clean Development Mechanism (CDM, projects approved by the UNFCCC) and Joint Implementation (JI) projects. One Emissions Trading Unit (ETU) is equivalent to a reduction of 1 tonne of CO2 emissions.
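
Because one ETU corresponds to one tonne of CO2 reduced, the market arithmetic is direct. The sketch below illustrates it; the carbon price used is purely an assumption for the example.

```python
# 1 Emissions Trading Unit (ETU) = 1 tonne of CO2 reduced, so a project's
# tradeable units equal its verified reduction in tonnes.
def etu_value(tonnes_co2_reduced, price_per_tonne_eur):
    units = tonnes_co2_reduced  # 1:1 mapping from tonnes to ETUs
    return units * price_per_tonne_eur

# Illustrative: a 5,000 t CO2 reduction at an assumed EUR 8/t carbon price
print(etu_value(5000, 8.0))  # 40000.0
```

The simplicity of the unit conversion is exactly why measurement accuracy matters: every tonne of estimation error translates one-for-one into mispriced units.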


There are a few emerging problems with cap and trade schemes, such as:

  • the estimation techniques carry significant uncertainty
  • many emitters and industries are not covered by the schemes
  • carbon leakage: a net increase in carbon emissions because of different pricing and regulation between countries, of which carbon polluters take advantage


There are no guarantees that these schemes will be effective in the actual measurement and  reduction of carbon emissions into earth’s atmosphere, because practically speaking, it is impossible to fully regulate a carbon market within the timescales required to be sure of limiting temperature rises to an acceptable range.


With the lesson of the failure of self-regulation by financial markets still fresh, it may well prove too risky to allow market forces to regulate carbon emission reduction.


A better strategy for carbon emissions governance is to selectively apply a regime of carbon emissions monitoring, particularly in the energy and transport industries, encouraging engagement and participation by all sectors of the corporate community. This may actively promote a popular culture of carbon emissions reduction, by offering carbon emission reduction discounts.


 1.2 Real-time monitoring and carbon discounts


By monitoring events that emit CO2, such as power consumption and transport, at industry and household level, carbon reduction pricing incentives may build the popular support required to be effective in limiting temperature rises.


Carbon taxes, discounts, and cap and trade schemes will all work more effectively if the end users are involved in the monitoring mechanisms. Some of the obvious methods are:

  1. Monitoring energy utilization in real time and rewarding use of renewable energy with discounts.
  2. Road pricing based on distance driven, with discounts for greener cars, particularly electric vehicles (where emissions would be accounted for by energy supply utilization).
  3. Estimating and calculating the carbon emissions used to produce goods traded or sold, and new buildings and industrial developments.
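
The first two incentive methods above can be sketched as simple tariff functions. All rates, discount levels and function names here are invented for illustration, not proposed figures.

```python
# Method 1: discount the renewable portion of metered energy consumption.
def energy_bill(kwh, renewable_share, rate=0.25, green_discount=0.10):
    renewable_kwh = kwh * renewable_share
    fossil_kwh = kwh - renewable_kwh
    return fossil_kwh * rate + renewable_kwh * rate * (1 - green_discount)

# Method 2: distance-based road pricing with greener-vehicle discounts.
DISCOUNTS = {"electric": 0.5, "hybrid": 0.2}

def road_charge(km, vehicle, rate_per_km=0.05):
    return km * rate_per_km * (1 - DISCOUNTS.get(vehicle, 0.0))

print(energy_bill(1000, 0.4))        # 1,000 kWh, 40% from renewables
print(road_charge(200, "electric"))  # 200 km in an electric vehicle
```

Both functions only work if the underlying quantities (renewable share, distance driven) are actually metered, which is the point of real-time monitoring.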


This information could be published, and used as the basis for estimating carbon taxes. 


The success of monitoring global carbon emissions reduction depends on a mechanism for ensuring that knowledge, experience and expertise in carbon emissions monitoring is shared, not only locally, but nationally and internationally. 


 1.3 The Role of Logical Models


To achieve this, it is important that a standard approach is developed to the collection of carbon monitoring data.


A modeled,  knowledge-managed approach to the integration of real-time data on which carbon emissions can be calculated or estimated is key.  Data could be collected from distributed sources, supplied by collaborative stakeholders based in multiple locations.


A common information model for key industries, such as Energy and Transport, is essential to ensuring that data can be aggregated from communities, regions, states, nations, and internationally.


Information analysis of telemetry and other metric data could take advantage of low cost information technology advances in dynamic software, hardware and network infrastructure.


And these carbon emission data collections could be shared on an industry, company and even household basis, via web portals. 


 1.4 Integrate with existing technology


A particularly effective place to start would be power utilization, as this is where consumers could be given a choice of power source. Suppliers could offer discounts for use of renewable energy, creating incentives for moving away from fossil fuel power generation. Discounts and consumption levels could be viewed by customers over web portals.


This would provide economic incentives for encouraging the production of renewable energy grids, such as the  current initiative for a clean energy super grid by Europe's North Sea countries (able to store power, effectively forming a giant renewable power station). 


Telemetry devices are monitoring electricity consumption, transport mileage and water resource utilization, to name a few. The quantity of information to be collected will expand exponentially over the next few years.


A knowledge-managed approach, based around standard industry models, can be implemented on a centralized basis. Improvements in carbon emission methodologies could be communicated across the usual national and corporate boundaries, made available by federated web content management.


 1.5 Carbon Emissions Monitoring Model


As well as agreements such as Kyoto, information technology has to play a vital role to ensure that information is easily accessible and able to be aggregated across diverse industries and infrastructure.


A core logical model that is deployable in a data integration technology, able to map current technology deployment patterns to information analysis of carbon emissions, is a key component of international industry collaboration for publishing and sharing information about carbon emissions.


As knowledge and experience on carbon emissions monitoring increase,  optimal access to aggregated data in near real time will become possible, enabling immediate access to resource utilization and associated carbon emissions information, on demand.


It is essential to employ a common information technology model that maps to current industry models and collects data in real time. Without a core logical model, federating web content from different suppliers, governments and consumers becomes too expensive and too difficult.


It cannot be said often enough that a Carbon Emissions Monitoring Model is a key participation component for effective short-term deployment of systems to monitor, calculate and price carbon emissions.


 1.6 Model-Based Deployment


Time is running out for climate change action. It is absolutely essential, in the short term, to begin providing accurate metrics and statistics on carbon emissions, enabling assessment against targets, which must be lowered over time to meet accelerating risk.


Device technology is advancing rapidly.  Telemetry devices can be remotely configured, and new information captured and processed, as it becomes available and useful. It is very important to make information access easy.


Data can be sent over any network, e.g. electricity supply, WiFi, UMTS, internet, corporate WAN/LAN, and aggregated centrally to provide dashboard-style reporting. These technology capabilities can be met by leading technology suppliers today.


Technology suppliers have to be capable of distributed, change-enabled, monitored, dynamic technology services, based on core logical models that can map to standard industry models. 


Real-time cumulative costs, carbon emissions and resource accounting information could be displayed to registered users through a portal.


Data that is mapped to the Carbon Emissions Monitoring Model could be aggregated across regions, industries and other groupings to provide, for example, daily carbon emissions from power and transport.
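
The regional roll-up described above is a simple grouped aggregation once records are mapped to the model. The regions, dates and figures below are invented for the example.

```python
from collections import defaultdict

# Hypothetical monitoring records already mapped to the model:
# (region, date, sector, tonnes of CO2e)
records = [
    ("Bavaria", "2010-01-03", "power",     410.0),
    ("Bavaria", "2010-01-03", "transport", 120.0),
    ("Wales",   "2010-01-03", "power",     200.0),
]

def daily_emissions_by_region(records):
    """Aggregate CO2e across sectors into (region, day) totals."""
    totals = defaultdict(float)
    for region, day, _sector, tonnes in records:
        totals[(region, day)] += tonnes
    return dict(totals)

print(daily_emissions_by_region(records))
```

Swapping the grouping key for industry or another grouping yields the other aggregations the model is meant to support.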


This addresses 'carbon leakage', a real challenge for carbon emissions trading schemes (an increase in CO2 emissions in some countries offsetting decreases in others).


Accurate power and transport real-time monitoring data can be aggregated, and extrapolated  to provide carbon emissions statistics.


A Carbon Emissions Monitoring Model would be a starting point to ensure that telemetry information, which can provide accurate emissions data, can be linked with other carbon metrics readily and rapidly.

Carbon Emissions Sub Domain 

 1.7 International Information Exchange


It is timely to envisage international exchanges of information for carbon trading, energy pricing, and inputs into climate change models, made securely and readily available to all user types, corporate, government and individual consumers.


A short-sighted approach to the technology applied to emissions trading schemes will be a huge inhibitor to accurate carbon emissions reduction monitoring.  Near enough is not good enough. The risks of inaccurate monitoring associated with inaccurate estimation methodologies are too high, in the context of the uncertainties of temperature rises and climate change that accompanies increasing levels of greenhouse gases in the atmosphere.


Global collaboration must be built into the monitoring mechanisms from inception.


Carbon monitoring initiatives must be designed to utilize a best-of-breed technology approach, with the emphasis on carbon emissions measurement replacing estimation methodologies.


It is clear that the current emissions trading schemes cannot be relied upon to accurately estimate or forecast temperature rises that produce climate change and extreme weather events associated  with rising greenhouse gas emission levels.


 1.8 Utilization of Existing Technology


There are a number of technology options for gathering information in real time from telemetry devices, to be aggregated with estimated carbon emissions to provide a fuller picture of carbon emissions reduction.

Whatever technology is deployed to collect the data, it is the data integration that is critical to the success of providing emissions information.


For data integration to be cost-effective and able to use any technology, the most important factor is to ensure that industry models can be mapped to a deployable logical model for the collection of real-time emissions event data.


Once the sample size is large enough, the real-time data collections could be used to extrapolate carbon emissions in climate change models.



 2  Opportunity to Real-Time Monitor CO2 Emissions


As people realize the importance of monitoring the world's resources, many new initiatives from governments, corporations and the community are emerging, with varying quality of carbon emission estimates, or none at all.


There is a pressing requirement to provide accurate measurements and calculations, which would be facilitated by a systematic approach to information delivery.


Around the world, provision and planning has to be made for basic telemetry metrics, covering personal and industrial water utilization, energy usage by source, transport, agriculture production, and other carbon emissions.


 2.1 Technology Already Exists


A collaborative effort by national and international stakeholders could provide a universal capability to aggregate and present accurate carbon emission statistics in near real time, using existing technology components.


The knowledge of how to build a telemetry integration platform out of existing technology could be shared with industry stakeholders.


Information management is currently a critical problem in large organizations. Information has to be accessed and shared across organization and technology boundaries.


Information sharing is critical for the monitoring and management of environmental change.


Carbon emissions monitoring is something that will probably become mandatory through international treaties and agreements within the next couple of years.


Current broadband speeds provide a unique opportunity to facilitate a methodical approach to carbon emissions reduction monitoring, by real-time aggregation of data.


With proper forward planning and technology models, it is possible to model and organize the analysis and aggregation of telemetry data, from many different sources, to begin to understand the real carbon cost of resource utilization, including electricity, water, transport, agriculture and manufacture, to name the major sources of carbon emissions.


 2.2 Federate Information


A federated technology approach, that is, one using existing technology components from stakeholder organizations integrated by a central federating capability, could provide a relatively low-cost carbon emissions data analysis and statistics capability.
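The central federating capability can be reduced to a small sketch: each stakeholder exposes a feed, and the federator merges them into one time-ordered event stream. The feed functions and record fields here are stand-ins for real stakeholder systems.

```python
def federate(feeds):
    """Merge per-stakeholder feeds into a single time-ordered event list."""
    merged = [event for feed in feeds for event in feed()]
    return sorted(merged, key=lambda e: e["ts"])

# Hypothetical stakeholder feeds; a real deployment would pull from
# remote services rather than in-process functions.
def grid_feed():
    return [{"ts": 2, "src": "grid", "co2_kg": 5.0}]

def transport_feed():
    return [{"ts": 1, "src": "transport", "co2_kg": 3.0}]

events = federate([grid_feed, transport_feed])
```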


Information access to just-in-time data will become essential to facilitate international climate change targets and measures to reduce carbon emissions.


A well-planned catalogue of information access requirements, mapped to a central logical carbon emissions model, will provide the basis for information integration technology for local, regional and national governments. This technology can be collaborative and shared across organization boundaries based on user profiles.


 2.3 Central Logical Model


A central logical model is essential to bypass the common problems of current technology integration deployments, such as large cost overruns, failures due to poor architecture, and systematic unresponsiveness to changing requirements. A central model allows for efficient interfaces between unlike systems, and facilitates technology services across organization boundaries.


Forward resource planning, and appropriate business and technical models, have to be identified and determined in advance for an effective solution to sharing information.


A best-practice approach would ensure that information access services are governed by measurable Service Level Agreements, and that these metrics are subject to governance and tracked to supplier delivery.


 2.4 Service Level Agreements


Examples of business services to which SLAs could apply:


1. Provide telemetry data transmissions over a network every 60 seconds

  2. Provide web portal access to energy usage and associated carbon emissions data
  3. Aggregate emissions reduction data daily by local area, by region and by country
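The first SLA above (telemetry every 60 seconds) can be sketched as a simple conformance check over observed transmission times. The timestamps are invented for the example.

```python
def sla_violations(timestamps, max_gap_s=60):
    """Return the inter-transmission gaps (seconds) that exceed the agreed interval."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return [g for g in gaps if g > max_gap_s]

# Epoch seconds from a hypothetical device: one 90-second gap breaches the SLA.
violations = sla_violations([0, 55, 115, 205, 260])
```

A governance process would aggregate such violation counts per supplier and track them against the contracted service levels.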


The federation and knowledge management capability of telemetry integration could be provided as a collaborative project, to optimize delivery efficiency and cost.


Power and transport monitoring components, for example, could be supplied internally or outsourced, based on service level agreements, and shared across industry sectors.


A model-based approach, if implemented correctly, is flexible. For example, when a new energy supplier is added to an energy grid, telemetry data could be automatically updated to reflect the new supply source, and the associated requirements for increased processing capacity and data storage can be automatically flagged to the network and infrastructure technology services.

This holds only if the model is the basis for the technology design and deployment, as retrofitting models is much less efficient, and much more expensive.
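The supplier-added scenario can be sketched as an event handler against the central model. The model shape and the sizing rule (one processing unit per 1,000 devices) are invented illustrations of automatic capacity flagging, not real figures.

```python
def on_supplier_added(model: dict, supplier: str, device_count: int) -> dict:
    """Register a new supplier in the central model and flag extra capacity."""
    model["suppliers"].append(supplier)
    model["devices"] += device_count
    model["capacity_flags"].append({
        "supplier": supplier,
        # Ceiling division: assumed rule of one processing unit per 1,000 devices.
        "extra_processing_units": -(-device_count // 1000),
    })
    return model

model = {"suppliers": [], "devices": 0, "capacity_flags": []}
model = on_supplier_added(model, "WindCo", 2500)
```

Because the change is applied to the model first, downstream infrastructure services can react to the flag rather than being reconfigured by hand.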



 3  Recommendation with Risk Evaluation


The risks for the climate stability of the planet are huge.  Effective governance of large carbon emitters, and cap and trade schemes for carbon pollution will be compromised by carbon leakage in the short to medium term.


It is clearly never too early to initiate a technology platform to improve the accuracy of carbon emission reduction monitoring.


An incremental distributed technology approach is a cost effective way to deliver information that will become increasingly accurate because of a collaborative knowledge base, eventually able to be shared internationally.


Real-time telemetry information can be aggregated with carbon calculation algorithms to provide more accurate information than existing methodologies.
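A minimal sketch of that aggregation, assuming per-source emission factors in kg CO2 per kWh. Both the readings and the factor values are illustrative, not published conversion factors.

```python
# Assumed per-source carbon factors (kg CO2 per kWh); illustrative only.
FACTORS = {"coal": 0.9, "gas": 0.4, "wind": 0.01}

def aggregate_co2(readings):
    """Sum kWh * factor across telemetry readings, grouped by energy source."""
    totals = {}
    for source, kwh in readings:
        totals[source] = totals.get(source, 0.0) + kwh * FACTORS[source]
    return totals

totals = aggregate_co2([("coal", 100.0), ("gas", 50.0), ("coal", 10.0)])
```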


Involving the whole community in carbon emissions reduction monitoring is a smart move that will help large corporate carbon emitters to self-regulate, and will reduce carbon leakage through the pressure of public opinion.


Collaborative industry efforts to monitor the sources of carbon emissions in real time are required to limit global temperature rises.


A best-practice technology approach will enable knowledge sharing and co-operative technology development.


Reusing existing technology as components of information integration is clearly cost-effective.



 4  Use of Sparx Enterprise Architect for Technology Models


The most important criteria for building core logical models for carbon emissions monitoring are:

  1. Ability to use UML, the most widely used modelling language
  2. Ability to model from concept to code, i.e.

a)      Business vision, concepts and requirements

b)      Business processes, workflow and role definition

c)      Technology service identification

d)      Technology implementation and deployment

  3. Ability to auto-generate

a)      Deployable logical metadata from information architecture

b)      Multiple language source code, e.g. Java, C#

c)      Multiple class transformations, e.g. XML, WSDL.
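The class-transformation capability can be illustrated with a toy generator that emits an XML Schema fragment from one logical model class. Real MDA tooling such as Sparx Enterprise Architect does far more; the class and attribute names here are invented for the example.

```python
def to_xsd(class_name, attributes):
    """Render a logical model class as a simple XSD complexType definition."""
    lines = [f'<xs:complexType name="{class_name}">', "  <xs:sequence>"]
    for attr, xsd_type in attributes:
        lines.append(f'    <xs:element name="{attr}" type="xs:{xsd_type}"/>')
    lines += ["  </xs:sequence>", "</xs:complexType>"]
    return "\n".join(lines)

# Hypothetical logical model class for an emissions reading.
xsd = to_xsd("EmissionEvent", [("siteId", "string"), ("co2Kg", "decimal")])
```

Because every stakeholder generates interfaces from the same model, the resulting schemas stay interoperable across organization boundaries.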


These capabilities are key to providing information across organization boundaries, because they provide a common communications framework which can be versioned and change managed for collaborative projects, enabling synchronous deployment across multiple stakeholders on shared infrastructure.


To cover the scope of highly complex technology modeling, domain models provide a catalogue for technology deployment patterns.


A useful categorization is to distinguish common technology infrastructure from specialist applications:


  1. Technology Application Domains
  2. Technology Platform Domains


Doing so provides a starting point for the reuse of technology patterns, and also provides a coherent theme for technology integration.


Figures 2 and 3 illustrate examples of these domains.


Sparx Enterprise Architect is a technology of choice for the particular circumstances of Model Driven Architecture (http://www.omg.org/mda/) collaboration amongst business and technology stakeholders from multiple organizations. It is readily available, easy to deploy, cost-effective and, compared with other products, easy to use.


The example figures used in this paper are extracts from the Telemetry Services UML Model (http://www.trac-car.com.au/Telemetry Services Model Version 4.0 HTML/index.htm).


Figure 2:  Example Application Domain Model


Figure 3: Example Technology Platform Domain Model



 5  Implementation Overview and Accountabilities


The implementation of a flexible, change-enabled telemetry integration capability is undertaken as a planning process based on a central set of models, themselves subject to change management and governance.


A programme plan could be developed as a result of a collaborative feasibility study, with the organization accepting responsibility for co-ordination of stakeholder effort. This process would be expected to take 6-12 weeks, depending on the scale and complexity of the technology capability involved, and the level of commitment available from stakeholders.


An initial proof-of-concept could be deployed, building selected services from the model to an agreed plan. Subsequent deployment would be incremental, as resources become available.


A business and technology governance core group would be required to ensure adherence to agreed standards and processes.


A good place to start might be a proof-of-concept based on telemetry metrics of power transmission, by energy source and by power supplier.


This would involve definition of Service Level Agreements for the required service performance and other metrics, and the selection and evaluation of suppliers.



Full technology deployment could be achieved with the advice of appropriate business and technology subject matter experts.


Accountabilities are largely predicated on having a strong and viable governance mechanism, administered by a core steering group, facilitated by specialist knowledge, and automated by event processing, data integration and web federation technologies.


The following table outlines the basic steps and accountabilities required to deploy a carbon emissions monitoring integration capability.


Human and financial resources could be distributed across participant organizations.




Step                                                 | Accountabilities                                                                                        | Time Estimate (days)
-----------------------------------------------------|---------------------------------------------------------------------------------------------------------|---------------------
Feasibility Study                                    | Telemetry Integration Model implementers; Organisation stakeholders; Technology suppliers               |
Strategic Programme Plan                             | Organisation stakeholders; Programme planners                                                           |
Pilot Telemetry Integration Platform                 | Internal and external technology integration suppliers; Organisation stakeholders; Programme planners   |
Full deployment of Telemetry Integration Capability  | Internal and external technology integration suppliers; Organisation stakeholders; Programme planners   |

 6  Technology Deployment


The best-practice approach to building information integration with current technology can be categorized as distributed incremental services, connected by a central federation capability for the integration of data, using existing technology.


By implementing data integration and information access against a centralized model, with SLAs and governance for technology suppliers, a reliable set of technology services could deliver data just-in-time for rating, billing, and calculation of discounts for reduced carbon emissions.


By sharing technology knowledge and practice, infrastructure clouds and grids could be provisioned to provide the processing power to deliver information in real-time.


 6.1       Timescales


This approach could be implemented by incremental delivery of services. A meaningful output from a proof-of-concept can be delivered within 3 months. A core information federation capability could be delivered within a further 6 months. After one year, distributed information access services would be available to aggregate telemetry information from energy suppliers, and to provide a catalogue of carbon emissions model algorithms by industry. Subsequent service delivery would be incremental and relatively resource-efficient.


By engaging in a collaborative information access process, essential feedback could be delivered across stakeholder communities, with group notifications based on user profile, promoting knowledge sharing for timely problem solving.


Costs would be contained by stakeholders supplying components from already existing technology.


 6.2 Solution Architecture Assessment


This solution is an emerging incremental approach, designed to address the shortcomings of earlier technology deployments. It is also comparatively low cost, as costs could be shared amongst stakeholder organizations.


It allows for best-practice technology components, built by many suppliers, to be linked together into a cohesive whole, based on specific technology services subject to binding supplier service level contracts.


The core of this technology is the federation capability, to bring together the distributed services and data sources.

The basic premise is to utilize standard technology approaches, while planning and building incrementally within a methodology that seeks to optimize available technology components, linked by standard interfaces based on coherent information models.


The technology services can be knowledge managed and tracked to business requirements by way of Service Level Agreements based on metrics formed from practical experience.


 6.3 Information Management with a Model


Information management of services, interfaces and performance is effected by central models, which also provide knowledge management of collaborative development, tracked through workflows accessed by users with appropriate access privileges.


Collaborative programme management and good communication work well with this approach. These attributes are facilitated by a model-based approach to implementation, which also enables and facilitates technology change. Interface mappings to a logical model can take advantage of technology that already works well.


Technology service suppliers can more readily be selected on the basis of technology evaluation criteria documented as Service Level Agreements, which reflect performance requirements and other metrics for the type of business service involved.


Secure web access to a knowledge base can be automated by business and technical user profiles mapped to the central model.
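Profile-driven access can be sketched as a mapping from user profiles to model catalogue categories. The profile names and categories here are invented for illustration; a real deployment would derive them from the central model and an identity provider.

```python
# Assumed mapping of user profiles to readable model catalogue categories.
PROFILE_ACCESS = {
    "public":   {"published-statistics"},
    "analyst":  {"published-statistics", "telemetry-detail"},
    "supplier": {"published-statistics", "telemetry-detail", "sla-metrics"},
}

def can_access(profile: str, category: str) -> bool:
    """True if the given profile may read the given model catalogue category."""
    return category in PROFILE_ACCESS.get(profile, set())

analyst_ok = can_access("analyst", "telemetry-detail")
public_blocked = can_access("public", "sla-metrics")
```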


Business requirements can be mapped to technology services, to provide traceability and accountability, and can readily be knowledge-managed by model catalogue category.