Doug Rosenberg
ICONIX (Chief Executive Officer)
 
Doug Rosenberg founded ICONIX in his living room in 1984 and began training companies in object-oriented analysis and design around 1990. ICONIX specializes in JumpStart training for UML and SysML, and offers both onsite and open-enrollment courses.
 
Doug developed a Unified Booch/Rumbaugh/Jacobson approach to modeling in 1993, several years before the advent of UML, and began writing books around 1995. Design Driven Testing is his 6th book on software engineering. He’s also authored numerous multimedia tutorials (including Enterprise Architect for Power Users) and several eBooks, including Embedded Systems Development with SysML.
 
Doug has spent the last few years doing "deep dive" consulting into cutting-edge technology including cross-platform mobile app development, REST APIs, and NoSQL databases, and gaining first-hand experience on some "hardcore agile" projects of varying sizes.  He's also been working with dozens of graduate students at the University of Southern California Center for Systems and Software Engineering (USC CSSE), managing Directed Research projects and developing/piloting the Resilient Agile process. 

 

In Parts 1, 2, and 3 of this article series we introduced a student project that I’m managing at the University of Southern California Center for Systems and Software Engineering.  This article (Part 4) describes the results of our first semester’s effort.  We are currently getting started again on enhancements with a new group of students for the Spring 2017 semester.  USC’s location in downtown Los Angeles is at the epicenter of a lot of bad driving, so we’re attempting a “crowdsourced bad driver reporting system”.

Our system consists of a voice-activated “dashboard-cam” mobile app connected to a MongoDB database in the cloud via a Node.js REST API, plus some AngularJS web pages to file, review, and query bad driver reports.  This technology stack for our web app is sometimes referred to as the MEAN stack (MongoDB, Express, AngularJS, Node.js).  We developed a native Android mobile app in Java and a native iOS app in Swift.  Following the Resilient Agile process, and using Enterprise Architect to model the project, we attempted to go from zero to a working system in about 12 weeks of class time by having students develop use cases in parallel with each other.  Previous articles in the series have presented snippets of the UML model, but everybody knows that successful implementation is where the rubber meets the road.  So this article will show you how far we got.

When the driver issues either the “Report Bad Driver” or “Emergency Alert” command, the mobile app triggers video upload and server-side creation of the Bad Driver Report or Emergency Alert Report, as appropriate.  The server then sends an email to the driver’s posting account with a link to a new report that’s pre-populated with the video.
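To make the client side of that flow a little more concrete, here is a minimal Java sketch of the kind of request the Android app might send to trigger server-side report creation.  The endpoint URL, JSON field names, and class names are assumptions for illustration only, not the project's actual REST API.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ReportClient {

    // Hypothetical endpoint; the real Node.js REST API's routes may differ.
    private static final String CREATE_REPORT_URL = "https://example.com/api/reports";

    /**
     * Asks the server to create a Bad Driver Report (or Emergency Alert Report)
     * for a video that has already been uploaded.
     */
    public static int createReport(String reportType, String videoId,
                                   String posterEmail) throws Exception {
        // Field names are illustrative assumptions.
        String json = String.format(
                "{\"type\":\"%s\",\"videoId\":\"%s\",\"posterEmail\":\"%s\"}",
                reportType, videoId, posterEmail);

        HttpURLConnection conn =
                (HttpURLConnection) new URL(CREATE_REPORT_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode(); // e.g. 201 if the report was created
    }
}
```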

While submitting the report, the poster reviews the video, records the license plate number, and grabs a single video frame that most clearly captures the offending vehicle.  The report is then made available for independent reviewers to evaluate.  The system requires unanimous agreement from 3 independent reviewers that the report is accurate.  Once this consensus has been achieved, the report is entered into a database that is queryable by insurance companies.
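The consensus rule is simple enough to express directly in code.  Here is a minimal, self-contained Java sketch of the check (the class and field names are assumptions); the project's actual implementation lives server-side in the Node.js layer.

```java
import java.util.List;

/** Illustrative review-consensus check: three independent reviews, all approving. */
public class ReportConsensus {

    public static final int REQUIRED_REVIEWS = 3;

    public record Review(String reviewerId, boolean approved) { }

    /**
     * A report becomes queryable by insurance companies only when three
     * distinct, independent reviewers have all judged it accurate.
     */
    public static boolean isQueryable(List<Review> reviews) {
        boolean threeDistinctReviewers =
                reviews.stream().map(Review::reviewerId).distinct().count() == REQUIRED_REVIEWS;
        return reviews.size() == REQUIRED_REVIEWS
                && threeDistinctReviewers
                && reviews.stream().allMatch(Review::approved);
    }
}
```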

From a development standpoint, we were able to exploit parallelism among the students to complete this set of use cases (including defining requirements, UML design, and coding) in approximately 12 weeks of calendar time with 15 students each contributing 5 hours a week.  This calculates out to 900 student-hours or 22.5 equivalent full-time work weeks.  In other words, about half a person-year total effort.

 

In Part 1 and Part 2 of this article series we introduced a student project that I’m managing at the University of Southern California Center for Systems and Software Engineering (USC CSSE).  USC’s location in downtown Los Angeles is at the epicenter of a lot of bad driving, so we’re attempting a “crowdsourced bad driver reporting system” this semester, and because we need to be really productive, we’re using Enterprise Architect to model the project.

 

The statistics on the costs of bad driving to society and to insurance companies are staggering.  Among the high points from the article linked above:  

·  A trillion dollars a year in costs due to vehicle accidents.

·  Over 5 million reported accidents a year.

·  An estimated 10 million additional unreported accidents a year.

 

With these numbers there's a strong argument to be made for a crowdsourced approach – it’s the only viable way to get enough eyes on the road.  So this semester I’m working with a group of 15 Masters students to build a “proof of concept” system, following the Resilient Agile process and leveraging parallelism, with each student assigned to a different use case and all students working in parallel. 

Download the attachment to read the full article.

Read Part 1 of this Case Study

We’re attempting a “crowdsourced bad driver reporting system” this semester, and because we need to be really productive, we’re using Enterprise Architect to model the project, field-test the Resilient Agile process, and to coordinate all of the student homework.  Students communicate with each other and with me using a shared EA model.

This semester I’m working with a group of 15 Masters students and an aggregate effective time budget of 80 student hours per week.  We’ve got about 12 usable weeks of student time, so it works out to a time budget of roughly 1000 student hours (that’s about half-a-person-year at 40 hours a week) over a 3 month schedule.

Resilient Agile is a flexible process: it can be employed with traditional Scrum/Kanban sprints and backlogs, or, alternatively, it can leverage parallelism, with each student assigned a use case to develop independently.

I’ve been a big fan of leveraging parallelism in software development since I was a programmer at NASA/JPL way back in the 80s when I rescued a late project using a “divide-and-conquer” coding strategy, so we’re trying to see how far we can push the limits on massively parallel development with student projects at USC.  Communication and well-defined interfaces are key when team members are working in parallel, so the shared EA model is critically important.

Parallel modeling and development has also been a theme of our ICONIX JumpStart classes for the last 20 years, where we go into industry and work a client’s real project by splitting the class up into “lab teams”.  Typically in ICONIX JumpStart classes we put 3 or 4 students on a package of use cases, whereas on this project each student got a single use case.

If you’re going to leverage parallelism in development you have to do things a little bit differently.  Here’s an overview of the process we’re following:

1. Plan for Parallelism (identify dependencies and architect for parallelism)

2. Build the Right System (discover requirements, prototype areas of technical risk, and agree on conceptual designs)

3. Build the System Right (carefully review detailed designs)

4. Integrate as often as necessary

Enterprise Architect is a key enabler of the above process.   I would never attempt this approach without a good solid modeling tool at the heart of it. This article will show how we’ve used EA to accomplish the 4 steps above. 

 Read Part 3 of this Case Study

 

Introduction (See attachment for full article)

 

For the past several years I’ve enjoyed a mostly informal association with the University of Southern California Center for Systems and Software Engineering (USC CSSE).  I was on-staff at USC a few years ago teaching SysML and Model Based Systems Engineering, but for the last few years I’ve been mentoring Computer Science grad students in two Masters courses: CS577 Software Engineering and CS590 Directed Research.  The Directed Research (DR) course is basically a mechanism for students who are about to graduate from the Masters program but are one or two units short of the required total: they can pick up those units by participating in a project with a mentor from industry (that would be me).  Students are expected to work 5 hours per week per unit.

Teaching at USC is fun (I graduated from SC back in ancient times).  It gives me an opportunity to work with a lot of bright young software engineers and to stay current on new technology (in particular cloud-connected mobile app development), and it also gives me an excuse to work with Prof. Boehm (author of Balancing Agility and Discipline, among numerous other titles), who has happily taken an interest in some of my ideas about improving productivity through better software processes and lets me test those ideas out with USC grad students.

This process work has included the development of the Resilient Agile process, an attempt to build a better agile methodology.  It started out as an experiment called Massively Parallel Use Case Modeling that we did with the CS577 class a few years ago, in which we developed a complete location-based advertising system by handing one use case to each of 47 grad students and having each student develop their use case independently.

This semester I’m working with a group of 15 Masters students, mostly taking a single unit of  DR.  One student is taking two units, so my team has an effective time budget of 80 student hours per week.  Although the semester at USC is 16 weeks long, by the time the student teams get formed, and with midterms and finals, we’ve got about 12 usable weeks of student time.  So it works out to a time budget of roughly 1000 student hours (that’s about half-a-person-year at 40 hours a week) over a 3 month schedule. 

Because I like challenges, we’re attempting a “crowdsourced bad driver reporting system” this semester, and because we need to be really productive, we’re using Enterprise Architect to coordinate all of the student homework.  This is the first article in a series that will describe our progress.

Are we crazy to think that we can get this system built in 3 months with a total of half-a-person-year of developer time?  Stay tuned for our next article to see how we’re doing.

Read Part 2 of this Case Study

 

Overview

Your organization is committed to Agile, Scrum and TDD.  That’s not going to change.  But somehow things aren’t going as smoothly as you’d like: your project sits on the “underplanned” end of the spectrum, and you feel like everything could be working better.

If I’ve just described your current situation, then this article might be for you.  The Resilient Agile (RA) process combines Test Driven Development with Design Driven Testing, resulting in improved coverage for both unit and acceptance testing, and helps you plan your sprints better by introducing visual modeling of user stories, epics, and tasks.  

RA is agile on the project management side, and scenario driven on the technical side.  Enterprise Architect can help you to build, manage and enhance your Agile development projects.  This article shows how Enterprise Architect can be used to implement Resilient Agile in your project.

[Figure: BetterAgileProcess.png]

Resilient Agile interfaces Design Driven Testing to Test Driven Development -- and it's only supported by Enterprise Architect.

 

Resilient Agile is balanced between planning and feedback

The agile movement properly recognizes the value of getting to code early and not spending years on “Big Design Up Front” without demonstrating executable software early in the lifecycle. Agilists are correct in believing that a great deal of knowledge will be gained about system requirements by putting live software in the hands of users early. However, agile development as currently practiced across much of the industry has removed too much engineering from software engineering.

Agile/scrum and kanban put the focus on project management rather than engineering. Engineering tends to be left to the discretion of the developers, and there is an active mindset of “code first, refactor later” as opposed to systematic exploration of requirements and designs. Agile in practice is often used as an excuse for not doing architecture, not doing engineering and avoiding thinking about rainy day scenarios, leading to buggy, fragile (non-resilient) software. There is a general mindset on agile projects of achieving quality by testing rather than achieving quality by design. This mindset very often tends to be overly-focused on unit testing because acceptance testing requires that rainy-day scenarios be fully modeled and accounted for.

RA puts the engineering back but still fits with scrum and TDD

Complicating the problem, basic software engineering skills are often not taught adequately at the university level: developing quality use case models that fully explore sunny-day/rainy-day behavior, properly decomposing a use case into models, views, and controllers (MVC), and modeling the problem domain as a set of conceptual objects.

Resilient Agile (RA) is an attempt to put the necessary engineering back into software development without losing the “get to code early” focus that agile gets right.  RA is based on a time-tested method (ICONIX/DDT) of developing an initial problem domain model, then decomposing a system into collaborating use cases and elaborating each use case by doing a conceptual MVC decomposition.
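To make the conceptual MVC decomposition a bit more tangible, here is a skeletal Java sketch of how a “Report Bad Driver” style use case might be carved into a view, a controller, and a domain (model) object.  The class and method names are illustrative assumptions, not the project's actual design.

```java
import java.util.ArrayList;
import java.util.List;

/** Domain (model) object: the "thing" the use case is about. */
class BadDriverReport {
    String licensePlate;
    String videoFrameUrl;
    final List<String> reviewerApprovals = new ArrayList<>();
}

/** View: what the poster sees and interacts with (kept free of business logic). */
interface SubmitReportView {
    String enteredLicensePlate();
    String selectedVideoFrame();
    void showValidationError(String message);
    void showSubmitted(BadDriverReport report);
}

/** Controller: the use case logic, mediating between view and model. */
class SubmitReportController {
    void submit(SubmitReportView view) {
        String plate = view.enteredLicensePlate();
        if (plate == null || plate.isBlank()) {          // rainy-day path
            view.showValidationError("License plate is required");
            return;
        }
        BadDriverReport report = new BadDriverReport();  // sunny-day path
        report.licensePlate = plate;
        report.videoFrameUrl = view.selectedVideoFrame();
        view.showSubmitted(report);
    }
}
```

The point of sketching the decomposition at this conceptual level is that both the sunny-day and rainy-day behavior become explicit, reviewable design elements before any framework-specific code is written.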

 

ICONIX Resilient Agile - A Better Agile Methodology

 

Friday, 22 July 2011 00:00

Design Driven Testing for Systems

Design Driven Testing (DDT) for software was first outlined in the book Use Case Driven Object Modeling with UML: Theory and Practice (by Doug Rosenberg and Matt Stephens), and then described in detail in Design Driven Testing: Test Smarter, Not Harder by the same authors.

DDT is a highly methodical approach to testing, allowing you to know when you’ve finished – i.e. when you’ve written enough tests to cover the design and the requirements. It helps you to “zoom in” and write algorithmic tests to cover intensive or mission-critical sections of code, and to know when it’s safe to “zoom out” and write fewer tests for boilerplate or less critical code.
In this article, Doug extends the concept of DDT for hardware/software systems, allowing SysML-based designs to be tested in a highly rigorous, systematic way. It’s still Design Driven Testing, but now the design elements that need to be tested include all of the “four pillars of SysML”, whereas DDT for software focuses on testing behavior.

DDT Systems



ICONIX has collaborated with Sparx Systems to produce a uniquely powerful set of capabilities for driving testing from Enterprise Architect UML and SysML models.  By combining the functionality of Sparx's structured scenario editor with that of the Agile/ICONIX add-in, it's now possible to automatically generate test cases at multiple levels of abstraction.  These include two levels of developer testing, with automatic generation of unit test code for JUnit, NUnit and FlexUnit, and two levels of acceptance/QA testing: generation of test cases for requirements, and scenario-based testing using a "use case thread expander" which automatically generates test scripts covering all permutations of sunny-day/rainy-day scenarios.
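To give a feel for the first of those levels, here is a hand-written JUnit 4 sketch of the kind of unit test that developer-level test generation targets.  It is an illustration of the idea only, written against an assumed class under test; it is not actual output from the Agile/ICONIX add-in.

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

/** Illustrative developer-level tests for a (hypothetical) license plate validator. */
public class LicensePlateValidatorTest {

    // Assumed class under test; in DDT these tests would be driven from the design model.
    static class LicensePlateValidator {
        boolean isValid(String plate) {
            return plate != null && plate.matches("[A-Z0-9]{2,8}");
        }
    }

    @Test
    public void acceptsWellFormedPlate() {
        assertTrue(new LicensePlateValidator().isValid("7ABC123"));
    }

    @Test
    public void rejectsEmptyPlate() {
        assertFalse(new LicensePlateValidator().isValid(""));
    }
}
```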

We've published our approach in a book called Design Driven Testing (DDT), which I co-authored with Matt Stephens, and we proved our concepts on a real, commercially available product: an interactive mapping system built with ESRI's ArcGIS Server geospatial engine and in commercial use on a travel website, VResorts.com.

The attached presentation, from the recent ESRI Developer Summit, presents the design of the hotel mapping application, and discusses bugs that were detected and fixed before product release following the DDT approach.

Here is Chapter 7 of Design Driven Testing. This chapter focuses on Acceptance Testing, and leverages Enterprise Architect's Structured Scenario editor heavily to accomplish something we call "use case thread expansion" where all of the sunny day / rainy day permutations of a use case are expanded out into a complete set of tests.
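The expansion itself is essentially a combinatorial walk over the scenario structure.  This small Java sketch shows the shape of the idea on a made-up use case (a basic course plus rainy-day branches); it covers each rainy-day branch once rather than the full permutation set, and it is not the add-in's actual algorithm or output.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

/**
 * Toy illustration of "use case thread expansion": turn a basic course plus
 * its alternate (rainy-day) branches into a set of test threads.
 */
public class ThreadExpander {

    public static List<List<String>> expand(List<String> basicCourse,
                                            Map<Integer, List<String>> alternates) {
        List<List<String>> threads = new ArrayList<>();
        threads.add(new ArrayList<>(basicCourse)); // the sunny-day thread

        // One extra thread per rainy-day branch: follow the basic course up to
        // the branching step, then take the alternate outcome.
        alternates.forEach((stepIndex, alternateSteps) -> {
            List<String> thread = new ArrayList<>(basicCourse.subList(0, stepIndex + 1));
            thread.addAll(alternateSteps);
            threads.add(thread);
        });
        return threads;
    }

    public static void main(String[] args) {
        List<String> basic = List.of(
                "Driver says \"Report Bad Driver\"",
                "App uploads video",
                "Server creates report and emails link");
        Map<Integer, List<String>> rainy = Map.of(
                1, List.of("Upload fails", "App retries when connectivity returns"));

        expand(basic, rainy).forEach(thread -> System.out.println(thread));
    }
}
```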

In "test driven" approaches to development, unit testing often gets most of the attention.  However, unit testing is generally most useful in discovering "errors of Commission" (more poetically, "whoops, I coded that wrong").  Unit testing is of much less help in discovering "errors of Omission" (more poetically, "whoops, I didn't think of that").  In general, errors of Omission are much trickier to detect, and there is very little automated support for detecting them.  We worked very closely with the development team at Sparx as they developed the "use case thread expander", and it brings a unique and useful capability to the industry.

As you read this chapter, make sure you don't miss the discussion at the end of the chapter called "And the moral of the story is..." where we describe some actual "errors of Omission" that were caught and fixed before the release of our mapping software using these acceptance testing techniques, and how fixing these errors improved the user experience.

In today's agile universe, we often hear about "test-driven" approaches to development (TDD). TDD emphasizes unit testing to such an extent that in many companies, regression testing frameworks like JUnit have largely replaced upfront design. Design Driven Testing makes the case that skipping design in favor of unit testing is not only backwards, but also Too Damn Difficult. We thought the best way to prove it was by designing and testing a real production application...

My new book Design Driven Testing (co-authored with Matt Stephens) addresses both unit testing by developers and acceptance testing performed by an independent QA organization.  Somewhat uniquely, the book features a real production system as its teaching example: a worldwide interactive hotel mapping (GIS) application, designed with ICONIX Process and built using Java, Flex, and the ESRI ArcGIS Server mapping software, that we call "mapplet 2.0" and which is in production use on the VResorts.com travel website.


This sample chapter presents the full ICONIX Process design of the mapplet project, starting from functional requirements and use cases, all the way down to reverse-engineered class diagrams from the final code.  One of the unique virtues of using a production example as a teaching example in a book like DDT is that readers can look at the use cases in the attached chapter and then compare them to the released software as deployed on VResorts.com.



Next month I'll be posting another sample chapter from the DDT book which describes the scenario testing we did before releasing the software and some very real improvements that were made to the usability of the final product as a result.  If you'd like to work through the design and testing of mapplet 2.0 with me, in person, our Hands On ICONIX Process open enrollment classes give you exactly that opportunity.  In addition to 1-day modules on Business Process Modeling, Service Oriented Architectures, and Embedded Systems Development using SysML, we'll be working through this example for 2 of the 5 lab-session days.

Also, between now and the end of the year, anyone who orders Design Driven Testing directly from ICONIX will get a free copy of Agile Development with ICONIX Process.

This article (which actually represents the third incarnation of the ICONIX Business Modeling Roadmap) leverages two new capabilities from Sparx Systems, now available in Enterprise Architect: the Structured Scenario Editor and the Business Rule Composer.  The article describes how these two quantum leaps in technology work together to enable a new process, combining business process modeling with behavioral code generation for business rules.
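For a rough flavor of what "behavioral code generation for business rules" means in practice, here is a hand-written Java sketch of a simple rule rendered as code.  The rule itself and all names are invented for illustration; this is not output from the Business Rule Composer.

```java
/** Illustrative business rule rendered as behavioral code. */
public class SafeDriverDiscountRule {

    /**
     * Example rule: a policyholder qualifies for a safe-driver discount if they
     * have had no verified bad-driver reports in the last 12 months.
     */
    public static boolean qualifiesForSafeDriverDiscount(int verifiedReportsLast12Months) {
        return verifiedReportsLast12Months == 0;
    }
}
```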
