
Thanks to my colleague Matt Stephens for writing this tutorial on the CodeBot that he's built. You can download the CodeBot tutorial here, and also watch a CodeBot tutorial video here.
Parallel Agile CodeBot™ is accessible through the Parallel Agile Add-in for Enterprise Architect. The CodeBot generates a complete REST API from your UML domain model. This article illustrates, step by step, how to run CodeBot from within EA and incorporate the instantly generated API into your domain-driven project.
Find out more about Parallel Agile and try out CodeBot for free.
By the way, we also offer training, consultancy, and support for projects using CodeBot and the Parallel Agile development process. If this sounds like just what your organization needs, contact us by email.
Published in Tutorials

This article will introduce you to both the Parallel Agile (PA) process and to the Parallel Agile Add-in for Enterprise Architect, which enables the PA process for Sparx customers.

Download the article in PDF here.

 

Parallel Agile Add-in for Enterprise Architect

 

The Parallel Agile Add-in generates database access code and REST APIs from domain models, and it works in conjunction with the ICONIX Agile DDT Add-in, which generates acceptance tests from requirements and use cases. Both add-ins are free.

We’ll discuss:

  • What’s Parallel Agile?
  • Compressing schedules with parallel development
  • Improving quality while compressing schedule
  • Why did we build an Enterprise Architect Add-in?
  • What’s an Executable Architecture?
  • What’s a Parallel Agile CodeBot?
  • Using the CodeBot to generate code for your domain model
  • Using our Cloud-Based Hosting Service to Test Your Generated API
  • Use Case Complexity Analyzer
  • Parallel Agile MDG Technology - supports Sprint Plans


Published in News

 

In Parts 1, 2, and 3 of this article series we introduced a student project that I’m managing at the University of Southern California Center for Systems and Software Engineering.  This article (Part 4) describes the results of our first semester’s effort.  We are currently getting started again on enhancements with a new group of students for the Spring 2017 semester.  USC’s location in downtown Los Angeles is at the epicenter of a lot of bad driving, so we’re attempting a “crowdsourced bad driver reporting system”.

Our system consists of a voice-activated "dashboard-cam" mobile app connected to a MongoDB database in the cloud via a Node.js REST API, plus some AngularJS web pages to file, review, and query bad driver reports. This technology stack for our web app is sometimes referred to as the MEAN stack (Mongo, Express, Angular, Node). We developed a native Android mobile app in Java, and a native iOS app in Swift. Following the Resilient Agile process, and using Enterprise Architect to model the project, we attempted to go from zero to a working system in about 12 weeks of class time, by having students develop use cases in parallel with each other. Previous articles in the series have presented snippets of the UML model, but everybody knows that successful implementation is where the rubber meets the road. So this article will show you how far we got.

When the driver issues either the "Report Bad Driver" or "Emergency Alert" command, the mobile app triggers video upload and server-side creation of the Bad Driver Report or Emergency Alert Report, as appropriate. The server then sends an email to the driver's posting account with a link to a new report that's pre-populated with the video.

While submitting the report, the poster reviews the video, records the license plate number, and grabs a single video frame that most clearly captures the offending vehicle.  The report is then made available for independent reviewers to evaluate.  The system requires unanimous agreement from 3 independent reviewers that the report is accurate.  Once this consensus has been achieved, the report is entered into a database that is queryable by insurance companies.
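The unanimous three-reviewer rule described above reduces to a small predicate. This is our own sketch; the function and field names are illustrative and not taken from the project's model.

```javascript
// Consensus check: a report becomes queryable by insurance companies
// only after 3 distinct, independent reviewers all mark it accurate.
// Names here are illustrative, not from the USC project's codebase.
const REQUIRED_REVIEWERS = 3;

function isReportApproved(reviews) {
  // reviews: array of { reviewerId, accurate } objects
  const distinctReviewers = new Set(reviews.map(r => r.reviewerId));
  return (
    distinctReviewers.size >= REQUIRED_REVIEWERS &&
    reviews.every(r => r.accurate === true)
  );
}
```

Counting distinct reviewer IDs (rather than raw review count) enforces the independence requirement: three reviews from the same account do not constitute consensus.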

From a development standpoint, we were able to exploit parallelism among the students to complete this set of use cases (including defining requirements, UML design, and coding) in approximately 12 weeks of calendar time with 15 students each contributing 5 hours a week.  This calculates out to 900 student-hours or 22.5 equivalent full-time work weeks.  In other words, about half a person-year total effort.
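The effort arithmetic above works out as follows (assuming a 40-hour full-time week and a 52-week person-year):

```javascript
// Sanity-check of the effort figures quoted in the paragraph above.
const students = 15;
const hoursPerWeek = 5;
const weeks = 12;

const studentHours = students * hoursPerWeek * weeks; // 900 student-hours
const fullTimeWeeks = studentHours / 40;              // 22.5 FTE work weeks
const personYears = fullTimeWeeks / 52;               // roughly 0.43

console.log(studentHours, fullTimeWeeks, personYears.toFixed(2));
```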

Published in Case Studies