MBSE Usability


ISO 9241-11 defines usability as “The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.”

Our purpose is to identify how systems engineering languages (such as SysML), processes, and supporting tools can be made easier to learn and use, and to promote usability improvements.

Our definition of the Model Based Systems Engineering (MBSE) Environment is that it consists of:

  1. The Language in which people express models (for now, predominantly SysML)
  2. The Tools that people use to build and manipulate models
  3. The Methods that people follow to build models
  4. Synthesis - concerns that transcend any single category

Measures of Success

  1. Identify usability levels to focus discussions
  2. Paper 3 - SysML Usability Challenges (Usability Study), new as of the Jan 2014 IW


Date      | Milestone                                   | Status                                  | Point of Contact
Jan 2011  | Develop Usability Use Cases                 | Ongoing                                 | David Lempia
Nov 2012  | Complete Library Exemplar Document          | Published; submitted for INCOSE 2013 IS | Scott Workinger
Feb 2013  | Condense Library Exemplar Paper to 15 pages |                                         | Ron Lyells & David Lempia
June 2013 | Library Best Practices                      | Working Group Sessions at IS            | Bjorn Cole
Jan 2014  | Re-focus the Library Paper                  | Ready for IW                            | Bjorn Cole
Nov 2014  | Complete Papers 2 & 3 for the IS 2015       | Ready for IW                            | Bjorn Cole

Team Members

Name             | Organization                     | Contact Information
Armin Mueller    | ScopeSET GmbH                    | [email protected]
Dick Welling     |                                  |
Jen Narkevic     |                                  |
Lynn Baroff      | NASA Engineering & Safety Center | [email protected]
Patricia Collins |                                  |
Peter Campbell   |                                  |
Quoc Do          |                                  |
Ron Lyells       | Honeywell Inc.                   | [email protected]
Scott Workinger  |                                  |
Tim Tritsch      |                                  |
Werner Altmann   |                                  |
Bjorn Cole       | Jet Propulsion Laboratory        | [email protected]


  • Emerging Usability Patterns in the Application of Modeling Libraries Abstract: This paper summarizes the conclusions of the MBSE Usability Group for 2012. Five perspectives are offered, each grounded by an exemplar from the authors’ practice using Model Based Systems Engineering Environments. These exemplars range from very detailed step-by-step descriptions with identification of fine-grained usability issues to system-of-systems simulations developed by geographically dispersed operations in a global company. Discussion of each exemplar within the Usability Group led to significant lessons learned for the team. This paper summarizes the chief lessons learned during this process. Today, the process continues. In the words of Ron Lyells, “Usability is an emergent quality.”

Usability Dimensions

A presentation on What is usability?

The usability dimensions describe key aspects of usability that are measurable during experiments. The four usability dimensions are:

  • Ease of Learning
  • Efficiency of Use (routine, non-routine)
  • Error Tolerance
  • Subjective (Satisfaction)
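To make the idea of "measurable during experiments" concrete, here is a minimal sketch (not from the wiki; all field names and scoring formulas are assumptions for illustration) of reducing one experimental session to a score per dimension:

```python
from dataclasses import dataclass

@dataclass
class Session:
    first_task_minutes: float  # time on the first, unfamiliar task
    last_task_minutes: float   # time on a comparable task after practice
    tasks_completed: int
    minutes_on_task: float     # total working time in the session
    errors_made: int
    errors_recovered: int      # errors the participant detected and undid
    satisfaction_1_to_5: int   # post-session subjective rating

def score(s: Session) -> dict:
    """Reduce a session to one number per usability dimension."""
    return {
        # Ease of Learning: how much faster the task got with practice
        "ease_of_learning": 1 - s.last_task_minutes / s.first_task_minutes,
        # Efficiency of Use: throughput over the whole session
        "efficiency_tasks_per_min": s.tasks_completed / s.minutes_on_task,
        # Error Tolerance: share of errors the environment let the user recover
        "error_tolerance": (s.errors_recovered / s.errors_made
                            if s.errors_made else 1.0),
        # Subjective: normalized satisfaction rating
        "satisfaction": s.satisfaction_1_to_5 / 5.0,
    }

print(score(Session(10.0, 5.0, 8, 40.0, 4, 3, 4)))
```

Keeping each dimension as a separate number, rather than one combined score, matches the group's intent of using the dimensions to focus discussion.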


The scope of the usability group’s work includes:

  • Process
  • Language / models (representational capabilities)
  • Tools

Usability Use Cases

Use cases will be collected that describe how a systems engineer creates engineering artifacts for a specific customer during a specific systems engineering process step. The types of information we will collect for each use case include:

  • Goal - What is the goal of the use case? (Focus on the produced engineering artifacts and the needs of the customer)
  • Actors – Who are the actors involved in this use case? Who is the customer?
  • Usability Hypothesis – What are the expected usability issues that may be encountered in this use case?
  • What systems engineering process is supported?
  • Pre-condition – What is the state of the tools and engineering artifacts before the use case begins? What are the inputs needed to start this use case?
  • Post-condition – What is the state of the tools and engineering artifacts after the use case finishes? What are the outputs from this use case?
  • Sequence of tasks - What are the tool-independent tasks the primary actor performs? (Each task starts with a verb.) What SysML element(s) and/or diagram(s) are used?
  • MBSE Value Added - What value is added to this use case by using MBSE as opposed to traditional methods?
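The template above can be captured as a simple record so that collected use cases are stored and checked uniformly. This is a hypothetical sketch, not an agreed group format; the field names simply mirror the list:

```python
from dataclasses import dataclass, field

@dataclass
class UsabilityUseCase:
    goal: str                  # produced artifacts + customer need
    actors: list               # who is involved; who is the customer
    usability_hypothesis: str  # expected usability issues
    se_process: str            # SE process step supported
    pre_condition: str         # state of tools/artifacts and inputs at start
    post_condition: str        # state of tools/artifacts and outputs at end
    tasks: list = field(default_factory=list)  # tool-independent, verb-first
    mbse_value_added: str = "" # value over traditional methods

# Example filled in from the first goal below
uc = UsabilityUseCase(
    goal="Assemble components from a library of primitives",
    actors=["Architect", "Systems engineer", "Librarian"],
    usability_hypothesis="Search and reuse criteria may be hard to express",
    se_process="Architecture definition",
    pre_condition="Mission needs understood; component library exists",
    post_condition="Component collection meets the proposed mission need",
    tasks=["Search repository for components", "Select and connect components"],
)
assert uc.tasks[0].split()[0] == "Search"  # each task starts with a verb
```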

Goal - Assemble components and associated behaviors from a library of primitives to meet the mission need

  1. Actors - Architect, systems engineers/component designers, librarian, interface designer, Change Control Board
  2. Process - Collaboration, Analysis & simulation, Configuration Management Tools
  3. Pre-condition - Mission needs understood, Library of components, each meeting criteria for reuse
  4. Post condition - Architecture and collection of components meeting the proposed mission need
  5. Sequence of tasks
    1. SE searches repository for components, based on criteria/desired function
    2. SE selects and connects component abstractions in system model
    3. SE/integrator initiates performance analysis, simulation to verify behavior
    4. SE reconfigures components as necessary
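The four-step sequence above can be sketched in code. This is a toy data model invented for illustration (it is not any real tool's API): search the library by desired function, connect the selected components, run a stand-in "analysis", and reconfigure until the behavior verifies.

```python
def search(library, desired_function):
    """Step 1: find components matching the criteria/desired function."""
    return [c for c in library if desired_function in c["functions"]]

def connect(components):
    """Step 2: assemble component abstractions into a system model."""
    return {"components": components, "connections": max(len(components) - 1, 0)}

def verify(model, required_capacity):
    """Step 3: stand-in for performance analysis / simulation."""
    return sum(c["capacity"] for c in model["components"]) >= required_capacity

library = [
    {"name": "pump-A", "functions": ["pump"], "capacity": 5},
    {"name": "pump-B", "functions": ["pump"], "capacity": 8},
    {"name": "valve-C", "functions": ["regulate"], "capacity": 0},
]

candidates = search(library, "pump")
model = connect(candidates)
# Step 4: reconfigure (drop components) until verification passes or none remain.
while not verify(model, 10) and model["components"]:
    model = connect(model["components"][:-1])
print(verify(model, 10))  # True: pump-A + pump-B provide capacity 13
```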

Goal - Conduct a Design Review using the MBSE Environment

  1. Actors - SE, Architect/Designer, Customer, PM, Eng Mgmt, Peers
  2. Process - Collect changed artifacts and supporting artifacts into a form that can be commonly shared, e.g. a document or HTML form. Highlight all changed items.
  3. Pre-condition - Change Complete, Completed design review checklist
  4. Post condition - All issues adjudicated, Ready for re-baseline
  5. Sequence of tasks
    1. Identify modeling artifacts and external artifacts that have changed or support the change
    2. Create a review artifact that is sharable across all reviewers. The artifact should highlight all changed items: textual changes, changes to diagrams/tables, and changes to any model element properties, including logical/physical elements, requirements, relationships, etc.
    3. Distribute review artifact and initiate review process
    4. Collect issues, resolve and capture resolution
    5. Review adjudication with reviewers
    6. Merge changes and re-baseline
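Steps 1 and 2 above amount to diffing two model baselines and rendering the result in a form any reviewer can read. A hypothetical sketch, with models reduced to element-name/value dictionaries purely for illustration:

```python
def diff_baselines(old, new):
    """Step 1: identify added, removed, and changed model elements."""
    added = {k: new[k] for k in new.keys() - old.keys()}
    removed = {k: old[k] for k in old.keys() - new.keys()}
    changed = {k: (old[k], new[k]) for k in old.keys() & new.keys()
               if old[k] != new[k]}
    return added, removed, changed

def review_artifact(added, removed, changed):
    """Step 2: render a sharable plain-text artifact highlighting changes."""
    lines = ["DESIGN REVIEW - changed items highlighted"]
    lines += [f"+ {k}: {v}" for k, v in sorted(added.items())]
    lines += [f"- {k}: {v}" for k, v in sorted(removed.items())]
    lines += [f"* {k}: {a} -> {b}" for k, (a, b) in sorted(changed.items())]
    return "\n".join(lines)

old = {"REQ-1": "Mass <= 10 kg", "BLK-Pump": "capacity=5"}
new = {"REQ-1": "Mass <= 12 kg", "BLK-Pump": "capacity=5",
       "REQ-2": "MTBF >= 1e4 h"}
print(review_artifact(*diff_baselines(old, new)))
```

Because the artifact is plain text, it can be distributed (step 3) without requiring reviewers to have the modeling tool installed — one of the usability concerns this use case probes.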

Goal - Make assertions on current design

  1. Actors - Accountable engineer makes assertions, reviewers evaluate
  2. Process - Cross-cutting – focused on constraints
  3. Pre-condition - Assertions are made, simulations and analysis run
  4. Post condition - Reviews have concurred or not concurred that assertions are properly validated/tested
  5. Sequence of tasks -

Goal - Define system architecture and conduct architectural analysis

  1. Actors - Systems engineer performing architecting function
  2. Process - Many candidate methodologies for sys arch, UML/SysML diagram tips: Pkg diagrams, BDDs, Class diagrams
  3. Pre-condition - Architectural approach/methodology adopted, Modeling languages and tool(s) selected, Profiles exist and have been imported
  4. Post condition - System architecture (logical/conceptual) captured in system model and validated
  5. Sequence of tasks -
    1. Specify architectural properties and constraints/drivers
    2. Consider arch alternatives
    3. Describe architecture (use views/viewpoints)
    4. Refine requirements and design
    5. Develop scenarios

Reference Links

Documents and Presentations

2013 International Symposium

2011 International Symposium

2011 International Workshop

Use Case Analysis

MBSE Usability Collaboration Issues & Path Forward - May, 2011

Usability Reference Materials

Library and Reuse Materials

Candidate Usability Use Cases

Usability Exemplars from Industry

mbse/usability.txt · Last modified: 2015/07/21 15:50 by dlempia