ISO 9241-11 defines usability as "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use."
Our purpose is to identify how systems engineering languages (such as SysML), processes, and supporting tools can be made easier to learn and use, and to promote usability improvements.
We define the Model Based Systems Engineering (MBSE) Environment as consisting of:
Date | Milestone | Status | Point of Contact |
---|---|---|---|
Jan 2011 | Develop Usability Use Cases | Ongoing | David Lempia |
Nov 2012 | Complete Library Exemplar Document Published | Submitted for INCOSE 2013 IS | Scott Workinger |
Feb 2013 | Condense Library Exemplar Paper to 15 pages | | Ron Lyells & David Lempia |
June 2013 | Library Best Practices | Working Group Sessions at IS | Bjorn Cole |
Dec 2013 | Complete Library Best Practices | Ready for IW | Bjorn Cole |
Name | Organization | Contact Information |
---|---|---|
Armin Mueller | ScopeSET GmbH | [email protected] |
David Lempia | Rockwell Collins | [email protected] |
Dick Welling | ||
Jen Narkevic | ||
Lynn Baroff | NASA Engineering & Safety Center | [email protected] |
Patricia Collins | ||
Peter Campbell | ||
Quoc Do | ||
Ron Lyells | Honeywell Inc. | [email protected] |
Scott Workinger | ||
Tim Tritsch | ||
Werner Altmann | ||
Bjorn Cole | Jet Propulsion Laboratory | [email protected] |
**What is Usability?** (presentation)

**Emerging Usability Patterns in the Application of Modeling Libraries** (published Feb 2013)

Abstract: This paper summarizes the conclusions of the MBSE Usability Group for 2012. Five perspectives are offered, each grounded by an exemplar from the authors' practice using Model Based Systems Engineering environments. These exemplars range from very detailed step-by-step descriptions identifying fine-grained usability issues to system-of-systems simulations developed by geographically dispersed operations in a global company. Discussion of each exemplar within the Usability Group led to significant lessons learned for the team; this paper summarizes the chief lessons learned during that process. Today, the process continues. In the words of Ron Lyells, "Usability is an emergent quality."

**Usability Use Cases** (collected January 31, 2011)

The usability dimensions describe key aspects of usability that are measurable during experiments. The four usability dimensions are:

* Ease of Learning
* Efficiency of Use (routine and non-routine)
* Error Tolerance
* Subjective Satisfaction

The scope of the usability group's work includes:

* Process
* Language / models (representational capabilities)
* Tools

Use cases will be collected that describe how a systems engineer creates engineering artifacts for a specific customer during a specific systems engineering process step. The information collected for each use case includes:

* Goal - What is the goal of the use case? (Focus on the produced engineering artifacts and the needs of the customer.)
* Actors - Who are the actors involved in this use case? Who is the customer?
* Usability Hypothesis - What usability issues are expected to be encountered in this use case?
* Process - What systems engineering process is supported?
* Pre-condition - What is the state of the tools and engineering artifacts before the use case begins? What inputs are needed to start this use case?
* Post-condition - What is the state of the tools and engineering artifacts after the use case finishes? What outputs come from this use case?
* Sequence of Tasks - What tool-independent tasks does the primary actor perform? (Each task starts with a verb.) What SysML element(s) and/or diagram(s) are used?
* MBSE Value Added - What value did MBSE add to this use case, compared with traditional methods?

**Use Case: Assemble Components from a Library**

* Goal - Assemble components and associated behaviors from a library of primitives to meet the mission need
* Actors - Architect, systems engineers/component designers, librarian, interface designer, Change Control Board
* Process - Collaboration; analysis & simulation; configuration management tools
* Pre-condition - Mission needs understood; library of components, each meeting criteria for reuse
* Post-condition - Architecture and collection of components meeting the proposed mission need
* Sequence of tasks:
  * SE searches the repository for components based on criteria/desired function
  * SE selects and connects component abstractions in the system model
  * SE/integrator initiates performance analysis and simulation to verify behavior
  * SE reconfigures components as necessary

**Use Case: Conduct a Design Review**

* Goal - Conduct a design review using the MBSE environment
* Actors - SE, architect/designer, customer, PM, engineering management, peers
* Process - Collect changed artifacts and supporting artifacts into a commonly sharable form (e.g., document or HTML); highlight all changed items
* Pre-condition - Change complete; design review checklist completed
* Post-condition - All issues adjudicated; ready for re-baseline
* Sequence of tasks:
  * Identify modeling artifacts and external artifacts that have changed or support the change
  * Create a review artifact that is sharable across all reviewers; it should highlight all changed items, including textual changes, changes to diagrams/tables, and any model element properties (logical/physical elements, requirements, relationships, etc.)
  * Distribute the review artifact and initiate the review process
  * Collect issues, resolve them, and capture the resolutions
  * Review adjudications with the reviewers
  * Merge changes and re-baseline

**Use Case: Make Assertions on the Current Design**

* Goal - Make assertions on the current design
* Actors - Accountable engineer makes assertions; reviewers evaluate
* Process - Cross-cutting; focused on constraints
* Pre-condition - Assertions are made; simulations and analysis run
* Post-condition - Reviewers have concurred or not concurred that assertions are properly validated/tested
* Sequence of tasks -

**Use Case: Define System Architecture**

* Goal - Define system architecture and conduct architectural analysis
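The use-case template above is essentially a record structure, and teams collecting these use cases may want to capture them in machine-readable form. The following is a minimal sketch in Python; the class and field names mirror the template fields but are illustrative choices, not part of the working group's material:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UsabilityUseCase:
    """One entry in a usability use-case catalog (illustrative structure)."""
    goal: str
    actors: List[str]
    usability_hypothesis: str
    process: str
    pre_condition: str
    post_condition: str
    sequence_of_tasks: List[str] = field(default_factory=list)
    mbse_value_added: str = ""  # value of MBSE vs. traditional methods

# Example instance: the library-assembly use case from above
assemble = UsabilityUseCase(
    goal="Assemble components from a library of primitives to meet the mission need",
    actors=["Architect", "Systems engineer", "Librarian",
            "Interface designer", "Change Control Board"],
    usability_hypothesis="",  # left open in the source material
    process="Collaboration; analysis & simulation; configuration management",
    pre_condition="Mission needs understood; library of reusable components",
    post_condition="Architecture and components meeting the proposed mission need",
    sequence_of_tasks=[
        "Search repository for components by criteria/desired function",
        "Select and connect component abstractions in the system model",
        "Run performance analysis/simulation to verify behavior",
        "Reconfigure components as necessary",
    ],
)
```

A structure like this makes the catalog easy to filter (e.g., all use cases touching a given actor) and keeps the template fields consistent across contributors.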