Participants
TMT:
GMT:
LSST:
JPL:
ESO:
NoMagic:
NRAO:
Caltech:
INAF:
SKA:
Mexican Telescope:
Agenda
Topic | Who |
Telescope Modeling Challenge Team Activities | Jamie & Robert |
Overview of GMT MBSE Efforts | Josema |
Static and Dynamic Roll-Ups (Mass, Power, etc) | Robert |
Error Budget Modeling | Sebastian |
Monte Carlo Simulation and Acquisition Behaviors | Gelys |
LSST, Using Syndeia to Link Requirements & Verification Planning Environment in MagicDraw to Test Execution Environment in JIRA | Brian S. |
Consolidation of Syndeia, Datahub, OpenMBEE Graph Models | Discussion |
Cameo/Collaborator Requirement Management Tool & CM Using NoMagic by GMT | Brian W. |
OpenMBEE Tool Infrastructure and View & Viewpoint Paradigm (e.g. Systems Reasoner, docgen, jupyter notebook) | Robert |
Agreed Actions
Interested participants have already been identified to contribute to the next version of the SE Cookbook; Jamie will coordinate efforts
GMT model to be shared with community in support of collaboration (George & Josema to follow up)
High-resolution spectrograph for the E-ELT: SysML model to be shared on a public teamwork server (Marco to follow up)
Jason and Saulius to follow up on MagicDraw updates; requirements and Excel synchronization
MBSE workshops to be scheduled on a regular basis, coordinated by Gelys
Presentations
Notes
Telescope Modeling Challenge Team Activities
MBSE pillars are SE fundamentals, MBSE tools, MBSE language
Challenges include the general obscurity of SE concepts, which flexible MBSE tools and languages can resolve in models when coupled with rigorous SE effort and analysis; the learning curve, time commitment, and managerial buy-in; limited resources; and the need for a dominant standard for communication
Benefits: rigorous analysis to decompose the system into its underlying parts and analyze their interactions to meet objectives and requirements; multi-scale integration; enhancement and optimization of SE fundamentals; standardization and integration to improve communication
TMT applies MBSE to better understand complex system behaviors; the model captures requirements and scenarios and enables analysis, document production, and standardized communication
MBSE is not applied to the full system, only to optimize complex pieces
Telescope Modeling Challenge Team is building on the previous generation’s work to revise the SE Cookbook to include TMT experiences and examples, and to produce a TMT case study for fall publication in the SEBoK
Other telescope teams should think about doing something similar
Overview of MBSE Effort by GMT
Ongoing effort to apply modeling to GMT
Operational life cycle model
Complex mission: science cases are often open ended
Complex environment and lifetime goals
Despite these complexities, there is still a need to build a machine with precise functionality to deliver the mission
A modeling “playground” shows the various levels, views, and processes, and how MBSE can overcome these challenges
A model is a representation of the system’s architecture; a standard language with formal semantics is needed to decompose the system
Observatory as a function; analysis should be pointed at the data to fulfill the science (performance function); also need to check efficiency and safety metrics
Define observing cases with properties and hierarchical structure relevant to POV of observatory’s performance function
Mapping of observing cases vs science cases to understand what science will be impacted if…
This mapping allows data to be managed and still usable when science cases change over lifetime of the observatory
Observing performance modes show system level properties that need to occur simultaneously; independent from design choices
Observing case analysis block with properties as input and system performance output
Analysis is a representation, not performed in SysML model; it defines which analyses need to be performed
Observatory configurations split between nominal and non-nominal configurations
Requirements flow down: science cases (articulates science book) → observing case (articulates science reqs) → observing performance mode (articulates observatory reqs) → observatory configuration (articulates observatory architecture) → data products and system element properties
Analysis is performed at each step to enable the science case and observing case; “enable” implies that a single input can provide several outputs, so normalization rules need to be added
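The requirements flow-down above can be sketched as a simple traceability walk over linked items; all entity names here are illustrative placeholders, not items from the actual GMT model:

```python
# Hypothetical sketch of the flow-down chain: science case -> observing case
# -> observing performance mode -> observatory configuration. Tracing the
# chain shows what downstream items are impacted when a science case changes.
# All names are illustrative assumptions, not from the real model.

FLOW_DOWN = {
    # science case -> observing cases it relies on
    "exoplanet_spectroscopy": ["high_res_spectroscopy"],
    # observing case -> observing performance modes
    "high_res_spectroscopy": ["fine_pointing_mode"],
    # performance mode -> observatory configurations
    "fine_pointing_mode": ["nominal_ao_config"],
}

def trace(item, chain=None):
    """Walk the flow-down chain and list everything impacted by one item."""
    chain = chain if chain is not None else []
    for downstream in FLOW_DOWN.get(item, []):
        chain.append(downstream)
        trace(downstream, chain)
    return chain

print(trace("exoplanet_spectroscopy"))
# ['high_res_spectroscopy', 'fine_pointing_mode', 'nominal_ao_config']
```

A real model would hold these links as SysML relationships rather than a dictionary, but the impact-tracing idea is the same.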
Operation concept: performance, safety and efficiency functions all provide feedback to the observatory
MBSE benefits: system decomposition, requirements flow down, articulating life-cycle concepts, organizing analyses; formal framework to consider correctness and truth; finds holes in specifications
Summary:
Modeling goals
Consistency in flow down of requirements
Traceability between analysis and requirements
Support for system decomposition
Develop observatory top-level life-cycle concepts model
Playground
Enterprise Level - Needs View
Mission Level - Needs view, requirements View, ISO Applicable Processes
System Level - same as above
System Element Level - same as above
ISO/IEC/IEEE 42010
Stakeholder, Concern
Observatory as a function
Observing Cases
MoE: name, units, quantity kind, initial value
Static and Dynamic Roll-Ups (Mass, Power, etc.)
Dynamic roll-ups are used for operational modes (such as standby, peak operating, etc.) to get a power profile and compare its maximum against the requirement
State machines with constraints to describe behavior of system power
Parametric diagram exposes recursive characteristics of power roll-up patterns
Time analysis provides timeline of states and power profiles
Driving scenario: requirements determine constraints on timing, power budgets, etc.; requirements are formalized in SysML; the design is analyzed with respect to mass, power, and time in the tool; results verify whether requirements are met (pass/fail)
Mass updates: JPL is working on a web-based app to update the model
Challenge is integrating the SE discipline with domain specifications; the goal is to consistently update the model to reflect updated current best estimates (CBEs) in the design
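The dynamic power roll-up described above can be sketched as follows; the mode names, power draws, timeline, and requirement value are illustrative assumptions, not actual observatory numbers:

```python
# Hypothetical sketch of a dynamic power roll-up: each operational mode
# (standby, peak operating, ...) draws a different power, and a timeline of
# mode transitions yields a power profile whose maximum is checked against
# a requirement (pass/fail). All values are illustrative.

# Power draw per operational mode, in watts (assumed values)
MODE_POWER_W = {"standby": 120.0, "slewing": 450.0, "peak_operating": 800.0}

def power_profile(timeline):
    """Map a timeline of (start_s, mode) entries to (start_s, watts)."""
    return [(t, MODE_POWER_W[mode]) for t, mode in timeline]

def verify_peak_power(timeline, required_max_w):
    """Pass/fail check of the profile's peak against the requirement."""
    peak = max(watts for _, watts in power_profile(timeline))
    return peak <= required_max_w, peak

# Driving scenario: standby -> slew -> observe at peak -> back to standby
scenario = [(0, "standby"), (60, "slewing"),
            (120, "peak_operating"), (3720, "standby")]

ok, peak = verify_peak_power(scenario, required_max_w=1000.0)
print(f"peak draw {peak} W, requirement met: {ok}")
```

In the SysML version, the modes live in a state machine with power constraints and a parametric solver produces the profile; this sketch only mirrors the pass/fail logic.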
Error Budget Modeling
Requirements specify how accurate system performance needs to be; these are checked against CBEs
Challenges: mass and power roll-ups can refer to the product breakdown structure (PBS), but the error budget is not directly linked to the PBS; integration with the system design is needed to check accuracy requirements
Excel error budget
Accuracy of pupil measurement/alignment to 0.03% of diameter as motivating example
Error roll-up pattern similar to mass roll-up pattern; element with values requirement, CBE, and margin
Margin = (allocated – CBE) / allocated
Pattern application:
inherit from specific error roll-up type
subset “subError” property of ErrorRollUpPattern element
Define default values for all “leaf”-CBEs
Parametric diagram for linking requirements and PBS
Parametric solver using CST (or others); result = roll-up
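A minimal sketch of the error roll-up pattern, assuming a root-sum-square combination of sub-errors (the actual model may combine them differently); the CBE and allocation values are illustrative, and the margin formula matches the one in these notes:

```python
import math

# Hypothetical sketch of the error roll-up pattern: a tree of error terms
# where each leaf carries a CBE, sub-errors are rolled up root-sum-square
# (an assumed convention), and margin = (allocated - CBE) / allocated.

def rss(values):
    """Root-sum-square combination of sub-errors."""
    return math.sqrt(sum(v * v for v in values))

def roll_up_cbe(node):
    """Recursively roll up leaf CBEs through the error tree via RSS."""
    if "cbe" in node:                       # leaf with a default CBE value
        return node["cbe"]
    return rss(roll_up_cbe(child) for child in node["subErrors"])

def margin(allocated, cbe):
    return (allocated - cbe) / allocated

# Illustrative budget: pupil alignment error with two contributors
budget = {"subErrors": [{"cbe": 0.003}, {"cbe": 0.004}]}
cbe = roll_up_cbe(budget)                   # RSS of 0.003 and 0.004 ≈ 0.005
print(margin(allocated=0.01, cbe=cbe))      # ≈ 0.5, i.e. 50% margin
```

In the SysML pattern, the tree structure comes from subsetting the “subError” property and the solver evaluates the parametric constraints; this sketch only reproduces the arithmetic.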
Excel error budget vs SysML: SysML is better for linking and collaboration
Model can produce visualizations similar to Excel (such as instance tables)
NoMagic is working on requirements improvements; synchronization of Excel and SysML tables instead of one-time import; Jason & Saulius to schedule webinar to discuss this update
New pattern needed for interfaces
DataHub: NASA JPL uses it for instance specifications
OpenMBEE docgen allows for documents that include updated simulations from the model
MagicDraw 19.0: early access to a subset of features
Recording of the R&D discussion session on a subset of v19.0 features: https://youtu.be/A5YAyumWlzE