Telescope MBSE SIG Meeting, Pasadena, April 2017

Participants

TMT:

  • Jamie Nakawatase (jamie@tmt.org)
  • Gelys Trancho (gtrancho@tmt.org)
  • Kayla Hardie (khardie@tmt.org), dial-in

GMT:

  • George Angeli (gangeli@gmto.org)
  • Josema Filgueira (jmf@gmto.org)
  • Brian Walls (bwalls@gmto.org)
  • John Miles (jmiles@gmto.org)
  • Adam Contos (GMTO)
  • Oliver McIrwin (omcirwin@gmto.org)
  • Rebecca Bernstein (rbernstein@gmto.org)
  • Maria Hernandez (mhernandez@gmto.org)
  • Leni Malherbe (lmalherbe@gmto.org)
  • Ruth Paredes (rparedes@gmto.org)
  • James Fanson (Project Manager)

LSST:

  • Brian Selvy (BSelvy@lsst.org)
  • Michael Reuter (mareuter@lsst.org), dial-in

JPL:

  • Robert Karban (Robert.Karban@jpl.nasa.gov)
  • Sebastian Herzig (Sebastian.J.Herzig@jpl.nasa.gov)
  • Frank Dekens (Frank.G.Dekens@jpl.nasa.gov)

ESO:

  • Gianluca Chiozzi (gchiozzi@eso.org), dial-in

NoMagic:

  • Saulius Pavalkis (saulius.pavalkis@nomagic.com)
  • Jason Wilson (jwilson@nomagic.com)

NRAO:

  • Richard Prestage (rprestag@nrao.edu)

Caltech:

  • Solange Ramirez (solange@ipac.caltech.edu), dial-in

INAF:

  • Marco Riva (marco.riva@brera.inaf.it), dial-in

SKA:

  • Antonio Chrysostomou (A.Chrysostomou@skatelescope.org), dial-in
  • Daniel Hayden (D.Hayden@skatelescope.org), dial-in

Mexican Telescope:

  • Gengis Toledo (gengis.toledo@cidesi.mx), dial-in

Agenda

Topic | Who
Telescope Modeling Challenge Team Activities | Jamie & Robert
Overview of GMT MBSE Efforts | Josema
Static and Dynamic Roll-Ups (Mass, Power, etc.) | Robert
Error Budget Modeling | Sebastian
Monte Carlo Simulation and Acquisition Behaviors | Gelys
LSST: Using Syndeia to Link the Requirements & Verification Planning Environment in MagicDraw to the Test Execution Environment in JIRA | Brian S.
Consolidation of Syndeia, DataHub, and OpenMBEE Graph Models | Discussion
Cameo Collaborator Requirement Management Tool & CM Using NoMagic by GMT | Brian W.
OpenMBEE Tool Infrastructure and the View & Viewpoint Paradigm (e.g. Systems Reasoner, DocGen, Jupyter notebooks) | Robert

Agreed Actions

  • Interested participants have already volunteered to contribute to the next version of the SE Cookbook; Jamie will coordinate these efforts
  • GMT model to be shared with community in support of collaboration (George & Josema to follow up)
  • High resolution spectrograph for E-ELT SysML model to be shared on public teamwork server (Marco to follow up)
  • Jason and Saulius to follow up on MagicDraw updates; requirements and Excel synchronization
  • MBSE workshops to be scheduled on a regular basis, coordinated by Gelys

Presentations

Notes

Telescope Modeling Challenge Team Activities

  • MBSE pillars are SE fundamentals, MBSE tools, MBSE language
  • Challenges include the general obscurity of SE concepts (which flexible MBSE tools and languages can resolve in models when coupled with rigorous SE effort and analysis), the learning curve, time commitment, and need for managerial buy-in, limited resources, and the need for a dominant standard for communication
  • Benefits: rigorous analysis to decompose system into underlying parts, analyze their interactions to meet objectives with requirements; multi-scale integration; enhancements and optimization of SE fundamentals; standardization and integration to improve communication
  • TMT applies MBSE to better understand complex system behaviors; the model captures requirements and scenarios and enables analysis, document production, and standardized communication
  • MBSE is not applied to the full system, only to optimize its complex pieces
  • The Telescope Modeling Challenge Team is building on the previous generation's work to revise the SE Cookbook, include TMT experiences and examples, and produce a TMT case study for the SEBoK fall publication
  • Other telescope teams should think about doing something similar

Overview of MBSE Effort by GMT

  • Continuously working efforts to apply modeling to GMT
  • Operational life cycle model
  • Complex mission: science cases which are often open ended
  • Complex environment and life time goals
  • Despite these complexities, there is still a need to build a machine with precise functionality to deliver the mission
  • Modeling playground which shows various levels, views, and processes and how MBSE can overcome these challenges
  • Model is a representation of the system’s architecture, and we need a standard language with formal semantics to decompose the system
  • Observatory as a function; analysis should be pointed at the data to fulfill the science (performance function); also need to check efficiency and safety metrics
  • Define observing cases with properties and hierarchical structure relevant to POV of observatory’s performance function
  • Mapping of observing cases vs science cases to understand what science will be impacted if…
  • This mapping allows data to be managed and still usable when science cases change over lifetime of the observatory
  • Observing performance modes show system level properties that need to occur simultaneously; independent from design choices
  • Observing case analysis block with properties as input and system performance output
  • Analysis is a representation, not performed in SysML model; it defines which analyses need to be performed
  • Observatory configurations split between nominal and non-nominal configurations
  • Requirements flow down: science cases (articulates science book) → observing case (articulates science reqs) → observing performance mode (articulates observatory reqs) → observatory configuration (articulates observatory architecture) → data products and system element properties
  • Analysis performed at each step to enable science case, observing case; enable implies that a single input can provide several outputs, need to add normalization rules
    • Set of analysis or synthesis cases
  • Operation concept: performance, safety and efficiency functions all provide feedback to the observatory
  • MBSE benefits: system decomposition, requirements flow down, articulating life-cycle concepts, organizing analyses; formal framework to consider correctness and truth; finds holes in specifications
  • Summary:
  • Modeling goals
    • Consistency in flow down of requirements
    • Traceability between analysis and requirements
    • Support for system decomposition
    • Develop observatory top-level life-cycle concepts model
  • Playground
    • Enterprise Level - Needs View
    • Mission Level - Needs view, requirements View, ISO Applicable Processes
    • System Level - same as above
    • System Element Level - same as above
  • ISO/IEC/IEEE 42010
    • GMT built an architecture framework on the standard to establish a vocabulary for describing the system
    • GMT tailored the standard for its own use
  • Stakeholder, Concern
    • Formal semantics about the elements they use
  • Observatory as a function
    • Performance Function
    • Efficiency Function
    • Safety Function - function of observatory case, environment = obs_safety_metrics
  • Observing Cases
    • Set of properties relevant to the astronomer that need to occur simultaneously
  • MOE (measure of effectiveness): name, units, quantity kind, initial value
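The observing-case/science-case mapping described above can be sketched in code. A minimal illustration, with all class names, case names, and property values hypothetical (the real mapping lives in the SysML model):

```python
from dataclasses import dataclass

@dataclass
class ObservingCase:
    name: str
    # set of properties relevant to the astronomer that must hold simultaneously
    properties: dict

@dataclass
class ScienceCase:
    name: str
    observing_cases: list  # names of the observing cases this science relies on

def impacted_science(changed_case: str, science_cases):
    """Return the science cases affected if a given observing case changes."""
    return [s.name for s in science_cases if changed_case in s.observing_cases]

# Illustrative mapping of science cases to observing cases
sci = [
    ScienceCase("exoplanet imaging", ["high-contrast AO"]),
    ScienceCase("galaxy survey", ["seeing-limited wide field"]),
]
print(impacted_science("high-contrast AO", sci))  # ['exoplanet imaging']
```

Because the mapping is explicit, it stays queryable as science cases change over the observatory's lifetime, which is the point of the bullet above.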

Static and Dynamic Roll-Ups (Mass, Power, etc.)

  • Dynamic roll-ups are used for operational modes (such as standby, peak operating, etc.) to obtain a power profile and compare its maximum against the requirement
  • State machines with constraints to describe behavior of system power
  • Parametric diagram exposes recursive characteristics of power roll-up patterns
  • Time analysis provides timeline of states and power profiles
  • Driving scenario: requirements determine constraints on timing, power budgets, etc.; formalize the requirements in SysML; analyze the design with respect to mass, power, and timing in the tool; results verify whether requirements are met (pass/fail)
  • Mass updates: JPL is working on a web-based app to update the model
  • The challenge is integrating the SE discipline with domain specifications; the goal is to consistently update the model to reflect the updated CBEs (current best estimates) in the design
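The dynamic roll-up described above can be sketched outside SysML. A minimal Python illustration with made-up subsystems, states, and wattages; in the model itself this pattern lives in state machines, constraints, and parametric diagrams:

```python
# Hypothetical dynamic power roll-up: each subsystem has a per-state power
# draw; the system profile is the sum across subsystems at each time step,
# and the maximum is checked against the requirement (pass/fail).
timeline = ["standby", "standby", "slew", "observe", "observe", "peak"]

subsystem_power_w = {  # illustrative CBE values per operational state
    "mount":      {"standby": 50, "slew": 400, "observe": 120, "peak": 450},
    "instrument": {"standby": 80, "slew": 80,  "observe": 300, "peak": 350},
}

REQUIRED_MAX_W = 900  # assumed power requirement

profile = [sum(draws[state] for draws in subsystem_power_w.values())
           for state in timeline]
peak = max(profile)
print(profile, peak, "PASS" if peak <= REQUIRED_MAX_W else "FAIL")
```

The timeline of states plays the role of the time analysis in the notes: the same roll-up evaluated at each step yields the power profile, and verification reduces to comparing its maximum against the formalized requirement.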

Error Budget Modeling

  • Requirements define how accurate system performance needs to be and are checked against CBEs
  • Challenges: mass and power roll-ups can refer to PBS, but error budget is not directly linked to the PBS; need integration with system design and check requirements for accuracy
  • Excel error budget
    • Pro: simple interface
    • Con: no explicit linking, verification is done by manual checking or independent model
  • Accuracy of pupil measurement/alignment to 0.03% of diameter as motivating example
  • Error roll-up pattern similar to mass roll-up pattern; element with values requirement, CBE, and margin
  • Margin = (allocated – CBE) / allocated
  • Pattern application:
    • inherit from specific error roll-up type
    • subset “subError” property of ErrorRollUpPattern element
    • Define default values for all “leaf”-CBEs
  • Parametric diagram for linking requirements and PBS
  • Parametric solver using CST (Cameo Simulation Toolkit) or other solvers; result = roll-up
    • formalized requirement
    • automated roll-up
    • automated verification of requirement
  • Excel error budget vs SysML: SysML is better for linking and collaboration
  • Model can produce visualizations similar to Excel (such as instance tables)
  • NoMagic is working on requirements-management improvements, including synchronization of Excel and SysML tables instead of a one-time import; Jason & Saulius to schedule a webinar to discuss this update
  • New pattern needed for interfaces
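The error roll-up pattern and margin formula above can be illustrated with a short sketch. The RSS combination rule and all numeric values here are assumptions for illustration, not from the presentation; the notes only state that the pattern mirrors the mass roll-up, with requirement, CBE, and margin on each element:

```python
import math

def rss_rollup(node):
    """Recursively roll up an error tree: a non-leaf CBE is the RSS of its
    children's CBEs. (RSS is an assumed combination rule for this sketch.)"""
    if "children" in node:
        node["cbe"] = math.sqrt(sum(rss_rollup(c) ** 2
                                    for c in node["children"]))
    return node["cbe"]

def margin(allocated, cbe):
    # Margin = (allocated - CBE) / allocated, as in the notes
    return (allocated - cbe) / allocated

# Illustrative pupil-alignment budget (all values made up)
budget = {
    "name": "pupil alignment",
    "allocated": 0.030,  # % of diameter
    "children": [
        {"name": "sensing",   "cbe": 0.012},
        {"name": "actuation", "cbe": 0.016},
    ],
}
cbe = rss_rollup(budget)
print(round(cbe, 4), round(margin(budget["allocated"], cbe), 3))
```

In the SysML version, the same recursion is expressed by subsetting the "subError" property of the roll-up pattern element, and a parametric solver performs the computation and the pass/fail check against the formalized requirement.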

DataHub: NASA JPL uses it for instance specifications

OpenMBEE docgen allows for documents that include updated simulations from the model

MagicDraw 19.0: early access to a subset of features
Recording of the R&D discussion session on the subset of v19.0 features: https://youtu.be/A5YAyumWlzE

  • Main attention is on the presentation of the "Final scope of Excel integration"
  • Covers:
    • OSLC consumer use cases
    • Extending the hyperlink concept
    • Final scope of Excel integration
    • Final implementation of Find in Diagrams: demo
mbse/telescope_mbse_sig_meetings_pasadena_april_2017.txt · Last modified: 2018/07/09 18:14 by acrawford