====== Model Interchange Wiki ======

Welcome to the Model Interchange Wiki, the public portal for the Model Interchange Working Group (MIWG). The MIWG is chartered to:

  * demonstrate model interchange among MOF-based tools that implement modeling languages such as UML, SysML and UPDM and use XMI as the interchange standard
  * identify and resolve interchange issues associated with specifications and vendor implementations
  * establish a demonstration infrastructure to support the above, including validation tools, demonstration processes and guidelines

Since December 2008, the MIWG has defined a [[start#Test Suite|test suite]] of test cases (currently 48) to demonstrate interchange of UML, SysML, SoaML and UPDM models between different modeling tools. Participating tool vendors have agreed to publicly post XMI exports from their tools for the MIWG test cases. The objective is to enable the public at large to assess the model interchange capability of the modeling tools by comparing the vendor XMI exports to the expected reference XMI file for each test case. These assessments may be used for a variety of purposes, including:

  * Evaluation of the interchange capability of a particular tool as part of a tool selection process
  * Assessment of the model interchange capabilities and limitations of a tool or tools, to set expectations and to develop interchange strategies for the use of the tools on a project

The test suite and guidance for how to assess interchange capability using it are summarized below. Please send any questions or requests for further information to [[[email protected]]].

====== Quick Links ======

  * [[start#Test Suite]]
  * [[http://svn.omg.org/repos/OMG-Model-Interchange/branches/Public/Tests/|Vendor Test Submission Repository]] (Username: guest / Password: guest) (see also [[start#How to Use the Vendor-Provided Test Submissions]])
  * [[http://validator.omg.org/se-interop/tools/validator|NIST Validator]] (see also [[start#How to Use the NIST Validator to Assess Model Interchange]])
  * {{MIWG-roadmap-120116-reva-draft-sf.ppt|Roadmap}}
  * [[MIWGInternal|MIWG Internal Wiki]]

====== Press Releases ======

  * [[http://www.omg.org/news/releases/pr2011/12-01-11.htm|UML/SysML Tool Vendor Model Interchange Test Case Results Now Available, December 1, 2011]]
  * [[MIWG Update|MIWG Update presented at OMG meeting in Arlington, VA, March 23, 2010]]
  * [[InteropDemo1|OMG Conducts Model Interoperability Demonstration at Long Beach CA, December 7, 2009]]
  * [[http://www.omg.org/news/releases/pr2009/07-08-09.htm|OMG Announces Model Interoperability Working Group, July 8, 2009]]

====== Participating Tool Vendors ======

^ Vendor ^ Point of Contact ^ Tool ^
| PTC (Atego) | Simon Moore | Artisan<sup>®</sup> Studio |
| IBM | | RSx |
| IBM/Sodius | Eldad Palachi/Mickael Albert | IBM Rhapsody |
| No Magic | Nerijus Jankevicius | MagicDraw |
| SOFTEAM | Etienne Brosse | Modelio |
| Sparx Systems | J. D. Baker | Enterprise Architect |

====== Other Participants ======

  * Peter Denno, NIST (Validator)
  * Sandy Friedenthal, SAF Consulting
  * Leonard Levine, DISA
  * Pete Rivett, Adaptive
  * Ed Seidewitz, Model Driven Solutions (Model Interchange SIG Chair)

====== Test Suite ======

The MIWG Test Suite currently consists of 48 test cases. Thirty of these are for UML 2.4.1 (twenty-five have UML 2.3 versions), some of which are also applicable to SysML; thirteen are for SysML 1.3 (ten have SysML 1.2 versions); one is for SoaML 1.0.1; and four are for UPDM 2.0.1.
These test cases cover approximately 94% of UML metaclasses (for UML 2.4.1) and 100% of SysML stereotypes (for SysML 1.3, not including deprecated stereotypes). (For a spreadsheet detailing this coverage, click {{uml-and-sysml-coverage-reports-updated-2.xlsx|here}}.) Each test case consists of one or more diagrams and a corresponding reference "valid XMI" file for the model represented in the diagrams (for Test Cases 3 and 19 there are two XMI files). All XMI conforms to either v2.1 of the XMI specification (for UML 2.3, SysML 1.2, SoaML 1.0.1 and UPDM 2.0.1) or v2.4.1 (for UML 2.4.1 and SysML 1.3).
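
When assessing an export, it helps to know which XMI version the file declares, so that it is compared against the matching reference (v2.1 or v2.4.1). XMI documents normally carry an ''xmi:version'' attribute on their root element. The following is a minimal illustrative sketch, not part of the MIWG or NIST tooling, and the file name is a placeholder.

<code python>
# Illustrative sketch (not MIWG tooling): print the XMI version declared on the
# root element of an exported file. The file name below is a placeholder.
import xml.etree.ElementTree as ET

def declared_xmi_version(path):
    root = ET.parse(path).getroot()
    # Heuristic: the version attribute is namespaced with the XMI namespace,
    # whose URI differs between XMI versions.
    for name, value in root.attrib.items():
        if name.endswith("}version") and "XMI" in name:
            return value
    return None

if __name__ == "__main__":
    # Prints the declared version string, if any.
    print(declared_xmi_version("our-export.xmi"))
</code>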

===== UML Test Cases =====

  * Test Case 1 ([[Test Case 1 UML 2.4|UML 2.4.1]]) ([[Test Case 1 UML 2.3|UML 2.3]]) - Simple Class Model
  * Test Case 2 ([[Test Case 2 UML 2.4|UML 2.4.1]]) ([[Test Case 2 UML 2.3|UML 2.3]]) - Advanced Class Model
  * Test Case 3 ([[Test Case 3 UML 2.4|UML 2.4.1]]) ([[Test Case 3 UML 2.3|UML 2.3]]) - Definition and Application of Profile*
  * Test Case 4 ([[Test Case 4 UML 2.4|UML 2.4.1]]) ([[Test Case 4 UML 2.3|UML 2.3]]) - Simple Activity Model (fUML subset - executable)*
  * Test Case 5 ([[Test Case 5 UML 2.4|UML 2.4.1]]) ([[Test Case 5 UML 2.3|UML 2.3]]) - Advanced Activity Model (fUML subset - executable)*
  * Test Case 6 ([[Test Case 6 UML 2.4|UML 2.4.1]]) ([[Test Case 6 UML 2.3|UML 2.3]]) - Composite Structure
  * Test Case 7 ([[Test Case 7 UML 2.4|UML 2.4.1]]) ([[Test Case 7 UML 2.3|UML 2.3]]) - State Machines*
  * Test Case 8 ([[Test Case 8 UML 2.4|UML 2.4.1]]) ([[Test Case 8 UML 2.3|UML 2.3]]) - Use Cases*
  * Test Case 9 ([[Test Case 9 UML 2.4|UML 2.4.1]]) ([[Test Case 9 UML 2.3|UML 2.3]]) - Interactions*
  * Test Case 12b ([[Test Case 12b UML 2.4|UML 2.4.1]]) ([[Test Case 12b UML 2.3|UML 2.3]]) - Activity Swim Lanes
  * Test Case 13 ([[Test Case 13 UML 2.4|UML 2.4.1]]) ([[Test Case 13 UML 2.3|UML 2.3]]) - Instance Specifications
  * Test Case 15 ([[Test Case 15 UML 2.4|UML 2.4.1]]) ([[Test Case 15 UML 2.3|UML 2.3]]) - Structured Activity Nodes
  * Test Case 17b ([[Test Case 17b UML 2.4|UML 2.4.1]]) ([[Test Case 17b UML 2.3|UML 2.3]]) - Collaborations
  * Test Case 19 ([[Test Case 19 UML 2.4|UML 2.4.1]]) ([[Test Case 19 UML 2.3|UML 2.3]]) - Simple Model Federation
  * Test Case 23 ([[Test Case 23 UML 2.4|UML 2.4.1]]) ([[Test Case 23 UML 2.3|UML 2.3]]) - Components and Component Realization
  * Test Case 24 ([[Test Case 24 UML 2.4|UML 2.4.1]]) ([[Test Case 24 UML 2.3|UML 2.3]]) - Components: Ball-and-Socket Notation
  * Test Case 25 ([[Test Case 25 UML 2.4|UML 2.4.1]]) ([[Test Case 25 UML 2.3|UML 2.3]]) - Deployments
  * Test Case 26 ([[Test Case 26 UML 2.4|UML 2.4.1]]) ([[Test Case 26 UML 2.3|UML 2.3]]) - Classifier Templates
  * Test Case 27 ([[Test Case 27 UML 2.4|UML 2.4.1]]) ([[Test Case 27 UML 2.3|UML 2.3]]) - Activities: Data Store Related
  * Test Case 28 ([[Test Case 28 UML 2.4|UML 2.4.1]]) ([[Test Case 28 UML 2.3|UML 2.3]]) - Parameter Sets
  * Test Case 29 ([[Test Case 29 UML 2.4|UML 2.4.1]]) ([[Test Case 29 UML 2.3|UML 2.3]]) - Additional Invocation-Related Actions
  * Test Case 30 ([[Test Case 30 UML 2.4|UML 2.4.1]]) ([[Test Case 30 UML 2.3|UML 2.3]]) - Sequenced Actions
  * Test Case 31 ([[Test Case 31 UML 2.4|UML 2.4.1]]) ([[Test Case 31 UML 2.3|UML 2.3]]) - Variable-Related Actions
  * Test Case 32 ([[Test Case 32 UML 2.4|UML 2.4.1]]) ([[Test Case 32 UML 2.3|UML 2.3]]) - Link-Object-Related Actions
  * Test Case 33 ([[Test Case 33 UML 2.4|UML 2.4.1]]) ([[Test Case 33 UML 2.3|UML 2.3]]) - Classification and Reduction Actions
  * Test Case 34 ([[Test Case 34 UML 2.4|UML 2.4.1]]) - Interactions: Fragments
  * Test Case 44 ([[Test Case 44 UML 2.4|UML 2.4.1]]) - Interactions: Interaction Use
  * Test Case 45 ([[Test Case 45 UML 2.4|UML 2.4.1]]) - Timing
  * Test Case 46 ([[Test Case 46 UML 2.4|UML 2.4.1]]) - Imports and Dependencies
  * Test Case 47 ([[Test Case 47 UML 2.4|UML 2.4.1]]) - Information Flows

(* These test cases are also applicable to SysML.)

===== SysML Test Cases =====

  * Test Case 10 ([[Test Case 10 SysML 1.3|SysML 1.3]]) ([[Test Case 10 SysML 1.2|SysML 1.2]]) - Simple SysML Structure (BDDs and IBDs)
  * Test Case 11 ([[Test Case 11 SysML 1.3|SysML 1.3]]) ([[Test Case 11 SysML 1.2|SysML 1.2]]) - Requirements
  * Test Case 12a ([[Test Case 12a SysML 1.3|SysML 1.3]]) ([[Test Case 12a SysML 1.2|SysML 1.2]]) - Activity Swim Lanes
  * Test Case 14 ([[Test Case 14 SysML 1.3|SysML 1.3]]) ([[Test Case 14 SysML 1.2|SysML 1.2]]) - Parametrics (including Value Types with Units)
  * Test Case 16 ([[Test Case 16 SysML 1.3|SysML 1.3]]) ([[Test Case 16 SysML 1.2|SysML 1.2]]) - Allocations
  * Test Case 35 ([[Test Case 35 SysML 1.3|SysML 1.3]]) ([[Test Case 35 SysML 1.2|SysML 1.2]]) - Views and Viewpoints
  * Test Case 36 ([[Test Case 36 SysML 1.3|SysML 1.3]]) ([[Test Case 36 SysML 1.2|SysML 1.2]]) - Blocks and Properties (Advanced)
  * Test Case 37 ([[Test Case 37 SysML 1.3|SysML 1.3]]) ([[Test Case 37 SysML 1.2|SysML 1.2]]) - Requirements (Advanced)
  * Test Case 38 ([[Test Case 38 SysML 1.3|SysML 1.3]]) ([[Test Case 38 SysML 1.2|SysML 1.2]]) - Activities (Advanced)
  * Test Case 39 ([[Test Case 39 SysML 1.3|SysML 1.3]]) ([[Test Case 39 SysML 1.2|SysML 1.2]]) - Allocate Activity Partition
  * Test Case 48 ([[Test Case 48 SysML 1.3|SysML 1.3]]) - Full Ports
  * Test Case 49 ([[Test Case 49 SysML 1.3|SysML 1.3]]) - Nested Ports
  * Test Case 50 ([[Test Case 50 SysML 1.3|SysML 1.3]]) - Directed Features

===== SoaML Test Cases =====

  * Test Case 17a ([[Test Case 17a SoaML 1.0.1|SoaML 1.0.1]]) - Service Architecture

===== UPDM Test Cases =====

  * Test Case 18 ([[Test Case 18 UPDM 2.0.1|UPDM 2.0.1]]) - OV-2 Performer
  * Test Case 20 ([[Test Case 20 UPDM 2.0.1|UPDM 2.0.1]]) - CV-1 Architectural Description
  * Test Case 21 ([[Test Case 21 UPDM 2.0.1|UPDM 2.0.1]]) - CV-2 Capability Taxonomy
  * Test Case 22 ([[Test Case 22 UPDM 2.0.1|UPDM 2.0.1]]) - CV-4 Capability Dependencies
  * Test Case 40 - Reserved for future use
  * Test Case 41 - Reserved for future use
  * Test Case 42 - Reserved for future use
  * Test Case 43 - Reserved for future use

====== How to Use the Vendor-Provided Test Submissions ======

To exercise a MIWG test case, a tool vendor creates a model in their tool by reproducing the diagrams for the test case. The vendor then uses the capabilities of the tool to export the model to an XMI file, which is submitted as the result of the test for that tool. Each of the participating MIWG vendors has submitted results from the latest version of their tools for all the test cases that their tools support. These test submissions are maintained in a Subversion repository and may be updated from time to time to reflect new releases of the vendor tools. They can be used in two ways:

  - You can download a vendor submission for a test case from this repository at any time and run it through the [[http://syseng.nist.gov/se-interop/sysml/validator|NIST Validator]] (as described below) in order to assess XMI conformance of the vendor's tool for this test case.
  - You can take the submitted XMI for a test case exported from one tool and attempt to import it into a different tool, in order to assess the actual ability to interchange models between those tools in the area covered by the test case.

The public vendor test submission repository is available [[https://dev.enterprisecomponent.com/repository/repos/OMG-Model-Interchange/branches/Public/Tests/|here]]. Log in using the user name //guest// with password //guest//. The repository has two directories, UML2.3-XMI2.1 for UML tests and SysML1.2-XMI2.1 for SysML tests. Within these directories, there are subdirectories for each test case, with the following content:

  * **Test-Case-N/** -- where N is the number (1, 2, 3, ...) of the test case
    * **diagram.png** -- the test case diagram
    * **valid.xmi** -- valid XMI for the test case
    * **valid-canonical.xmi** -- canonical XMI version of the valid XMI for the test case
    * **Submissions/** -- a directory for vendor upload of results for Test-Case-N
      * **toolnameX/** -- a directory for results from a specific vendor's tool (note that the tool name includes version information)
        * **our-diagram.png** -- screen shot of the vendor's implementation of diagram.png
        * **our-export.xmi** -- the vendor's XMI corresponding to its implementation of diagram.png
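
The reference files and vendor submissions can also be fetched programmatically. The sketch below is purely illustrative and is not part of the MIWG infrastructure: it assumes that the repository serves these files over HTTPS with basic authentication (guest/guest), and the test-case and tool directory names are placeholders following the layout above.

<code python>
# Illustrative sketch only (not MIWG tooling): fetch the reference XMI and one
# vendor submission for a test case, assuming the public Subversion repository
# serves files over HTTPS with basic authentication (guest/guest).
# The test-case and tool directory names below are placeholders.
import urllib.request

BASE = ("https://dev.enterprisecomponent.com/repository/repos/"
        "OMG-Model-Interchange/branches/Public/Tests/UML2.3-XMI2.1")

def fetch(relative_path, out_file):
    # Build an opener that sends the guest/guest credentials.
    password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    password_mgr.add_password(None, BASE, "guest", "guest")
    opener = urllib.request.build_opener(
        urllib.request.HTTPBasicAuthHandler(password_mgr))
    with opener.open(f"{BASE}/{relative_path}") as response, \
            open(out_file, "wb") as out:
        out.write(response.read())

if __name__ == "__main__":
    fetch("Test-Case-1/valid.xmi", "valid.xmi")                # reference XMI
    fetch("Test-Case-1/Submissions/toolnameX/our-export.xmi",  # a vendor export
          "our-export.xmi")
</code>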

====== How to Use the NIST Validator to Assess Model Interchange ======

You can use the Validator tool from NIST in order to assess the XMI exported from a tool against a specific MIWG test case. For a participating MIWG vendor, you can download submitted test results from the public MIWG repository, as discussed [[start#How to Use the Vendor-Provided Test Submissions|above]]. For other vendors, you can request that they generate appropriate test results for you to assess. (Or you can encourage them to participate in the MIWG!) To assess model interchange using the Validator, do the following:

  - Go to the Validator Web page [[http://validator.omg.org/se-interop/tools/validator|here]].
  - Enter the path for the XMI file to be assessed, as exported from the tool being tested.
  - Select the MIWG test case to which it is to be compared.
  - Click on "Upload and Process".

The Validator will then report both on the correctness of the submitted XMI file as a representation of a UML or SysML model and on how it compares with the valid XMI for the test case. Technically, the results from the Validator do not assess interoperability directly. Instead, they assess the conformance of the export from the tool to XMI and the other OMG standards used. However, it is such conformance -- that is, the correct implementation of these standards -- that provides the basis for interoperability.
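
As a rough local first look before (or in addition to) uploading an export to the Validator, you can compare which kinds of model elements it contains against the reference ''valid.xmi'' for the test case. The sketch below is illustrative only and is no substitute for the Validator's conformance checks; the file names are the ones used in the submission repository.

<code python>
# Rough local check only -- it does not replace the NIST Validator. It counts
# elements by their xmi:type (falling back to the XML tag) in an export and in
# the reference valid.xmi, and prints the types whose counts differ.
import xml.etree.ElementTree as ET
from collections import Counter

def element_counts(path):
    counts = Counter()
    for elem in ET.parse(path).getroot().iter():
        # Prefer the (namespaced) xmi:type attribute; fall back to the tag name.
        xmi_type = next((v for k, v in elem.attrib.items()
                         if k.endswith("}type")), elem.tag)
        counts[xmi_type] += 1
    return counts

if __name__ == "__main__":
    reference = element_counts("valid.xmi")
    export = element_counts("our-export.xmi")
    for name in sorted(set(reference) | set(export)):
        if reference[name] != export[name]:
            print(f"{name}: reference={reference[name]}, export={export[name]}")
</code>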

===== Special Instructions for Test Case 3 =====

Unlike other test cases, Test Case 3 has two XMI files. One contains a profile and the other contains a model with that profile applied. A test submission for Test Case 3 should therefore also have two corresponding exported XMI files, each of which may be separately assessed against the appropriate valid XMI file using the Validator. However, in order for the Validator to properly handle the profile application, the exported XMI for the Test Case 3 profile has to be "loaded" //before// the exported XMI for the Test Case 3 model is assessed. Do this as follows:

  - Go to the Validator "Load Profiles" page [[http://validator.omg.org/se-interop/tools/load-profiles|here]].
  - Enter //%%valid.profile.xmi%%// in the field "Reference using this URI".
  - Enter the path for the exported XMI file for the Test Case 3 profile.
  - Click on "Upload and Process".

Once this is done, you can then process the Test Case 3 model XMI as described previously.

===== Notes on XMI Comparison and Canonical XMI =====

For the XMI comparison, the Validator uses the //canonical XMI// version of the valid XMI for the test case. The XMI standard allows a number of different options and variability in how a model may be represented in XMI (for example, whether a property is represented as an XML attribute or element, or how XMI IDs are formed). Canonical XMI is an additional conformance point of the XMI specification that eliminates this variability in generating the XMI for a model. So, there is only one way in which any model may be correctly represented in canonical XMI. This makes it much simpler to test whether two canonical XMI files represent the same model and, if not, what their differences are. At this time, the tools that import canonical XMI include Atego Artisan Studio and IBM RSx. No tools export canonical XMI directly.
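
Because Canonical XMI fixes these serialization choices, two canonical XMI files that represent the same model should match closely, and remaining differences point to differences in the models themselves. The sketch below is a minimal illustration (not MIWG or NIST tooling) that textually diffs the reference ''valid-canonical.xmi'' against another canonical XMI file; the second file name is a placeholder for a canonical rendering of a tool export, however it was obtained.

<code python>
# Minimal sketch (not MIWG tooling): textual diff of two Canonical XMI files.
# Because Canonical XMI removes serialization variability, remaining differences
# should reflect differences in the models themselves. The second file name is a
# placeholder for a canonical rendering of a tool export, however obtained.
import difflib

def diff_canonical(reference_path, other_path):
    with open(reference_path, encoding="utf-8") as ref_file:
        reference = ref_file.readlines()
    with open(other_path, encoding="utf-8") as other_file:
        other = other_file.readlines()
    return difflib.unified_diff(reference, other,
                                fromfile=reference_path, tofile=other_path)

if __name__ == "__main__":
    for line in diff_canonical("valid-canonical.xmi", "export-canonical.xmi"):
        print(line, end="")
</code>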

====== Interoperability Reports for Tool Users ======

Currently, the NIST Validator tools are targeted for use by the tool vendor participants of the MIWG -- the tools are focused on helping the vendors fix problems in interoperability. Due to this focus, it isn't always easy for the typical tool user to determine, through use of the Validator, the prospects for interoperability of the tools he or she is considering. To address the needs of typical tool users, we have begun development of a new method of reporting results from interoperability testing. The new method will reduce the effort required to run tests and will provide a very different perspective on the results of testing. In the new method, a user will select a tool from among those participating in the MIWG and receive a report summarizing the interoperability concerns identified in the testing of that tool. The report, which will be in a spreadsheet format, will provide a page of information for each of the (currently) 16 tests of the MIWG test suite. Each page will provide the following information:

  * A description of the test domain (e.g. class diagrams, composite structure, etc.).
  * An enumeration of the objects and their properties covered in the test case.
  * A figure depicting the diagram on which the test is based.
  * A description of the interoperability concerns encountered, including:
    - A description of the concern, intended to be intelligible to the typical tool user.
    - Identification of the object types on which the problem is occurring.
    - An assessment of the severity of the concern ("quite significant", "moderately significant", "minor").
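
To make the intended content of these report pages concrete, the following is a purely illustrative sketch of how one page could be represented as a simple data structure. The class and field names are ours, not part of any published MIWG report format.

<code python>
# Purely illustrative: a simple data structure mirroring the information that
# each page of the planned interoperability report is intended to carry.
# The class and field names are illustrative, not a published MIWG format.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Concern:
    description: str          # explanation intended for the typical tool user
    object_types: List[str]   # object types on which the problem occurs
    severity: str             # "quite significant", "moderately significant" or "minor"

@dataclass
class TestReportPage:
    test_domain: str             # e.g. "Class diagrams", "Composite structure"
    covered_elements: List[str]  # objects and properties covered in the test case
    diagram_file: str            # figure depicting the diagram the test is based on
    concerns: List[Concern] = field(default_factory=list)
</code>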