Fine-Tuning Model Transformation: Change Propagation in Context of Consistency, Completeness, and Human Guidance.
by Alexander Egyed, Andreas Demuth, Achraf Ghabi, Roberto E. Lopez-Herrejon, Patrick Mäder, Alexander Nöhrer, Alexander Reder
Abstract:
An important role of model transformation is in exchanging modeling information among diverse modeling languages. However, while a model is typically constrained by other models, additional information is often necessary to transform said models entirely. This dilemma poses unique challenges for the model transformation community. To counter this problem, we require a smart transformation assistant. Such an assistant should be able to combine information from diverse models, react incrementally to enable transformation as information becomes available, and accept human guidance - from direct queries to understanding the designers' intentions. Such an assistant should embrace variability to explicitly express and constrain uncertainties during transformation - for example, by transforming alternatives (if no unique transformation result is computable) and constraining these alternatives during subsequent modeling. We would want this smart assistant to optimize how it seeks guidance, perhaps by asking the most beneficial questions first while avoiding asking questions at inappropriate times. Finally, we would want to ensure that such an assistant produces correct transformation results despite the presence of inconsistencies. Inconsistencies are often tolerated, yet we have to understand that their presence may inadvertently trigger erroneous transformations, thus requiring backtracking and/or sandboxing of transformation results. This paper explores these and other issues concerning model transformation and sketches challenges and opportunities.
Reference:
Fine-Tuning Model Transformation: Change Propagation in Context of Consistency, Completeness, and Human Guidance. (Alexander Egyed, Andreas Demuth, Achraf Ghabi, Roberto E. Lopez-Herrejon, Patrick Mäder, Alexander Nöhrer, Alexander Reder), In Proceedings of the 4th International Conference on Theory and Practice of Model Transformations (ICMT 2011), Zürich, Switzerland (Jordi Cabot, Eelco Visser, eds.), Springer, Lecture Notes in Computer Science, volume 6707, pages 1-14, 2011.
Bibtex Entry:
@Conference{DBLP:conf/icmt/EgyedDGLMNR11,
  author    = {Alexander Egyed and Andreas Demuth and Achraf Ghabi and Roberto E. Lopez{-}Herrejon and Patrick Mäder and Alexander Nöhrer and Alexander Reder},
  title     = {Fine-Tuning Model Transformation: Change Propagation in Context of Consistency, Completeness, and Human Guidance.},
  booktitle = {Proceedings of the 4th International Conference on Theory and Practice of Model Transformations (ICMT 2011), Zürich, Switzerland},
  year      = {2011},
  editor    = {Jordi Cabot and Eelco Visser},
  volume    = {6707},
  series    = {Lecture Notes in Computer Science},
  pages     = {1--14},
  publisher = {Springer},
  abstract  = {An important role of model transformation is in exchanging modeling
	information among diverse modeling languages. However, while a model
	is typically constrained by other models, additional information
	is often necessary to transform said models entirely. This dilemma
	poses unique challenges for the model transformation community. To
	counter this problem we require a smart transformation assistant.
	Such an assistant should be able to combine information from diverse
	models, react incrementally to enable transformation as information
	becomes available, and accept human guidance - from direct queries
	to understanding the designers' intentions. Such an assistant should
	embrace variability to explicitly express and constrain uncertainties
	during transformation - for example, by transforming alternatives
	(if no unique transformation result is computable) and constraining
	these alternatives during subsequent modeling. We would want this
	smart assistant to optimize how it seeks guidance, perhaps by asking
	the most beneficial questions first while avoiding asking questions
	at inappropriate times. Finally, we would want to ensure that such
	an assistant produces correct transformation results despite the
	presence of inconsistencies. Inconsistencies are often tolerated,
	yet we have to understand that their presence may inadvertently trigger
	erroneous transformations, thus requiring backtracking and/or sandboxing
	of transformation results. This paper explores these and other issues
	concerning model transformation and sketches challenges and opportunities.},
  doi       = {10.1007/978-3-642-21732-6_1},
  file      = {:Conferences\\ICMT 2011 - Fine-Turing Model Transformation\\Fine-Tuning Model Transformation-preprint.pdf:PDF},
  keywords  = {FWF P23115, FWF M1268, EU IEF 254965},
  slides    = {Fine-Tuning Model Transformation:Conferences\\ICMT 2011 - Fine-Turing Model Transformation\\Egyed Keynote ICMT - Fine-Tuning Transformation.pdf},
}