::#''model analysis'': DBMS models, UML data model
:*<b>The Individual model</b> ''(abstract models)''. Reference [P74-04] reports on the first version of the Individual model (a variant of the ER model), which was the main component of the MERISE methodology. This model emerged from heated discussions in a French-Belgian think tank [P74-03]. Hubert Tardieu, the architect of the MERISE methodology, was a member of the team (though not a co-author of [P74-03], for reasons I can no longer remember).
:*<b>The IDA Entity-relationship model</b> ''(abstract models)''. In 1983, François Bodart and Yves Pigneur published a book describing the IDA methodology, comprising models and design methods for various aspects of information system conceptual design [B83]. One of its models (called the Entity-relationship model, though significantly different from P. Chen's model) was devoted to the specification of information structures. A second book addressed logical database design [B86].
:*<b>The GAM</b> ''(abstract models)''. The Generalized Access Model derived from the early work on technology-independent logical models [P74-01]. This binary model was the basis of three research lines: (1) database performance analysis [P76-02] [P77-01], (2) a first study of schema transformations [P81-02] and (3) a logical design methodology [B86].
:*<b>The GER model</b> ''(abstract models)''. The Generic Entity-relationship model (GER) is a wide-spectrum information/data structure specification model. It encompasses the main concepts and constructs of the most popular modeling formalisms, be they value-based or object-based. It has been given a precise semantics via an extended version of the NF2 (non-first normal form, or nested) relational model [P89-1] [P96-10]. Through a specialization mechanism, such usual models as the Entity-relationship model, UML class diagrams and ORM can be rigorously specified and compared [P90-01]. Similarly, the GER can be used to define standard data models such as the relational, object-relational, CODASYL, IMS, XML or plain file structure models. The GER model has been used to study transformation-based database engineering processes [P06-10].
:*<b>Relational model (theory)</b> ''(abstract models)''. Some important theoretical aspects of the relational model, in particular the normalization process, are developed in books [B09] and [B07]. The emphasis is on concepts and techniques that practitioners can apply to solve database design problems.
:*<b>The SPHINX data models</b> ''(DBMS models)''. The SPHINX DBMS was based on a hierarchy of two data models. The first one would be qualified, in current terminology, as <i>conceptual</i> [P74-02] and the second one as <i>logical</i> [P74-01], though both were technology independent (I must confess that the titles of the papers are misleading!). The concept of a technology-independent (PIM in the MDE vocabulary) logical model was popular in the seventies. This idea was later reused in the GAM, GER and DB-MAIN models. The SPHINX system and its data models are described in project reports [R78-01] to [R78-05].
:*<b>The NDBS data model</b> ''(DBMS models)''. NDBS (Network Database System, 1986-1996) was an educational database management environment allowing Turbo Pascal programs to manage and use complex data in an efficient, though very intuitive, way. The NDBS data model was a variant of the ER model that offered flat entity types (with simple attributes), single-component identifiers and one-to-many relationship types [NDBS-86].
:*<b>Virtual data models (wrappers)</b> ''(DBMS models)''. Several architectures have been based on the concept of ''wrapper''. A wrapper is a software component attached to a data source (file, spreadsheet, database, web page) that provides its users (typically application programs) with a data model different from that of the data source. A wrapper thus forms a virtual DBMS offering a virtual data model as well as a virtual data manipulation language (to some extent, JDBC, ODBC and ADO are generic wrappers). Wrappers have been used as natural interfaces to network databases [P81-01], as building blocks in federated databases [P01-03] and in an evolutionary migration architecture allowing easy conversion of application programs [P08-05]. Normally, a wrapper simulates a new technology on top of a legacy data manager [P01-03]. In the latter application, ''inverse wrappers'' simulate the legacy technology (typically CODASYL or standard files) on top of an SQL DBMS in order to minimize source code adaptation [P04-04].
:*<b>Decision support data model</b> ''(specific models)''. References [P90-02] and [B94] describe a decision support model coupling a database with a computing model. The data model is a simple Entity-relationship model (just like that of NDBS) in which attributes are either basic or derived. A derived attribute is defined by a derivation expression referencing other attributes of the schema. This application is described in more detail in the theme ''Databases and Computing Models'', which shows that such a model can be implemented as an active database.
:*<b>Temporal data models</b> ''(specific models)''. The DB-MAIN model (conceptual, logical and physical) has been extended to express temporal aspects of data (transaction time, valid time, bi-temporal). A specific methodology has been designed and code generation rules have been implemented for active relational databases [P01-02].
:*<b>The DB-MAIN model</b> ''(models for CASE tools)''. This model is a partial, graphical implementation of the GER. It has been developed for the DB-MAIN CASE environment. A precise definition can be found in the DB-MAIN manuals [T09-01], in database design tutorials [T02-01] [T02-02] and in the textbook [B09].
:*<b>DBMS models</b> ''(model analysis)''. Descriptions of the data models specific to the most popular DBMSs are available in various references: SQL2 and SQL3 in [B09], hierarchical (IMS) in [P09-02] and [B02-02], network (CODASYL DBTG) in [P09-03], [R03-01] and [B02-02]. Elementary knowledge of legacy data models is important in database reverse engineering. Besides, it cannot hurt to show young IT professionals that database technology did not begin with their first ''Ctrl-Alt-Del''. On the contrary, it has a long history, and many seemingly innovative data management features already existed in the seventies (e.g., triggers, predicates, dynamic DML, metadata), and sometimes before!
:*<b>UML data model</b> ''(model analysis)''. UML class diagrams are often proposed to express database schemas. The ability of this formalism to describe conceptual schemas has been studied in references [R02-01] and [B09]. It appears that, by discarding some ill-designed constructs and adding a small number of new ones (such as identifiers and other basic constraints), it is possible to define a variant of UML (called DB-UML) that is well suited to database schemas. DB-UML has also been implemented in the DB-MAIN CASE tool, together with bidirectional global schema transformations to and from the DB-MAIN GER model.
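The derived-attribute mechanism of the decision support model above can be sketched as an active database. The schema and names below are hypothetical illustrations (not taken from [P90-02] or [B94]): a trigger recomputes the derived attribute whenever one of the basic attributes it references changes.

```python
# Sketch of a derived attribute implemented as an active database, using
# SQLite via Python's standard sqlite3 module. The column `total` is derived
# from `qty` and `price`; triggers keep it consistent automatically.
# Hypothetical toy schema, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE order_line (
    id    INTEGER PRIMARY KEY,
    qty   INTEGER NOT NULL,
    price REAL    NOT NULL,
    total REAL               -- derived attribute: qty * price
);
-- Active rules: recompute the derived attribute on insert and on any
-- update of the attributes referenced by its derivation expression.
CREATE TRIGGER line_ins AFTER INSERT ON order_line
BEGIN
    UPDATE order_line SET total = NEW.qty * NEW.price WHERE id = NEW.id;
END;
CREATE TRIGGER line_upd AFTER UPDATE OF qty, price ON order_line
BEGIN
    UPDATE order_line SET total = NEW.qty * NEW.price WHERE id = NEW.id;
END;
""")

conn.execute("INSERT INTO order_line (qty, price) VALUES (3, 2.5)")
total = conn.execute("SELECT total FROM order_line").fetchone()[0]
print(total)  # 7.5
```

Updating `qty` or `price` fires the second trigger, so application programs never compute `total` themselves, which is the point of the active-database implementation.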
:Let's finally mention an interesting proposal by Yannis Tzitzikas to optimize the graphical layout of large ER schemas based on three physical laws (electricity, magnetism, mechanics) [P05-06].
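The physics-based layout idea can be sketched as a force-directed iteration. The sketch below keeps only the classical spring-electrical pair of forces (entity types repel like charges, relationship types pull like springs); the actual algorithm of [P05-06] combines three laws and differs in detail, and the three-node schema is a made-up example.

```python
# Minimal force-directed layout sketch for an ER schema graph.
# Assumptions: toy schema, spring-electrical forces only, fixed step cap
# instead of a cooling schedule.
import math
import random

nodes = ["CUSTOMER", "ORDER", "PRODUCT"]              # entity types
edges = [("CUSTOMER", "ORDER"), ("ORDER", "PRODUCT")]  # relationship types

random.seed(0)
pos = {n: [random.random(), random.random()] for n in nodes}

K = 0.2          # ideal edge length
MAX_MOVE = 0.05  # displacement cap per iteration

for _ in range(100):
    force = {n: [0.0, 0.0] for n in nodes}
    # Electrical repulsion between every pair of entity types.
    for a in nodes:
        for b in nodes:
            if a == b:
                continue
            dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
            d = math.hypot(dx, dy) or 1e-9
            f = K * K / d
            force[a][0] += f * dx / d
            force[a][1] += f * dy / d
    # Spring attraction along relationship types.
    for a, b in edges:
        dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
        d = math.hypot(dx, dy) or 1e-9
        f = d * d / K
        force[a][0] -= f * dx / d; force[a][1] -= f * dy / d
        force[b][0] += f * dx / d; force[b][1] += f * dy / d
    # Move each node a capped step along its resulting force.
    for n in nodes:
        fx, fy = force[n]
        f = math.hypot(fx, fy) or 1e-9
        step = min(f, MAX_MOVE)
        pos[n][0] += fx / f * step
        pos[n][1] += fy / f * step
```

After a few dozen iterations the connected entity types settle at roughly the ideal edge length while unconnected ones are pushed apart, which is what makes large schemas readable.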
*'''Keywords'''
:ER model, Individual model, Merise model, UML class diagrams, wide-spectrum model, GER model, DB-MAIN model, logical data model, temporal model, legacy data model, relational model, network model, hierarchical model, OO model, OR model, XML model, decision support, large schema layout, semantic and statistical aspects of models, IS-A relation
*'''Resources'''
:'''[B09]''' Jean-Luc Hainaut. <u>Bases de données - Concepts, utilisation et développement</u>, Dunod, Collection Sciences Sup, Paris, 2009. [http://info.fundp.ac.be/~dbm/mediawiki/index.php/DUNOD2009 book description]