TOOLS
<Back to the home page>
SQLfast
- SQLfast is a software environment that allows non-expert users to easily create, examine, modify, query and process relational databases through a user-friendly graphical interface. The SQLfast environment also provides a simple but rich programming language intended primarily for casual users with little or no experience in database programming.
- SQLfast is portable, self-contained, self-documented, ready to run and requires no installation. Launching it just requires double-clicking on SQLfast.exe, nothing more. The software, the manuals and the tutorials are available here.
- Latest tutorial updates
- 2017-07-25: Case study: Temporal databases - Part 1 [full text].
- 2017-07-25: Case study: Schema-less databases - Part 3 [full text].
- 2017-07-07: Case study: Schema-less databases - Part 2 [full text].
- 2017-06-30: Case study: Schema-less databases - Part 1 [full text].
- 2017-06-10: Case study: Four hours to save the library [full text].
- 2017-05-17: SQLfast tutorial - Part 1 [full text]
- 2017-04-07: Dictionary of SQLfast commands and parameters [full text]
- 2017-03-05: Case study: Directory management [full text].
- 2016-12-05: Case study: Path finders, rovers and Ariadne's thread [full text]
- Type: High-level database programming interface and software environment for casual and non-expert users
- Description of the SQLfast language
- SQLfast is a language allowing easy database processing and manipulation. The core of the SQLfast language is SQL, the universal language for database definition and manipulation. Any sequence of SQL statements is a valid SQLfast script, whatever its length. SQLfast also comprises statements that provide for interaction with script users, text generation, file manipulation, decision and iteration.
- The main goal of SQLfast is to allow the rapid and easy development of small database applications by casual, non-expert users. It does not compete with standard programming languages, such as C, Java or C#, in which large and complex database applications are written. By automating the most tedious and obscure parts of database programming and user interaction, it allows script writers to concentrate on the problem logic.
- This document (in English) and this one (in French) compare the SQLfast language with standard programming languages for querying a database, interacting with the user and downloading a file. They show that a single SQLfast statement can replace 80 to 100 statements written in Java or Python.
- Despite its simplicity, the SQLfast language includes powerful features such as the full SQL language, variables, GUI, comprehensive control structures (procedures, recursive calls, if-else, for-endfor, while-endwhile, etc.), a versatile text generator, dynamic statement and script execution, external program interface, transactions, automatic metadata management, file management, web access, SMTP protocol, base64 and SHA256 conversion, debugging and much more.
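To make the comparison concrete, here is a minimal sketch, in plain Python with the standard sqlite3 and csv modules, of the kind of boilerplate (connect, create, query, fetch, format, write, close) that a high-level statement is said to hide. The table name and data are invented for illustration and do not come from the SQLfast documents.

```python
import csv
import sqlite3

# Boilerplate that a high-level tool can hide: connect, create,
# query, fetch, format, write, close.  Names and data are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE CUSTOMER (ID INTEGER, NAME TEXT)")
conn.executemany("INSERT INTO CUSTOMER VALUES (?, ?)",
                 [(1, "Dupont"), (2, "Durand")])

# Export the query result to a CSV file.
rows = conn.execute("SELECT ID, NAME FROM CUSTOMER ORDER BY ID").fetchall()
with open("customers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["ID", "NAME"])
    writer.writerows(rows)
conn.close()
```

Every step above is explicit in a general-purpose language; the claim in the documents is that a single SQLfast statement covers the whole query-and-export pipeline.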
- Description of the SQLfast software environment
- The SQLfast environment does not require any installation. Once the SQLfast archive has been downloaded, the SQLfast folder is extracted and copied anywhere, from the desktop to a USB flash drive. SQLfast is self-contained: there is no need to install a database manager or GUI libraries, to compile modules to adapt them to the user's equipment, to create accounts or to define access rights.
- Due to its simplicity, ease of use and power, SQLfast is intended for audiences that have little time and/or knowledge to develop professional programs. In particular, it can be used by:
- non-expert users: SQLfast provides an easy way to write small data processing tools customized to their exact needs, without resorting to complex programming or to powerful but complex software packages.
- non-expert but motivated users: SQLfast is quite appropriate for the development of small and simple single-user applications for, e.g., library management, statistical analysis, simple data mining, data extraction and transformation (ETL), geographic information systems, photo albums, text analysis, still and animated graphical display of data, and so on.
- expert users: SQLfast is a fine environment for fast database application prototyping.
- students: the SQLfast language is basically SQL, but it is also an ideal support for practical work in introductory and advanced database courses. It allows students to play with SQL in a secure, off-line environment and to develop powerful scripts in a few lines, without the burden of learning and using standard programming languages. In addition, the graphical interface provides a user-friendly way to approach database concepts without even using SQL: databases can be built and examined, and their contents can be viewed, queried and modified through direct manipulation.
- teachers: SQLfast includes a powerful tutorial engine that allows instructors to develop and use didactic material. An SQLfast tutorial is a hypertext comprising formatted text, images, SQL and SQLfast code (that can be executed from within the tutorial), and internal and external links. The standard distribution already includes several dozen tutorials in three domains: SQLfast user manuals, SQL learning and database programming. New tutorials can be written and integrated in minutes through the SQLtuto language.
- SQLfast runs on Windows, OS X and Linux machines (only the Windows version is available at present). Its default database manager is SQLite 3, but versions for MySQL and MS Access are being developed.
- Finally, an important feature of the SQLfast environment is its extensibility: new functions and new resources can be developed and integrated quite easily.
- Technical detail: an SQLfast script (or program) is not limited in size: it can include just one statement but also several million statements. The environment is developed in Python 2.7 and Tkinter with an integrated RDBMS (SQLite 3 in the standard distribution). Its main design objectives are to hide the complexity of database programming (connections, cursors, transactions, metadata management, etc.), of graphical interactions and of data exchange with external sources (e.g., files, web data and email), to provide very high-level functions and to offer a secure, user-friendly execution environment. The SQLfast language and environment comprise the following components:
- the full SQL language
- variables in which data coming from a database, external files, computations, GUI or web sites can be stored. Variables are typeless and have no size limit. A variable can contain data of any kind but also script fragments. The SQLfast macro processor can (recursively) replace variable references by their values.
- control structures (if-then-else, for-endfor, while-endwhile, procedure call, error management, libraries, external applications, stop, pause)
- reading/writing in external files, encoding/decoding into secure formats (hexadecimal, base64), secure hashing, encryption
- large object management (text, video, sound, BLOB, geographic objects, charts, drawings)
- generation of complex formats (CSV, XML, HTML, RTF, LaTeX, JSON, key-value, SQL, tuples, etc.). These formats are defined by generic parameters.
- import processors for standard formats (e.g., dbf, csv, Access, ESRI shapefiles)
- a rich extensible GUI:
- elementary boxes: messages, data entry forms, radio buttons, check buttons, text, images, drawing, charts, etc.
- composite dialogue boxes: made of an arbitrary number of elementary boxes
- extensible through Python external libraries
- an integrated graphical engine that renders SQLdraw scripts to produce still and animated drawings (charts, maps, animated simulations, etc.). SQLdraw scripts can be generated or produced manually through the graphical SQLdraw editor.
- extensibility mechanisms:
- technical SQLfast scripts (e.g., parameters setting),
- functional SQLfast procedures (e.g., data loading, data checking, report generation),
- Python external libraries (e.g., string manipulation, file management, geometric algorithms, web access),
- Python external applications (e.g., data import, interactive data conversion, complex dialogues, map digitizer, graphical engine),
- language extension (through script precompilation)
- external programs (web browsers, image processors, media processors, Office suite, etc.)
- a built-in schema database (data dictionary)
- a tutorial engine that can be called from the graphical interface or from within any script
- reflexivity: an SQLfast script can generate, examine, modify and execute SQLfast scripts, including itself.
- development and debugging tools
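The variable mechanism listed above (typeless variables whose references the macro processor can recursively replace by their values, including variables that contain script fragments) can be sketched in a few lines of Python. The `$name$` delimiter syntax is an assumption made for illustration only; the actual SQLfast delimiters are configurable, as the tutorial's Chapter 6 notes.

```python
import re

def substitute(text, env):
    """Repeatedly replace $name$ references by their values, so that a
    variable containing other variable references is expanded in turn.
    Unknown variables are left untouched.  The $...$ delimiters are an
    illustrative assumption, not the real SQLfast syntax."""
    pattern = re.compile(r"\$(\w+)\$")
    while True:
        expanded = pattern.sub(
            lambda m: str(env.get(m.group(1), m.group(0))), text)
        if expanded == text:          # nothing left to substitute
            return expanded
        text = expanded

# A variable may itself contain a variable reference (here, a script
# fragment referring to $table$), expanded recursively.
env = {"table": "CUSTOMER", "query": "SELECT * FROM $table$"}
print(substitute("$query$;", env))    # SELECT * FROM CUSTOMER;
```

The loop terminates as soon as a pass changes nothing, which is what allows a variable's value to be another script fragment that is expanded in turn.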
- Requirements: Windows XP, 7, 8 and 10. All the resources needed by the basic version (built on SQLite 3) are included in the distribution.
- Date: 2012-
- Status: ongoing. Version 2 available since June 2017.
- Availability: From the SQLfast web page.
- References and resources:
- SQLfast environment for Windows. Get it from the SQLfast web page.
- Python 2.7. Not required to execute SQLfast. Must be used to develop extensions to SQLfast. [download page]
- SQLite3 DBMS is included in the SQLfast software.
- Dictionary of SQLfast commands and parameters, 62 pages, PReCISE Research Center publication, draft version, April 7, 2017. [full text]
- Jean-Luc Hainaut. SQLfast: Database processing made easy - Tutorial and Case studies, Volumes 1 and 2, 900 pages, PReCISE Research Center publication, draft version. The first volume is available as a single document, while Volume 2 is available as a series of independent case studies. These chapters are continuously updated, so it is recommended to check them regularly to get their latest versions.
- Both volumes are described below.
- SQLfast Tutorial - Part 1: INTRODUCTION TO DATABASE PROGRAMMING
[get the full text here]
- Chapter 1.
- Chapter contents: What is SQLfast? The target audiences of SQLfast. Installing and starting SQLfast. The levels of the SQLfast interface. Starting SQLfast with the Basic interface. Selecting another level. Starting SQLfast with the Expert interface (Level 1). The example databases: ORDERS.db and CLICOM.db. SQLfast resources and documentation.
- Chapter 2.
- Chapter contents: Preparing the work. Writing and running an SQLfast script. Saving the current script. Loading and modifying a script. Other ways to execute a script. Representing scripts and results in this tutorial. How to write SQLfast statements? Questions on this first script? A second script, just as simple. Using variables - First encounter. Variable’s way of life. Interactive SQLfast scripts. To quote or not to quote. SQLfast sessions. Summary.
- Chapter 3. Creating a database
- Chapter contents: How to create a new database. How (not) to lose data. Creating and loading the ORDERS database. Calling secondary scripts (first look). The schema of the ORDERS.db database. A new skill: writing blank lines. Temporary tables. In-memory databases. Upper case or lower case? Summary.
- Chapter 4.
- Chapter contents: Extending user-driven data selection. Data modification statements: a short reminder. Inserting rows into a table. About single quotes in SQL values. Updating rows of a table. Variables in statement customization. Variables in full statement specification. Improving data entry box layout. Dialogue box for updating values. Summary.
- Chapter 5. SQL in a nutshell
- Chapter contents: Introduction. Creating and modifying the schema of a database. Single table extraction queries. Subqueries. SQL functions. Aggregate (statistical) functions. Joining tables. Set operators. Data grouping. Generalization of the select and from clauses. Data modification. SQL views. Stored procedures and triggers. About SQLite.
- Chapter 6. More on variables
- Chapter contents: Time variables and statements. Measuring elapsed time. Oh! Wait a minute! Variable delimiters - Reminder. We can change the delimiters. Sometimes variable substitution must be disabled. Beware substitution ambiguity. Numeric computation. Character string computation. Set or compute? Multi-variable assignment. Simulating an array. Variables and SQLfast procedures. What is a variable value after all? Summary.
- Chapter 7. Taking decisions
- Chapter contents: Alternative execution of a script. Skipping statements. If-else-endif block structure. Better controlling user values. Conditions. If, goto, labels and variables. Complement: more on goto and label statements. Summary.
- Chapter 8.
- Chapter contents: Can we process individual rows? Forcing a query to return one row only. Can we force a query to return two rows? Queries that return no row. Looping in the database. Embedded loops. A more elegant script. SQLfast arrays revisited. Empty sets, null values and statistics. Summary.
- Chapter 9. Looping - The final chapter
- Chapter contents: Introduction. SQL for loop. Range for loop. File for loop. While loop. Forced loop iteration. Forced loop exit. While (True) loop. Correct loop structure. Variables in loops and if statements. How does a loop work, actually? Summary.
- Chapter 10. SQLfast expressions
- Chapter contents: Introduction. Value of a variable. Binary values. Syntax of SQLfast expressions. Numeric functions. Character string functions. Logical functions. Temporal functions. File/web functions. General functions. User defined functions (UDF). Functions from external libraries. Summary.
- Chapter 11. Output channels, files and directories
- Chapter contents: The SQLfast output window: the default output channel. The (SQLfast) output window. User output files. The standard output file. Output variables. Closing an output channel. Setting the default output channel and output mode. Clearing the output window. Example: playing with output channels. Managing files and directories. Summary.
- Chapter 12.
- Chapter contents: The SQLfast GUI. Standard data entry box. Controlling user action. Update a data set. Text entry box. Selecting one item among several. Selecting items among several. Summary.
- Chapter 13. Displaying data
- Chapter contents: Information message box. Displaying a list of values on the screen. Displaying a text. Displaying a picture. Displaying drawings. Won't you please, please help me. Summary.
- Chapter 14. Advanced dialogues
- Chapter contents: Multicolumn data entry and selection. Logical field grouping. Logical button grouping. No label please! Application: building a 2D menu. Data entry through selection in a predefined value list. Predefined values from the database. Displayed vs returned values. Customizing standard buttons. Composite dialogue boxes. Heterogeneous composite boxes. Of styles and colors. Example of a complex composite box. Summary.
- Chapter 15. Large data objects
- Chapter contents: Beyond elementary data items. Displaying large character strings. Reading and writing (large) text files. Reading and writing large binary files. Storing large objects in a database. About binary values. Binary data encoding/decoding. A touch of cryptography. Where to store large objects. Loading large volumes of data. Summary.
- Chapter 16. Writing data tables - Part 1
- Chapter contents: An in-depth look at output tables. Changing the layout of an output table. CSV output format. Key-Value output format. List output format. Summary.
- Chapter 17. Writing data tables - Part 2
- Chapter contents: XML output format. HTML output format. Tuple output format. RTF output format. SQL output format. LaTeX output format. JSON output format.
- Chapter 18. Importing and exporting data
- Chapter contents: External data import. CSV data import/export. dBase data import. SQL-DML data import/export. JSON data export/import. Summary.
- Chapter 19. Recursive programming
- Chapter contents: Introduction. SQL recursive queries. SQL recursive queries: the Staff application. Recursive queries against cyclic data. Can recursive queries compute everything? Tree traversal ordering. SQL recursive triggers. Loop-based SQLfast implementation of recursive procedures. SQLfast recursive scripts. Comparison and conclusions.
- Chapter 20. Metadata - Walking on the wild side
- Chapter contents: Introduction. The SQLfast dictionary tables. Dictionary or no dictionary? Examining the schema of an unknown database. From metadata to data. Complement: SQL group_concat function. Summary.
- Chapter 21. SQLdraw: vector graphics, writing in progress.
- Chapter 22. Extending SQLfast
- Chapter contents: Introduction. SQLfast utilities. Python external applications and libraries. User-defined functions (UDF). SQLfast language extension. Foreign applications. Developing custom-made external applications and libraries. Developing custom-made UDF for SQLfast and SQL. SQL queries in UDF. Complements: standard external SQLfast function libraries.
- Chapter 23. Aid to SQLfast development, writing in progress.
- Appendix. A basket of examples, writing in progress.
- Appendix contents: Introduction. Consulting customers. Interactive data loading. Comparing the contents of two tables. Displaying the contents of two tables in a single table. Dumping a database in XML format (version 1). Dumping a database in XML format (version 2). Data-driven multiple aggregations (table pivoting). Data mining. Condorcet voting paradox. Paged reading of data. Raising temperature. Managing a file of parameters. Dynamic branching. Simulating arrays in SQLfast. Smart writing of data tables. Checking functional dependencies in a table. Image selection tool. Managing a photo album. A picture viewer. Collecting web statistics. Computing missing data in incomplete statistical series. The Travelling Salesman problem. Ordering, and more...
- SQLfast Tutorial - Part 2: PROBLEM SOLVING WITH DATABASES - CASE STUDIES
- Case 1. Four hours to save the library, draft version, June 10, 2017. [full text]
- Objective: This case study describes the emergency writing of a small application that was to implement the core functions of the management of a small library. The challenge was to replace the current software, lost in a recent crash of the server. All that was left after the accident was the last backup of the database, unfortunately in an unknown format.
- Keywords: rapid application development, application prototyping, application architecture, GUI
- Case 4. Schema-less databases - Part 1, draft version, June 30, 2017. [full text]
- Objective: This document is the first of a series of three case studies that explore alternative data models that organize the data in structures that provide more flexibility than standard relational tables. We observe that this flexibility makes it easier to dynamically modify the schema of the database but, as an unfortunate consequence, the schema is less expressive. In some of these data models, the schema is practically devoid of any information, hence the name schema-less.
- In this study, we explore two extreme data models. In the first, the Universal table model, the data of all the source tables are stored in a single table comprising the union of the columns of the source tables. In the second, the Column-oriented model, each column of the source tables is implemented as an independent table.
- Keywords: non-relational data model, NoSQL, schema-less database, universal table model, universal relation, column-oriented data model, Cassandra, data migration, schema conversion
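The Universal table idea described above can be sketched with a few lines of Python and SQLite. The table and column names below are invented for illustration and are not taken from the case study itself.

```python
import sqlite3

# "Universal table" sketch: rows of two source tables
# CUSTOMER(ID, NAME) and PRODUCT(REF, PRICE) stored in one table
# holding the union of their columns plus the source table name.
# All names are illustrative.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE UNIVERSAL
              (SOURCE TEXT, ID INTEGER, NAME TEXT, REF TEXT, PRICE REAL)""")
db.execute("INSERT INTO UNIVERSAL VALUES ('CUSTOMER', 1, 'Dupont', NULL, NULL)")
db.execute("INSERT INTO UNIVERSAL VALUES ('PRODUCT', NULL, NULL, 'P10', 9.5)")

# A source table is recovered by selecting on SOURCE and projecting
# its own columns; all other columns are NULL for that source.
customers = db.execute(
    "SELECT ID, NAME FROM UNIVERSAL WHERE SOURCE = 'CUSTOMER'").fetchall()
print(customers)   # [(1, 'Dupont')]
```

The flexibility is visible: adding a source table only means adding columns and a new SOURCE value, but the schema itself no longer tells which columns belong together.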
- Case 5. Schema-less databases - Part 2, draft version, July 7, 2017. [full text]
- Objective: This document studies a third family of alternative data models, namely the Key-Value data structure. In these models, the information is represented by triples, each of which defines the value of an attribute of an entity. Several approaches are described, with increasing levels of genericity.
- Keywords: non-relational data model, NoSQL, schema-less database, key-value model, triple, triplestore, RDF, SPARQL, description logic, A-BOX, OWL, Redis, Berkeley DB, data migration, schema conversion
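The triple representation described above can also be sketched in Python with SQLite; entity and attribute names below are invented for illustration, not drawn from the case study.

```python
import sqlite3

# Key-value (triple) sketch: each row (entity, attribute, value)
# encodes one fact.  All names are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE TRIPLE (ENTITY TEXT, ATTRIBUTE TEXT, VALUE TEXT)")
db.executemany("INSERT INTO TRIPLE VALUES (?, ?, ?)", [
    ("customer:1", "name", "Dupont"),
    ("customer:1", "city", "Namur"),
    ("customer:2", "name", "Durand"),
])

# Reconstructing one row of the original CUSTOMER table requires a
# self-join per attribute - the price paid for the schema-less form.
row = db.execute("""
    SELECT n.VALUE, c.VALUE
    FROM TRIPLE n JOIN TRIPLE c ON n.ENTITY = c.ENTITY
    WHERE n.ATTRIBUTE = 'name' AND c.ATTRIBUTE = 'city'
      AND n.ENTITY = 'customer:1'""").fetchone()
print(row)   # ('Dupont', 'Namur')
```

New attributes can be added per entity without any schema change, which is the flexibility the case study trades against query complexity.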
- Case 6. Schema-less databases - Part 3, draft version, July 25, 2017. [full text]
- Objective: This document studies a fourth family of alternative data models, namely the object (or document) models. In these models, the information is represented by complex objects in which properties can be multivalued and composite.
- Keywords: non-relational data model, key-value model, object model, NoSQL, schema-less database, multivalued property, composite property, document-oriented DBMS, MongoDB, CouchDB, Azure, Datastore Oracle, metadata, index, data migration, schema conversion
- Case 9. Temporal databases - Part 1, draft version, July 25, 2017. [full text]
- Objective: In this study we examine various ways to organize the data describing the evolution of a population of entities. The basic model consists in storing the successive states of each entity, completed by the time period during which the state was observable. We distinguish between the transaction time, which refers to the time of data modification in the database, and the valid time, which refers to modification events of entities in the real world. This study particularly develops entity-based, attribute-based and event-based temporal database models. In these models, data management is ensured by triggers that automate entity creation, modification and deletion operations as far as possible.
- The next study will be devoted to temporal database querying.
- Keywords: temporal data type, temporal database, history, entity type, time point, time period, time interval, evolution, event, state, transaction time, valid time, system time, bitemporal data, document-oriented model, JSON, trigger
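A minimal sketch of the entity-based valid-time model evoked above, in Python with SQLite. In the case study the closing and opening of state periods is handled by triggers; here a plain function stands in for them, and all table, column and date values are illustrative.

```python
import sqlite3

# Entity-based valid-time sketch: each row is one state of an
# employee, valid during [V_START, V_END); '9999-12-31' marks the
# current state.  All names and dates are illustrative.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE EMPLOYEE_H
              (ID INTEGER, SALARY REAL, V_START TEXT, V_END TEXT)""")
db.execute("INSERT INTO EMPLOYEE_H VALUES (1, 2000, '2016-01-01', '9999-12-31')")

def update_salary(emp_id, new_salary, on_date):
    # Close the current state, then open a new one.  In the case
    # study this logic is carried by triggers.
    db.execute("""UPDATE EMPLOYEE_H SET V_END = ?
                  WHERE ID = ? AND V_END = '9999-12-31'""", (on_date, emp_id))
    db.execute("INSERT INTO EMPLOYEE_H VALUES (?, ?, ?, '9999-12-31')",
               (emp_id, new_salary, on_date))

update_salary(1, 2200, "2017-07-01")
history = db.execute("""SELECT SALARY, V_START, V_END
                        FROM EMPLOYEE_H ORDER BY V_START""").fetchall()
print(history)
# [(2000.0, '2016-01-01', '2017-07-01'), (2200.0, '2017-07-01', '9999-12-31')]
```

An ordinary update thus becomes an append: the full salary history stays queryable, which is the essence of the temporal models the study develops.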
- Case 14. The book of which you are the hero, draft version, June 2, 2014. [full text]
- Chapter contents: Interactive adventure books and computer games. Choosing a sample game book. Designing a text-based interactive adventure game. Storytelling. Structure of a game. Analyzing a game. Counting the number of paths. Merging episodes. Toward a better game engine.
- Case 15. Directory management, draft version. March 5, 2017. [full text].
- Chapter contents: File and directory management. File and directory hierarchy representation. Tree processing. Recursive queries. Hierarchy evolution. Hierarchy comparison. Searching for file clones.
- Case 21. Test Database Generation, draft version, December 4, 2014. [full text]
- Chapter contents: Database performance evaluation. Generating high volumes of synthetic data. Integrating heterogeneous data sources. Data cleaning. Data anonymization. Random data extraction. Executing very large scripts. Query performance.
- Case 24. Kings of France - Part 1, draft version, October 1, 2014. [full text]
- Chapter contents: Introduction. Data structures. Warming up: some easy queries. Rebuilding royal families. King succession. Seeking missing kings. Royal conflicts.
- Case 25. Kings of France - Part 2, draft version, October 1, 2014. [full text]
- Chapter contents: The ancestor/descendant relationships. Counting descendants. The descendants of a member. Graphical representation of king genealogy - version 1. Graphical representation of king genealogy - version 2. Recovering the source data from their closure. Extracting the hierarchy of kings.
- Case 27. Conway's Game of Life, draft version, June 2, 2014. [full text]
- Chapter contents: Introduction. World and automata representation. Computing the next state. Graphical display of automaton evolution. Generation of SQLdraw scripts. Packaging the LIFE application. First performance analysis. Optimization 1: indexing tables. Optimization 2: windowing table access. Optimization 3: building a cheap world. A last, desperate, and (fortunately) unsuccessful optimization. Putting things together. Some representative automata. Family life. Conclusion.
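The core of this case study, computing the next state of the automaton inside the database, can be sketched as follows: live cells are rows of a table and one aggregate query produces the next generation. The query shape and the "blinker" data are an illustrative reconstruction, not the case study's actual code.

```python
import sqlite3

# Live cells are rows of CELL; one query computes the next
# generation.  Each live cell contributes to the 9 positions of its
# 3x3 neighbourhood; per position, COUNT(*) is "neighbours + self"
# and SUM(dx.D = 0 AND dy.D = 0) tells whether the position itself
# is alive.  The initial pattern is a vertical blinker.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE CELL (X INTEGER, Y INTEGER)")
db.executemany("INSERT INTO CELL VALUES (?, ?)", [(1, 0), (1, 1), (1, 2)])

next_gen = db.execute("""
    SELECT c.X + dx.D AS X, c.Y + dy.D AS Y
    FROM CELL c,
         (SELECT -1 AS D UNION SELECT 0 UNION SELECT 1) dx,
         (SELECT -1 AS D UNION SELECT 0 UNION SELECT 1) dy
    GROUP BY c.X + dx.D, c.Y + dy.D
    -- dead cell with 3 neighbours (code 3), live cell with 2 or 3 (13, 14)
    HAVING SUM(dx.D = 0 AND dy.D = 0) * 10 + COUNT(*) IN (3, 13, 14)
    ORDER BY X, Y""").fetchall()
print(next_gen)   # the blinker flips horizontal
```

Storing only live cells is what makes the "cheap world" optimizations discussed in the chapter possible: the query cost depends on the population, not on the size of the grid.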
- Case 28. From data bulk loading to database book writing, draft version, January 10, 2017. [full text]
- Chapter contents: Introduction. The database loading problem. Representing the table graph of a database. Topological sorting of the table graph. Coping with non-acyclic graphs. Cyclic vs hard cyclic kernel of a table graph. Contracting the circuits of the hard kernel. Finding the independent circuits of a graph. Finding the super circuits of a graph. Building the final acyclic graph. Optimizing the final acyclic graph. Loading data: a complex abstract example. The complete data loading generation procedure. Loading the data of a super circuit. Loading data: a real example. Brute force data loading technique. Performance. Complement: extracting the table graph of a database. Topological sorting: application to book development. Other applications.
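The loading-order problem at the heart of this case study amounts to a topological sort of the table graph: a table can only be loaded after the tables its foreign keys reference. A minimal sketch (Kahn's algorithm) under the assumption of an acyclic graph, with an invented set of foreign-key references:

```python
from collections import deque

# table -> tables it references through foreign keys (invented example)
references = {
    "ORDER_LINE": ["ORDERS", "PRODUCT"],
    "ORDERS": ["CUSTOMER"],
    "PRODUCT": [],
    "CUSTOMER": [],
}

def load_order(references):
    """Kahn's algorithm: emit tables whose referenced tables are all
    already loaded.  Raises on a cyclic graph, which the case study
    handles by contracting circuits first."""
    indeg = {t: len(refs) for t, refs in references.items()}
    dependents = {t: [] for t in references}
    for t, refs in references.items():
        for r in refs:
            dependents[r].append(t)
    queue = deque(sorted(t for t, d in indeg.items() if d == 0))
    order = []
    while queue:
        t = queue.popleft()
        order.append(t)
        for d in sorted(dependents[t]):
            indeg[d] -= 1
            if indeg[d] == 0:
                queue.append(d)
    if len(order) != len(references):
        raise ValueError("cyclic table graph: circuits must be contracted first")
    return order

print(load_order(references))
# ['CUSTOMER', 'PRODUCT', 'ORDERS', 'ORDER_LINE']
```

The chapter's harder material (hard cyclic kernels, circuit contraction, super circuits) is exactly what is needed when this simple sort fails because the graph is not acyclic.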
- Case 31. Path finders, rovers and Ariadne's thread, draft version. December 5, 2016. [full text].
- Chapter contents: Shortest paths between two cities. Dijkstra’s Algorithm. Representation of the graph. Computing the shortest paths from a starting city. Building a general purpose shortest path engine. Ariadne’s thread: solving mazes. Exploring a planet. Performance and optimization.
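Dijkstra's algorithm, central to this case study, can be sketched in a few lines of Python; the cities and distances below are invented for illustration and do not come from the case study's data.

```python
import heapq

# Undirected road graph: city -> [(neighbour, distance)].  Invented data.
graph = {
    "Namur": [("Brussels", 65), ("Liege", 62)],
    "Brussels": [("Namur", 65), ("Ghent", 55)],
    "Liege": [("Namur", 62), ("Ghent", 180)],
    "Ghent": [("Brussels", 55), ("Liege", 180)],
}

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm with a priority queue; returns the total
    distance and the path, or None if goal is unreachable."""
    heap = [(0, start, [start])]
    seen = set()
    while heap:
        dist, city, path = heapq.heappop(heap)
        if city == goal:
            return dist, path
        if city in seen:
            continue
        seen.add(city)
        for nxt, w in graph[city]:
            if nxt not in seen:
                heapq.heappush(heap, (dist + w, nxt, path + [nxt]))
    return None

print(shortest_path(graph, "Namur", "Ghent"))
# (120, ['Namur', 'Brussels', 'Ghent'])
```

In the case study the graph lives in database tables rather than in memory, but the priority-queue logic is the same.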
Case studies in preparation
- Case 2. The human factor (Managing a small library), writing in progress.
- Case 3. Interactive SQLfast interpreters (incl. self-assessment SQL training tutor), writing in progress.
- Case 7. Schema-driven code generation, writing in progress.
- Case 8. Active databases, writing in progress.
- Case 10. Temporal databases - Part 2, writing in progress.
- Case 11. Pivot tables: from columns to rows, writing in progress.
- Case 12. Geographic information systems, writing in progress.
- Case 13. Database security, writing in progress.
- Case 16. Lies, damned lies, and statistics, writing in progress.
- Case 17. Bis repetita non (always) placent (Detecting and controlling redundancies), writing in progress.
- Case 18. Data analysis and cleaning, writing in progress.
- Case 19. Database reverse engineering, writing in progress.
- Case 20. Database prototyping, writing in progress.
- Case 22. Database Performance, writing in progress.
- Case 23. Object/Relational mapping vs Active database, writing in progress.
- Case 26. Text processing, writing in progress.
- Case 29. Bill of material, rewriting in progress.
- Case 30. Classification (Formal Concept Analysis), writing in progress.
- Case 32. Databases and the internet, writing in progress.
- Case 33. Workflow management - Building a workflow engine, writing in progress.
DB-MAIN
- Type: CASE and MetaCASE environment
- Description: Graphical CASE tool that supports most database engineering processes. Provides functions for conceptual information analysis, logical design, physical design, code generation, schema integration, schema normalization, quality evaluation, schema mapping, reverse engineering (for a dozen legacy and modern DBMSs), schema analysis, code analysis (among others, dependency graphs, pattern searching and program slicing), data analysis and schema transformation. High-speed kernel and graphical interface written in C++. New functions can be developed in Voyager 2 (the DB-MAIN external language) or in Java. DB-MAIN includes a method engine and features to extend its meta-schema. It allows the representation of data structures of various paradigms at any level of abstraction. It relies on the GER (Generic Entity-Relationship model) and on transformational operators.
- Requirements: MS Windows, Linux.
- Date: 1993-2014+
- Status: ongoing
- Availability: Free of charge (except for advanced processors). See below
- References and resources:
- DB-MAIN v6 (free, max 500 objects)
- DB-MAIN v8 (free, max 2000 objects)
- DB-MAIN v9 (new interface, free, no limit on the number of objects)
- First Steps with DB-MAIN (short tutorial)
- Introduction to database design with DB-MAIN (long tutorial)
- Visit also the DB-MAIN official site www.db-main.eu or the site of ReveR s.a. www.rever.eu, now in charge of the maintenance and evolution of DB-MAIN.
TRAMIS/Master
- Type: CASE environment
- Description: TRAMIS/Master was an extension of ORGA, distributed by the small French company CONCIS. In particular, the following features were added:
- support for conceptual analysis
- a larger transformation toolbox
- evaluation functions (notably volume estimation)
- the specification database was implemented in MDBS (a CODASYL-like DBMS)
- Requirements: MS-DOS; MS Windows 3.11; MDBS
- Date: 1986-1991
- Status: discontinued; principles integrated into DB-MAIN
- Availability: software no longer available
- References and resources:
- Jean-Luc Hainaut, Mario Cadelli, Bernard Decuyper and Olivier Marchand. TRAMIS: a transformation-based database CASE tool, in Proceedings of the 5th International Conference on Software Engineering and Applications, EC2 Publish., 1992. [description]
- Jean-Luc Hainaut, Mario Cadelli, Bernard Decuyper and Olivier Marchand. Database CASE Tool Architecture: Principles for Flexible Design Strategies, in Proceedings of the 4th International Conference on Advanced Information System Engineering (CAiSE’92), Lecture Notes in Computer Science, Volume 593, pages 187-207, Springer-Verlag, 1992. [description]
ORGA
- Type: CASE environment
- Description: ORGA was a companion CASE tool of the industrial IDA environment developed by the team of professor F. Bodard in the 1980s and distributed (as was ORGA) by METSI, a small French company in information system design. IDA was mainly devoted to the conceptual specification of various aspects of information systems. The goal of ORGA was the translation of conceptual information schemas into DDL code according to three data models, namely relational (SQL-87), CODASYL DDL (IDS2) and standard files (COBOL). In modern terms, ORGA supported the logical design and code generation processes. Three key features of ORGA:
- the architecture of ORGA relied on a transformational toolbox. The conceptual/logical conversion was based on three transformation plans, one for each target data model. In addition, the user was allowed to apply elementary transformations to refine the logical schema. In modern terms, ORGA supported Model-Driven Engineering.
- ORGA was developed in C and the specification database was managed through libraries emulating dBASE III structures [1] [2].
- The methodological principles of ORGA were developed in the (French) book Conception assistée des applications informatiques - Conception de la base de données, published by Masson (now DUNOD) in 1986.
- Requirements: MS-DOS; hard disk (recommended!)
- Date: 1984-1985
- Status: discontinued; principles integrated into TRAMIS/Master then in DB-MAIN
- Availability: software no longer available
- References and resources:
PHENIX
- Type: Reverse Engineering CASE environment
- Description: PHENIX was a CASE tool devoted to the reverse engineering of the file structures of large COBOL applications. It was developed with powerful knowledge management systems (which were very popular in the 1980s and 1990s), a decision which accounted for the poor performance of the code analyzer. In particular, it included:
- a code analyzer that detected program patterns
- a large transformation toolbox
- an inference engine to detect and identify implicit constructs.
- Requirements: Unix, LeLisp, SMECI, Aida/Masai
- Date: 1989-1992
- Status: discontinued; principles integrated into DB-MAIN
- Availability: software no longer available
- References and resources:
- Jean-Luc Hainaut, Catherine Tonneau, Michel Joris and Muriel Chandelon. Transformation-based Database Reverse Engineering, in Proceedings of 12th International Conference on Entity-Relationship Approach (ER’93), Lecture Notes in Computer Science, Volume 823, pages 364-375, Springer-Verlag, 1994. [description]
- Jean-Luc Hainaut, Muriel Chandelon, Catherine Tonneau and Michel Joris. Contribution to a Theory of Database Reverse Engineering, in Proceedings of the 1st Working Conference on Reverse Engineering (WCRE’93), pages 161-170, IEEE Computer Society, 1993. [description]
- Jean-Luc Hainaut, Catherine Tonneau, Michel Joris and Muriel Chandelon. Schema Transformation Techniques for Database Reverse Engineering, in Proceedings of 12th International Conference on the Entity-Relationship Approach (ER’93), pages 353-372, E/R Institute Publish., 1993. [description]
- Michel Joris, R. Hoe, Jean-Luc Hainaut, Muriel Chandelon, Catherine Tonneau and François Bodart. PHENIX: methods and tools for database reverse engineering, in Proceedings of 5th International Conference on Software Engineering and Applications, EC2 Publish., 1992. [description]
- Jean-Luc Hainaut. Database Reverse Engineering, Models, Techniques and Strategies, in Proceedings of the 10th International Conference on the Entity-Relationship Approach (ER’91), pages 729-741, ER Institute, 1991. [description] [full text]
VisiComp
- Type: Graphical computer simulator
- Description: VisiComp (name inspired by VisiCalc) is a graphical simulator for an educational computer. This computer has a very simple architecture: a main memory of up to 100 cells (each accommodating a signed 5-digit number), the standard handful of registers (P-counter, instruction register, logical register, accumulator + its extension register), 37 instructions organized into 7 instruction families, 10 addressing modes, a sequential input device, a sequential output device and a bootstrap loader (a kind of MBR). The simulator also includes a monitor that manages and controls all the parts of the computer (interactive + script interpreter), configuration files, an input file loader, etc. The computer has been used to illustrate architecture and programming principles such as the von Neumann architecture, basic program structures, I/O buffering, recursion, relocatable programs, bootstrapping and even a complete business application (order management and invoicing with 3 files, almost as powerful as SAP!). The simulator and its additional components required no more than 12 KB (12,000 bytes!) of RAM at runtime. It was developed in TRS-80 Basic, then in CP/M Basic.
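The heart of such a simulator is a fetch-decode-execute loop over the cell memory. The sketch below shows the principle with a deliberately tiny, invented instruction set (opcode × 100 + address); it is not VisiComp's actual 37-instruction repertoire.

```python
# Minimal fetch-decode-execute loop in the spirit of an educational machine:
# a 100-cell memory of signed numbers, a program counter and an accumulator.
# Opcodes (0=HALT, 1=LOAD, 2=ADD, 3=STORE) are invented for this sketch.

def run(memory):
    pc, acc = 0, 0
    while True:
        opcode, addr = divmod(memory[pc], 100)   # e.g. 150 -> LOAD cell 50
        pc += 1
        if opcode == 0:                          # HALT
            return acc
        elif opcode == 1:                        # LOAD: acc <- memory[addr]
            acc = memory[addr]
        elif opcode == 2:                        # ADD: acc <- acc + memory[addr]
            acc += memory[addr]
        elif opcode == 3:                        # STORE: memory[addr] <- acc
            memory[addr] = acc

mem = [0] * 100
mem[0:4] = [150, 251, 352, 0]    # LOAD 50, ADD 51, STORE 52, HALT
mem[50], mem[51] = 7, 35
result = run(mem)                # leaves 42 in the accumulator and cell 52
```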
- Requirements: TRS-80 Model 1 - Level 2 (16 KB RAM); CP/M; BASIC language processor;
- Date: 1982
- Status: discontinued
- Availability: Source code available on paper
- References and resources:
- Jean-Luc Hainaut, THE VISIBLE COMPUTER - A graphical computer simulator. Technical report, 1982. 75 pages. [full text] This document results from the scanning of poorly printed pages. It is incomplete and probably includes too many errors. In addition, it will probably be considered fairly naive. Consider that it was written 30 years ago and that computer technology has considerably evolved since then, to such an extent that it has become quite complex and impossible to describe in detail to lay readers.
- Source code available on paper.
- Note: just for fun, I have developed a new graphical interface for VisiComp in Python/Tkinter. It would take no more than 2-3 weeks to add the logic behind it. Anyone interested?
NDBS
- Type: Database Management System
- Description: NDBS (Network Database System) is an educational database management environment allowing Turbo-Pascal programs to manage and use complex data in an efficient, yet very intuitive, way. NDBS comprised a runtime library (database handler), a DDL compiler, a 4GL query language, a data dictionary, a report generator, a SQL/NDBS converter, an import/export tool, etc. As far as we know, the only industrial use was by an Indian statistics department, which developed several application programs based on NDBS. The last version (under the name Pyramid) was developed by D. Rossi, then a Master's student at the University of Namur. Two original aspects: (1) the data model was a variant of the ER model; (2) the physical engine was a fairly strict implementation of the principles developed in the Database Technology course I gave in the 80's.
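Navigational access of the network-model kind can be sketched as follows. This is an invented illustration, not NDBS's actual Turbo-Pascal API: records are linked to an owner through named "sets" and programs traverse them with get-first / get-next primitives, in the style of network database handlers.

```python
# Hedged sketch of navigational (network-model) data access: owner records
# linked to member records through named sets, traversed sequentially.
# All names (Record, get_first, get_next, "places") are illustrative.

class Record:
    def __init__(self, data):
        self.data = data
        self.members = {}          # set name -> ordered list of member records

def get_first(owner, set_name):
    members = owner.members.get(set_name, [])
    return members[0] if members else None

def get_next(owner, set_name, current):
    members = owner.members.get(set_name, [])
    i = members.index(current)
    return members[i + 1] if i + 1 < len(members) else None

# a CUSTOMER record owning two ORDER records through the "places" set
customer = Record({"name": "Dupont"})
o1, o2 = Record({"ref": 1}), Record({"ref": 2})
customer.members["places"] = [o1, o2]

first = get_first(customer, "places")
second = get_next(customer, "places", first)
```

A program scans a set by calling `get_first` once, then `get_next` until it returns `None`, which is the classic navigation loop of CODASYL-style handlers.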
- Requirements: Windows; written in Turbo-Pascal
- Date: 1986-1996
- Status: discontinued
- Availability: source code (but I must search floppy disks for it).
- References and resources:
- [TR86-01]Jean-Luc Hainaut, NDBS - A simple database system for small computers. Technical report, 1986. 144 pages. [full text] This text has been recovered from various materials written in the eighties (Mac Word documents) and roughly assembled into a single FrameMaker document. The result would require in-depth polishing but, to be frank, I currently have no spare time to carry out this task. Sometime in the near future, perhaps!
SQL-Script Interpreter
- Type: Interpreter for an SQL-based scripting language
- Description: An empty Access application (*.mdb) that allows users to execute SQL scripts on any database. It includes a simple dialog box with two buttons: select a script file and execute it. For information, MS Access provides two ways to execute SQL statements: as Access queries (simple, but single queries only) and embedded in VB programs (powerful but complex). This tool allows one to execute in one click a script made up of a sequence of SQL statements, but also much more complex scripts with variables, macros, loops, alternatives and user interaction. This application was frozen in 2012, following the development of SQLfast (see above).
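The core idea, running a whole file of SQL statements in one go, can be shown with SQLite instead of Access/VB. This is an analogy, not the tool itself: Python's `sqlite3` module provides `executescript`, which splits and executes a multi-statement script in sequence.

```python
import sqlite3

# Analogy to the tool's core behavior, using SQLite rather than Access:
# execute a script containing a sequence of SQL statements in one call.
# The script content is a made-up example.

script = """
CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
INSERT INTO customer(name) VALUES ('Dupont');
INSERT INTO customer(name) VALUES ('Durand');
"""

conn = sqlite3.connect(":memory:")
conn.executescript(script)          # runs all three statements in sequence
count = conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
conn.close()
```

In a real tool, the script would be read from the file selected in the dialog box; variables, loops and user interaction require a preprocessing layer on top of this plain sequential execution.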
- Requirements: Microsoft Access
- Date: 2005-2011+
- Status: frozen since 2012
- Availability: available online
- References and resources:
SPHINX
- Type: Database Management System, Natural Language interface, Data dictionary
- Description: SPHINX is a database management system based on the ER model. It was developed in the 70's as a result of a large national project devoted to the development and management of large administrative databases. SPHINX comprised several components:
- a runtime database handler
- a high-level DML language integrated to COBOL
- NUL: an incremental user-oriented query language
- a meta-system (based on SPHINX)
- Requirements: Siemens BS-2000; SESAM data manager (one of the first schema-less inverted-file data managers)
- Date: 1971-1977
- Status: discontinued
- Availability: software no longer available
- References and resources:
- Baudouin Le Charlier and Jean-Luc Hainaut. Modèles, Langages et Systèmes pour la conception et l’exploitation de bases de données, in Actes du congrès AFCET 1978, pages 179-189, Editions Hommes et Techniques, 1978. [description] [full text]
- Claude Deheneffe and Henri Hennebert. NUL: A Navigational User’s Language for a Network Structured Data Base, in Proceedings of the 1976 ACM SIGMOD International Conference on Management of Data, pages 135-142, ACM, 1976. [description] [full text]
- Claude Deheneffe, Henri Hennebert and Walter Paulus. A Relational Model for a Data Base, in Proceedings of the IFIP congress 74, pages 1022-1025, North-Holland, 1974. [description] [full text]
- Jean-Luc Hainaut and Baudouin Le Charlier. An Extensible Semantic Model of Data Base and Its Data language, in Proceedings of the IFIP Congress 74, pages 1026-1030, North-Holland, 1974. [description] [full text]
- [TR78-01]Jean-Luc Hainaut, Baudouin Le Charlier, et al., Système de conception et d'exploitation de bases de données - Volume 1 : Modèles et Langages. Rapport final du projet CIPS I2/15, Institut d'informatique, Université de Namur, 1978. [full text]
- [TR78-02]Jean-Luc Hainaut, Baudouin Le Charlier, et al., Système de conception et d'exploitation de bases de données - Volume 2 : Manuel de référence des langages. Rapport final du projet CIPS I2/15, Institut d'informatique, Université de Namur, 1978. [full text]
- [TR78-03]Jean-Luc Hainaut, Baudouin Le Charlier, et al., Système de conception et d'exploitation de bases de données - Volume 3 : Une implémentation du modèle d'accès. Rapport final du projet CIPS I2/15, Institut d'informatique, Université de Namur, 1978. [full text]
- [TR78-04]Jean-Luc Hainaut, Baudouin Le Charlier, et al., Système de conception et d'exploitation de bases de données - Volume 4 : Le système SPHINX, Utilisation, fonctionnement et description interne. Rapport final du projet CIPS I2/15, Institut d'informatique, Université de Namur, 1978. [full text]
- [TR78-05]Jean-Luc Hainaut, Baudouin Le Charlier, et al., Système de conception et d'exploitation de bases de données - Volume 5 : Exemples d'application. Rapport final du projet CIPS I2/15, Institut d'informatique, Université de Namur, 1978. [full text]
DB-MAIN
- Type: CASE and MetaCASE environment
- Description: Graphical CASE tool that supports most database engineering processes. It provides functions for conceptual information analysis, logical design, physical design, code generation, schema integration, schema normalization, quality evaluation, schema mapping, reverse engineering (for a dozen legacy and modern DBMSs), schema analysis, code analysis (among others, dependency graphs, pattern searching and program slicing), data analysis and schema transformation. The high-speed kernel and graphical interface are written in C++. New functions can be developed in Voyager 2 (the DB-MAIN external language) or in Java. DB-MAIN includes a method engine and features to extend its meta-schema. It allows the representation of data structures of various paradigms and at any level of abstraction. It relies on the GER (Generic Entity-Relationship model) and on transformational operators.
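One of the code-analysis functions mentioned above, the dependency graph underlying program slicing, can be illustrated in a few lines. The parsing below is deliberately simplistic and invented for the example; it is not DB-MAIN's actual analyzer, which works on real programming languages.

```python
import re
from collections import defaultdict

# Toy dependency-graph builder over simple assignment statements: each
# assigned variable is mapped to the variables its value depends on.
# Slicing on a variable then keeps the statements reachable in this graph.

def dependency_graph(statements):
    """Return {target_variable: set of variables it directly depends on}."""
    deps = defaultdict(set)
    for stmt in statements:
        target, expr = stmt.split("=")
        for var in re.findall(r"[a-z_]\w*", expr):
            deps[target.strip()].add(var)
    return deps

graph = dependency_graph(["total = price * qty", "vat = total * rate"])
# slicing on 'vat' keeps both statements, since vat depends on total
```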
- Requirements: MS Windows, Linux.
- Date: 1993-2014+
- Status: ongoing
- Availability: Free of charge (except for advanced processors). See below
- References and resources:
- DB-MAIN v6 (free, max 500 objects)
- DB-MAIN v8 (free, max 2000 objects)
- DB-MAIN v9 (new interface, free, no limit on the number of objects)
- First Steps with DB-MAIN (short tutorial)
- Introduction to database design with DB-MAIN (long tutorial)
- See also the official DB-MAIN site www.db-main.eu or the site of ReveR s.a. www.rever.eu, now in charge of the maintenance and evolution of DB-MAIN.
TRAMIS/Master
- Type: CASE environment
- Description: TRAMIS/Master was an extension of ORGA distributed by the small French company CONCIS. In particular, the following features were added:
- support for conceptual analysis
- a larger transformation toolbox
- evaluation functions (notably volume estimation)
- the specification database was implemented in MDBS (a CODASYL-like DBMS)
- Requirements: MS-DOS; MS Windows 3.11; MDBS
- Date: 1986-1991
- Status: discontinued; principles integrated into DB-MAIN
- Availability: software no longer available
- References and resources:
- Jean-Luc Hainaut, Mario Cadelli, Bernard Decuyper and Olivier Marchand. TRAMIS: a transformation-based database CASE tool, in Proceedings of 5th International Conference on Software Engineering and Applications, EC2 Publish., 1992. [description]
- Jean-Luc Hainaut, Mario Cadelli, Bernard Decuyper and Olivier Marchand. Database CASE Tool Architecture : Principles for Flexible Design Strategies, in Proceedings of the 4th International Conference on Advanced Information System Engineering (CAiSE’92), Lecture Notes in Computer Science, Volume 593, pages 187-207, Springer-Verlag, 1992. [description]