Impact Analysis of Database Schema Changes
Andy Maule, Wolfgang Emmerich and David S. Rosenblum
London Software Systems
Dept. of Computer Science, University College London
{a.maule|w.emmerich|d.rosenblum}@cs.ucl.ac.uk
2008
Database Management Systems (DBMS)
• Provide:
– concurrent access
– efficient execution of complex queries over large datasets
• Often used with OO languages (C++, Java, and C#)
– Results in the impedance mismatch problem, which complicates impact analysis
Motivation
• The effects of a DB schema change are typically estimated manually
– fragile, difficult, and expensive
• Proposal: assess the effects of DB schema changes automatically, more reliably, and more cost-effectively
• The problem is not the reconciliation of the impacts themselves, but the difficulty of discovering and predicting them
The Impact of Schema Change
• An impact is any location in the application which:
– will behave differently, or
– is required to behave differently
• Example changes (from the paper's figure): Change 1, Change 2, Change 3 (delete Name)
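As a sketch of why a deleted column creates impact locations, the toy Python below (illustrative only, not the paper's code; `build_query` and `impacted_queries` are hypothetical names) shows a call site whose query silently references a `Name` column that the schema change removes:

```python
# Illustrative sketch (not from the paper): finding queries that mention
# a column deleted by a schema change such as "delete Name".

def build_query(table):
    # This call site is an impact location: it behaves differently
    # once the Name column is deleted from the table.
    return f"SELECT Id, Name FROM {table}"

def impacted_queries(queries, deleted_column):
    """Return the queries whose text mentions the deleted column."""
    return [q for q in queries if deleted_column in q.split()]

queries = [build_query("Customer"), "SELECT Id FROM Orders"]
print(impacted_queries(queries, "Name"))  # only the first query is impacted
```

Real queries are built dynamically, which is exactly why the paper needs string analysis rather than a textual scan like this one.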
Data Access Practices
• Entry point: types DBConnection, DBRecord, DBRecordSet, DBParam belonging to the persistence API
• Need to know:
– the state of these types
– the exact query
– where/how the results are used
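A minimal sketch of the data-access pattern the analysis must understand, using Python's `sqlite3` as a stand-in for the paper's C# persistence API (DBConnection, DBRecordSet, etc.); the `Customer` table and `fetch_names` helper are made up for illustration:

```python
# Sketch of typical data-access code: the analysis needs the connection
# state, the exact query string, and how the results are consumed.
import sqlite3

def fetch_names(conn):
    query = "SELECT Id, Name FROM Customer"   # the exact query string
    # Where/how results are used: the positional index 1 silently depends
    # on the schema; deleting or reordering columns changes this behaviour.
    return [row[1] for row in conn.execute(query)]

conn = sqlite3.connect(":memory:")            # state of the connection object
conn.execute("CREATE TABLE Customer (Id INTEGER, Name TEXT)")
conn.execute("INSERT INTO Customer VALUES (1, 'Ada')")
print(fetch_names(conn))
```

The three pieces of information the slide lists are exactly the three commented spots in the sketch.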
Approach
• Program Analysis (PA) comprises compile-time techniques for approximating the run-time properties of a program
– Previously used to extract queries from OO languages
• String Analysis (SA) is a form of PA in which the possible run-time values of string variables are predicted at selected locations in the program
– Gould et al. used SA to predict the values of strings passed to Java JDBC library methods, in order to check that the queries are type-safe with respect to the DB schema
– Christensen et al. created the JSA tool using SA
Approach Overview(Requirements for Context-Sensitivity)
• Context sensitivity specifies how precisely the calling contexts of procedures are represented in dataflow analyses
• k-CFA:
– all or some of the data propagated in the dataflow analysis includes a call string that represents the last k call-sites
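The call-string idea can be sketched in a few lines of Python (a toy model, not the paper's implementation; the call-site labels are invented): a context is simply the last k call sites, so analysis facts can be kept separately per context.

```python
# Minimal sketch of k-CFA contexts: a calling context is the string of
# the last k call sites, letting the analysis keep separate facts per context.
from collections import deque

K = 2

def push_context(context, call_site, k=K):
    """Append a call site to a context tuple, keeping only the last k sites."""
    new = deque(context, maxlen=k)
    new.append(call_site)
    return tuple(new)

ctx = ()
ctx = push_context(ctx, "main:3")
ctx = push_context(ctx, "run_query:7")
ctx = push_context(ctx, "exec:12")
print(ctx)  # only the last K = 2 call sites are retained
```

Larger k distinguishes more contexts and so gives more precise query strings, at the exponential cost the slicing slide addresses.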
Approach Overview(Required Precision of Context-Sensitivity)
• Distinguishes between different values of the variables belonging to separate calling contexts
Program Slicing
• Program Slicing
– Extraction of a subset of the source application that can affect, or be affected by, the DB calls
• Why?
– As k increases, k-CFA analysis has exponential complexity with respect to program size
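A hedged sketch of the idea, assuming a precomputed dependence map (real slicers build this from def-use and control dependences; the statement labels are invented): starting from the DB call, follow dependences backwards and keep only statements that can affect it.

```python
# Toy backward slice: transitively collect every statement the slicing
# criterion (the DB call) depends on; everything else is sliced away.
def backward_slice(deps, criterion):
    """deps maps statement -> set of statements it directly depends on."""
    keep, work = set(), [criterion]
    while work:
        stmt = work.pop()
        if stmt not in keep:
            keep.add(stmt)
            work.extend(deps.get(stmt, ()))
    return keep

deps = {
    "s4: conn.execute(q)": {"s2: q = base + cond"},
    "s2: q = base + cond": {"s1: base = 'SELECT'", "s3: cond = 'WHERE'"},
    "s5: log('done')": set(),   # irrelevant to the DB call: sliced away
}
print(sorted(backward_slice(deps, "s4: conn.execute(q)")))
```

Shrinking the program this way is what makes the exponential k-CFA analysis affordable on the remainder.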
Program Slicing Example
• The code fragment from the paper is used as an example to show how slicing reduces the code
Dataflow Analysis
• Computes the set of runtime properties that can occur at a given program point
– The analysis is based on the string analysis by Choi et al.
• Two modifications to this existing string analysis:
– increased context sensitivity
– addition of query types to the string analysis
Dataflow Analysis – Increasing Context Sensitivity
• Modify Choi et al.'s algorithm from 1-CFA to k-CFA
– Modify the property space of the dataflow analysis
– Abstract variables and abstract heap locations are distinguished by calling context
– Identifiers are extended to include a string of k call sites
Dataflow Analysis – Add query types to String Analysis
• Query types denote all query-representing types and those involved in the execution and use of database queries
• A dataflow graph is generated by performing a standard fixed-point iteration over the graph produced by the slicing stage
• Result:
– all query-representation types have an estimated set of possible runtime values
– other query-type objects (e.g. returned result sets) have unique identifiers assigned based on their instantiation information
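The fixed-point idea can be sketched with a toy string analysis (a heavily simplified stand-in for Choi et al.'s analysis: it tracks sets of concrete strings instead of an abstract string domain with widening, so it only terminates for acyclic assignments; the `tbl`/`query` variables are invented):

```python
# Toy fixed-point string analysis: estimate the set of possible query
# strings by iterating until the value sets stop growing.
def string_fixpoint(assignments):
    """assignments: var -> list of right-hand sides; each rhs is a list of
    parts that are either string literals or names of other variables."""
    values = {var: set() for var in assignments}
    changed = True
    while changed:
        changed = False
        for var, rhss in assignments.items():
            for rhs in rhss:
                combos = {""}   # every combination of the parts' values
                for part in rhs:
                    opts = values[part] if part in values else {part}
                    combos = {c + o for c in combos for o in opts}
                if not combos <= values[var]:
                    values[var] |= combos
                    changed = True
    return values

assignments = {
    "tbl": [["Customer"], ["Orders"]],               # tbl takes two values
    "query": [["SELECT * FROM ", "tbl"]],            # concatenation
}
print(sorted(string_fixpoint(assignments)["query"]))
```

The result plays the role of the "estimated set of possible runtime values" that the impact calculation consumes.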
Impact Calculation
• Predicts the possible effects of a database schema change
– Uses CrocoPat, a tool that efficiently executes relational programs against arbitrary data
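The shape of the impact calculation can be sketched as a relational join in Python (the paper expresses this in CrocoPat's relational language; the relation names and source locations here are invented for illustration):

```python
# Toy impact calculation: join the extracted "uses" facts against the
# set of schema elements touched by the change.
def impacted_locations(uses, changed_columns):
    """uses: set of (source_location, column) facts from the dataflow
    analysis; returns every location that touches a changed column."""
    return {loc for (loc, col) in uses if col in changed_columns}

uses = {("Orders.cs:120", "Customer.Name"),
        ("Report.cs:45", "Customer.Name"),
        ("Orders.cs:88", "Customer.Id")}
print(sorted(impacted_locations(uses, {"Customer.Name"})))
```

Expressing the query relationally is what lets a tool like CrocoPat evaluate it efficiently over large fact bases.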
Implementation
• Currently supports only C# applications that use SQL Server databases
• Total size of SUITE = 19 KLOC (written in C#)
Evaluation - Basis
• What is this evaluation for?
– To evaluate the feasibility of the technique presented in the paper
• To generalize the evaluation, the subject application had to be representative of real-world practice for database-driven applications
Evaluation - Setup
• Subject application: irPublish (a content management system)
– Consists of 127 KLOC of C# code
– Uses a database schema with up to 101 tables, 615 columns, and 568 stored procedures
• Three interesting changes out of 62 previous schema changes were chosen for evaluation
• System configuration used:
– 2.13 GHz Intel Pentium processor
– 1.5 GB RAM
Evaluation – Setup (contd…)
• The schema changes used in this evaluation are as shown in the paper's table
• The value of k used for the dataflow analysis was 2 (in some places k needed to be set as high as 7)
Evaluation Summary – Comparison of Predicted changes vs. Observed changes
• Program slicing reduced the program to 37% of its original size (from 191,173 instructions to 70,050 instructions)
Summary of Results
• Highlights the importance of context-sensitive program analysis
– A high level of context sensitivity is required in many real-world architectures where similar architectural patterns are used
• The types of schema change that occurred agree with the predictions of the study
Related Work
• Impact analysis of database schemas
– A. Karahasanović. Supporting Application Consistency in Evolving Object-Oriented Systems by Impact Analysis and Visualization. [2002]
• String analysis
– A. S. Christensen, A. Møller, and M. I. Schwartzbach. Precise Analysis of String Expressions. [2003]
– C. Gould, Z. Su, and P. Devanbu. Static Checking of Dynamically Generated Queries in Database Applications. [2004]
– T.-H. Choi, O. Lee, H. Kim, and K.-G. Doh. A Practical String Analyzer by the Widening Approach. [2006]
• Dataflow analysis
– S. Horwitz, T. Reps, and M. Sagiv. Demand Interprocedural Dataflow Analysis. [1995]
Future Work
• Investigate alternatives to the program slicing technique to reduce the cost of dataflow analysis
• Analyze the impact of the available impact analysis techniques on the development of database applications