SAP Standard Data Volume Management
Whitepaper
Active Global Support, SAP AG
Version: 1.0, April 2007
© 2007 SAP AG

Table of Contents

1 Management Summary
2 SAP Standards for E2E Solution Operations
3 Data Volume Management Standard at a Glance
4 What is the Basic Concept of the DVM Standard?
  4.1 Architecture and Process Flow
  4.2 Monitoring Data Growth
  4.3 Data Volume Management Scoping
  4.4 Data Volume Management Strategy
    4.4.1 Data Volume Management Strategy - Implement
      4.4.1.1 Data Volume Management Strategy - Implement - Business Blueprint
      4.4.1.2 Data Volume Management Strategy - Implement - Technical Design
      4.4.1.3 Data Volume Management Strategy - Implement - Deploy
    4.4.2 Data Volume Management Strategy - Operate
    4.4.3 Data Volume Management Strategy - Optimize
  4.5 Data Volume Management Reporting
  4.6 Data Volume Management for Upgrade
5 How to Implement the DVM Standard?
  5.1 Methodology - Implementation
    5.1.1 Data Analysis
    5.1.2 Technical Archiving Customizing
      5.1.2.1 Cross-archiving object customizing
      5.1.2.2 Archiving object-specific customizing
    5.1.3 Setup of SAP AS Archive Information Structures
    5.1.4 Document Management
    5.1.5 End-user Training
  5.2 Methodology - Operations
    5.2.1 Job Scheduling
      5.2.1.1 Scheduling Write Jobs
      5.2.1.2 Scheduling Delete Jobs
    5.2.2 Documentation
    5.2.3 Database Reorganization
    5.2.4 House-Keeping
    5.2.5 Troubleshooting
      5.2.5.1 Write Job Terminates
      5.2.5.2 Delete Job Termination
    5.2.6 Monitoring
  5.3 Tools
  5.4 People
  5.5 Available sources of information
6 How to Measure the Success of the Implementation?


1 Management Summary

Managing complexity, risk, costs as well as skills and resources is at the heart of implementing mission-critical support for SAP-centric solutions. The complexity rises even further with the trend towards outtasking and outsourcing of process components. To help customers manage their SAP-centric solutions, SAP provides a comprehensive set of standards for solution operations. This white paper describes one standard out of this set: the data volume management standard.

Because systems are increasingly interconnected (internally and externally) and the business needs its data to be instantly accessible, data volumes keep growing. The data volume management standard defines how to manage data growth by avoiding data creation, archiving data, and deleting data. The standard helps customers to increase the availability of their solution because administration tasks like software upgrades, database reorganization, data backups or recovery can be done in less time. In addition, customers can reduce the time needed for managing disk space and benefit from more efficient use of resources such as hard drives, memory, and CPU.

This paper provides details regarding the data volume management standard. First, it explains the basic concept of the standard and describes the different steps within the process flow. Then it provides details regarding implementation and operation methodologies for data volume management and lists the tools and roles involved in the process. Finally, the paper discusses how key performance indicators can be used to measure the success of data volume management and to control corresponding service level agreements.


2 SAP Standards for E2E Solution Operations

Mission-critical operations is a challenge. While the flexibility of SAP-centric solutions rises, customers have to manage complexity, risks, costs, as well as skills and resources efficiently. Customers have to run and incrementally improve the IT solution to ensure stable operation of the solution landscape. This includes the management of availability, performance, process and data transparency, data consistency, IT process compliance, and other tasks. Typically, multiple teams in the customer organization are involved in the fulfillment of these requirements. They belong to the key organizational areas Business Unit and IT. While the names of the organizations may differ from company to company, their function is roughly the same. They run their activities in accordance with the corporate strategy, corporate policies (for example, corporate governance, compliance and security), and the goals of their organizations.

The different teams specialize in the execution of certain tasks: On the business side, end users use the implemented functionality to run their daily business. Key users provide first-level support for their colleagues. Business process champions define how business processes are to be executed. A program management office communicates these requirements to the IT organization, decides on the financing of development and operations, and ensures that the requirements are implemented. On the technical side, the application management team is in direct contact with the business units. It is responsible for implementing the business requirements and providing support for end users. Business process operations covers the monitoring and support of the business applications, their integration, and the automation of jobs. Custom development takes care of adjusting the solution to customer-specific requirements and developments. SAP technical operations is responsible for the general administration of systems and detailed system diagnostics. And the IT infrastructure organization provides the underlying IT infrastructure (network, databases, ...). Further specialization is possible within these organizations as well. For example, there may be individual experts for different applications within SAP technical operations.

Efficient collaboration between these teams is required to optimize the operation of SAP-centric solutions. This becomes even more important if customers engage service providers to execute some of the tasks or even complete processes. Customers have to closely integrate the providers of outtasking and outsourcing services into the operation of their solutions. A key prerequisite for efficient collaboration of the involved groups is the clear definition of processes, responsibilities, service level agreements (SLAs), and key performance indicators (KPIs) to measure the fulfillment of the service levels.

Based on the experience gained by SAP Active Global Support while serving more than 36,000 customers, SAP has defined process standards and best practices which help customers to set up and run End-to-End (E2E) Solution Operations for their SAP-centric solutions. This covers not only applications from SAP but also applications from ISVs, OEMs, and custom code applications integrated into the customer solution. There are 16 standards for solution operations defined by SAP:

- Incident Management describes the process of incident resolution
- Exception Handling explains how to define a model and procedures to manage exceptions and error situations during daily business operations
- Data Integrity avoids data inconsistencies in end-to-end solution landscapes
- Change Request Management enables efficient and punctual implementation of changes with minimal risks
- Upgrade guides customers and technology partners through upgrade projects
- eSOA Readiness covers both technical and organizational readiness for enterprise service-oriented architectures (eSOA)
- Root Cause Analysis defines how to perform root cause analysis end-to-end across different support levels and different technologies
- Change Control Management covers the deployment and the analysis of changes
- Minimum Documentation defines the required documentation and reporting regarding the customer solution
- Remote Supportability contains five basic requirements that have to be met to optimize the supportability of customer solutions
- Business Process and Interface Monitoring describes the monitoring and supervision of the mission-critical business processes
- Data Volume Management defines how to manage data growth
- Job Scheduling Management explains how to manage the planning, scheduling, and monitoring of background jobs
- Transactional Consistency safeguards data synchronization across applications in distributed system landscapes
- System Administration describes how to administer SAP technology in order to run a customer solution efficiently
- System Monitoring covers monitoring and reporting of the technical status of IT solutions

Out of this list, this white paper describes the data volume management standard.


3 Data Volume Management Standard at a Glance

Because systems are highly interconnected (internally and externally) and the business needs this data to be accessible at its fingertips, data volumes keep growing. Just adding additional disks to storage area networks (SANs) and storage subsystems over time sometimes worsens the situation, e.g. because system management can become more and more difficult or performance may degrade. The data volume management (DVM) standard defines how to manage data growth by avoiding data creation, deleting data, and archiving data.

Data volume scoping is the starting point of data volume management. A detailed look at the customer's systems identifies the major pain points and gives an outlook on the most beneficial measures to take (e.g. deletion or data archiving) when implementing a data volume management strategy. The focus of a data volume management strategy is on the implementation of the methodologies of avoidance, summarization, deletion and data archiving, depending on the kind and type of data and considering the business and external requirements (legal compliance, ILM) on this data. Data volume reporting lists the archiving activities already performed and identifies additional reduction potential, which may be caused e.g. by a change in the business processes. Scheduling this service on a regular basis will show the growth of the database (broken down to business object level) and point out how the growth could be minimized. Reaching a steady state, that is, the balance between additional new data and archived data, is required to run SAP within the given service level agreements through future periods of time.

Business process operations owns the data volume management standard. If the process identifies data to be reorganized or archived, the execution of these tasks will be done by SAP technical operations or IT infrastructure. Data volume management increases the availability of the customer solution because administration tasks like software upgrades, database reorganization, data backups or recovery can be done in less time. In addition, the customer can reduce the time needed for managing disk space and benefit from more efficient use of resources such as hard drives, memory, and CPU.


4 What is the Basic Concept of the DVM Standard?

The data volume management standard describes how to run an SAP system that is as large as necessary to fulfill business and legal requirements and as small as possible to allow smooth system operation.

4.1 Architecture and Process Flow

The data volume management standard covers the following components:
- Data Volume Management Scoping
- Data Volume Management Strategy
- Data Volume Management Reporting

Figure 4.1: Components of the Data Volume Management Standard

The first starting point to detect the need for the implementation of a DVM strategy is a defined monitoring of data growth. As soon as the threshold is reached, a DVM scoping is initiated to determine whether the data growth and volume can best be handled by the implementation of a DVM strategy or whether (in addition) a review of the business process is the more promising approach to limit the future creation of data.

Figure 4.2: Data Volume Management Process View

As soon as the scoping has determined that DVM means (e.g. deletion of data, avoidance, or data archiving) should be applied, an implementation phase starts in which the business process and legal requirements on the data are checked against the technical need for data reduction. After the means are implemented, the DVM strategy is in the operating phase and the archiving and delete jobs are scheduled regularly; the success of the implementation has to be monitored continuously by DVM reporting to detect new or additional potential for improvement that may be caused by changed business processes. Only by considering all aspects of data volume management (implementation, operation and reporting) is it possible to run a system that is as large as necessary and as small as possible in the long term.

The DVM standard requires a high degree of cooperation between business process champions and IT departments. From the business department the business process champions and from the IT department members of the business process operations team must be involved.


4.2 Monitoring Data Growth

Purpose
In monitoring data growth, as the first step of data volume management, the behavior of the database regarding data growth is observed. In this step the size of the database volume and its monthly growth (e.g. over the last 12 months) are identified. This information helps to become aware of a need for implementing a data volume management strategy to reduce data growth or the total data volume. Experience shows that overall database growth or the growth of specific tables causes a growing administration effort, reaches certain limits (e.g. available disk space in the storage system), or leads to performance problems.

Process Flow
The business process operations team should check the size of the database and determine its growth behavior over the past months (at least the last 12 months) on a regular basis, using system monitoring functions like the database monitor (e.g. transaction DB02) or EarlyWatch Alerts. As a next step, the identified data volume size and/or the monthly data growth is compared against the DVM KPIs. DVM KPIs are threshold values which define the criteria for classifying the implementation of a data volume management strategy as recommended or mandatory. The KPIs depend on the solution; the following values are currently used as defaults, based on the experience of the last years.

Solution                      | Data Volume Size (GB), DVM recommended | Data Volume Size (GB), DVM mandatory | Monthly Data Growth (GB)
Enterprise Systems            | 500                                    | 800                                  | 20
Business Intelligence         | 500                                    | 800                                  | 20
Customer Relationship Mgmt.   | 200                                    | 500                                  | 15
Supplier Relationship Mgmt.   | 200                                    | 500                                  | 15


The implementation of a data volume management strategy should be considered if one or both of the criteria (data volume size, monthly data growth) are met. Data volume size and monthly data growth are the criteria most commonly used, but in specific situations other criteria may drive the decision for a data volume management strategy, such as general performance issues or quality assurance issues regarding the reduction of expired data, independent of data volume size and monthly data growth.
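The comparison against these thresholds is straightforward to automate. The following Python sketch illustrates the decision logic only: the solution names and threshold values mirror the table above, while the function, the input format, and the interpretation of the growth threshold as a "recommended" trigger are assumptions made for illustration, not part of any SAP tool.

```python
# Illustrative sketch only: classify a system against the DVM KPI thresholds above.
# Input values would come from regular database monitoring (e.g. transaction DB02
# or EarlyWatch Alert data); treating the growth threshold as a "recommended"
# trigger is an assumption for this sketch.

THRESHOLDS = {
    # solution: (size recommended GB, size mandatory GB, monthly growth GB)
    "Enterprise Systems":          (500, 800, 20),
    "Business Intelligence":       (500, 800, 20),
    "Customer Relationship Mgmt.": (200, 500, 15),
    "Supplier Relationship Mgmt.": (200, 500, 15),
}

def classify_dvm_need(solution: str, db_size_gb: float, monthly_growth_gb: float) -> str:
    """Return 'mandatory', 'recommended', or 'not required' for a DVM strategy."""
    size_rec, size_mand, growth_limit = THRESHOLDS[solution]
    if db_size_gb >= size_mand:
        return "mandatory"
    # One exceeded criterion (size or monthly growth) is enough for a recommendation.
    if db_size_gb >= size_rec or monthly_growth_gb >= growth_limit:
        return "recommended"
    return "not required"

# Example: a 650 GB ERP system growing 25 GB per month -> DVM recommended.
print(classify_dvm_need("Enterprise Systems", 650, 25))
```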

Result
The DVM scoping process is initiated for a specific system or system landscape if the DVM KPIs are exceeded. Otherwise the regular monitoring of the data growth will be continued.

4.3 Data Volume Management Scoping

Purpose
If monitoring data growth indicates that the implementation of a data volume management strategy should be considered, DVM scoping evaluates the system or system landscape regarding the anticipated data reduction possibilities. DVM scoping examines for which business areas and business objects the data volume and the data growth can be reduced by the DVM standard reduction methodologies: data avoidance, data summarization, data deletion and data archiving. If the DVM standard reduction methodologies cannot be used, the reduction possibilities for the data growth must be evaluated by a business process analysis. This is the case if no DVM reduction possibilities are available for a business object, or if the data of this business object is not in the scope of data deletion or data archiving because of its residence time. Business process analysis is not in the scope of the DVM standard.

Process Flow
The focus of DVM scoping is identifying and recommending the best strategy for reducing the data volume size, the data growth, or both. The recommendation is based on the result of the investigation regarding system data, top tables, data volume growth, performed archiving activities and the distribution of the data over the years:

The implementation of a DVM strategy will be recommended
  o if only a potential for data avoidance, data summarization or data deletion is detected (e.g. only data from the last 12 months is available and archiving of data from the current year will not be possible),
  o if a potential for data avoidance, data summarization, data deletion and data archiving is detected,
  o if a potential for data archiving is detected,
  o if a potential for data avoidance, data summarization, data deletion and data archiving is detected even though data deletion and data archiving are currently performed. Data archiving activities performed more than 2 years in the past are not considered and are classified as not relevant for the decision making.

The assessment of an existing DVM strategy will be recommended
  o if a DVM service has already been performed.

The implementation of a data monitoring strategy will be recommended
  o if no relevant potential for data avoidance, data summarization, data deletion and data archiving is detected.

A business process analysis will be recommended
  o if a huge data growth is detected, but standard data avoidance, data summarization, data deletion or data archiving functionality is not available or cannot be used because the related data is still within its residence time. In this case the data growth can only be reduced by a business process re-design. Example: The data avoidance and summarization options covered in a standard DVM will not show the required effect, and the data can neither be deleted nor archived because its residence time has not expired. Here, other means of data avoidance, e.g. changing the business process, have to be evaluated. This evaluation is not in the scope of the DVM standard and will be performed by an application-specific business expert team.

The following process steps must be performed to get the necessary information as a basis for the decisions described above:

- Identification of general system information like
  o solution component,
  o release and support package level,
  o date of production start,
  o productive clients (multi-client system),
  o current data volume size (data and indexes),
  o monthly data growth (default: last 12 months).
- Identification of the top largest and fastest growing tables (default: 20 tables).
- Categorization of these top tables
  o regarding their affiliation to a business area and business object,
  o identification of the reduction possibilities of this business object,
  o identification of the corresponding deletion object,
  o identification of the corresponding archiving object,
  o identification of the yearly distribution of the business object data as a basis for the estimation of the archiving potential from a technical point of view.
- Identification of the archiving history
  o identification of the performed archiving objects with the status "completed archiving runs",
  o identification of their execution period (date of first and last run),
  o archiving objects with a date of last archive run older than 2 years are considered as not active,
  o evaluation of the number of archived objects and the size of the archived files.
- Evaluation of the categorization and archiving history information for
  o the definition of recommendations (see above),
  o the effort and time schedule estimation and the scope for the implementation or assessment of a DVM strategy using DVM Best Practices.

Result
The result of the DVM scoping process is the recommendation of a follow-up process like
- implementation or assessment of a DVM strategy, or
- implementation of DVM reporting as a data monitoring strategy, or
- considering / performing a business process analysis.

If the customer proceeds with the recommended DVM strategy as follow-up process, the effort and time schedule can be estimated and the scope of the service can be defined for a proposal.

4.4 Data Volume Management Strategy

4.4.1 Data Volume Management Strategy - Implement

The initial implementation of a data volume management strategy is best handled as a project. The project organization will help to integrate all required teams and roles. Especially when business objects that are required for internal or external reporting purposes (e.g. legal compliance) are in the scope of data archiving, it is absolutely vital to involve the business process champion. The technical implementation aspects have to be covered by the business process operations team. Usually, the implementation is driven by the business process operations team.

The following list shows the key success factors of a DVM implementation:
- Management support (e.g. for sponsoring)
- Team setup including IT and Business Unit
- A committed Business Unit that understands the need for DVM
- Proper scoping achieved by a qualified preliminary data analysis to set the focus right (e.g. on quick wins in the first implementation phase)
- Know-how of the archive administrators
- A test system for mass and performance tests

A successful implementation should always try to find the balance between defining a short residence time (to minimize storage costs) and defining a comfortable residence time (in order to affect the business processes as little as possible). In some cases, very short residence times require the definition and development of customer-specific workarounds that will raise the implementation costs (e.g. the need to implement Z-reports for accessing archived data).

Figure 4.3: Storage Costs vs. Project Costs

If data from different countries with different legal requirements is included in the project, it is realistic to expect a project duration of several months for all the necessary clarifications and tests.


4.4.1.1 Data Volume Management Strategy - Implement - Business Blueprint

Purpose
In this step the technical requirements (IT-driven) for data volume management (i.e. usually the removal of data from the DB) are matched against the business requirements on data (i.e. usually to keep the data in the DB to allow flexible and fast reporting).

Process Flow
Crucial for the successful completion of this step is the involvement of the Business Units. The business process champion and the key users discuss the options with the application team in several workshops. Usually each workshop focuses on a specific application or document type. The workshops should be driven and prepared by the business process operations team. A proper preparation is recommended and should include:

- A detailed and profound data analysis (performed by application management, e.g. by use of transaction TAANA)
  o to identify the business object specific potential for data volume management (e.g. by year, type, company code, plant, ...). This helps to clearly present the expected effect of any measures, so that the effect of the definition of a residence time becomes obvious. In addition, it may help when specific data has to be excluded from data archiving for business reasons.
  o to identify the appropriate archiving object or deletion report. In some cases more than one archiving object can be applied to the same table and it is necessary to find out which archiving object is the most appropriate one to use.
  o to identify possible showstoppers for data volume management measures (e.g. non-completed documents will not pass the checks for data archiving and will remain on the DB).
- Getting familiar with all details and features of the planned measure (e.g. of a specific archiving object) (performed by application management)
  o Usually, a test archiving run that is performed before the workshop and allows demonstrating how archived data can still be displayed is very helpful in the discussion and takes away the key users' fear of data loss.

A good starting point is the output of the DVM scoping, which shows exactly which means are available for each of the large tables.


- Additional useful information (e.g. the data management guide, reports and transactions for accessing archived data) can be found at the data archiving site on the SAP Service Marketplace: http://service.sap.com/data-archving -> Media Library -> Literature & Brochures
- A more technical approach is SAP transaction DB15, which shows the linkage between an archiving object and its related tables and vice versa.
- In addition, the SAP Notes search can be used to find additional information, usually with a more technical focus.
- The SAP DVM service offers guidance and detailed knowledge transfer for the definition and implementation of a DVM strategy.

The following list shows the minimum set of key questions that have to be answered by the business process champion:

- Is the data required at all, or is it possible to avoid future postings?
- Is the data required at the current level of detail, or is it possible to summarize the data?
- Is the data required only for a limited timeframe, and can it be deleted afterwards?
- How long does the data have to remain in the online DB before it should be archived? (=> definition of the residence time)
- How long do the archive files have to be stored for retrieval for business and audit purposes?
- Where should the archive files be stored? (on an external content server or directly on the file system)
- Is the usually limited standard functionality available on archived data sufficient for business and audit purposes?
- Which level of comfort for accessing archived data is required?
  o Which minimum search criteria should be available?
  o Is a fast indexed access (for dialog accesses on archived data) required, or is a non-indexed sequential read of archived data in a batch process sufficient? This depends strongly on the access frequency.
- Are there any actions to be taken before data archiving to ensure legal compliance (e.g. creating a DART (Data Retention Tool) extract for tax audits in the US or Germany)?
- Dependencies between archiving and deletion have to be considered. In some cases, the business process defines the sequence of the archiving objects; in other cases the archiving objects follow a pre-defined sequence that is enforced by existing checks. E.g. CO total records can only be archived as soon as all corresponding CO line items have been archived.
- What are the effects of DVM measures on the system landscape? Are there any other systems connected via an interface that will be affected in a negative way? E.g. it is important to consider the interface between a transactional system (e.g. R/3, CRM) and an attached BI system, as the upload of data to BI that has already been archived in R/3 or CRM is possible, but difficult and time consuming.

In the case of data archiving, the application management team needs to provide information about the available features for displaying and reporting on archived data, so that the business process champion gets a profound understanding of the effects and consequences of data archiving.

Result
The result of this process is a preliminary definition of the data volume management strategy. This preliminary definition has to be finally confirmed by the business process champion after an intensive test phase involving the key users.

4.4.1.2 Data Volume Management Strategy - Implement - Technical Design

Purpose
In this step, the specified business requirements are transferred into customizing and variant definitions. In addition, this step includes the definition of the technical setup of data archiving, for example where to store the archive files or the retention time of an archive file until its destruction (see ILM concept). Examples: In the case of summarization, the corresponding customizing is set up. In the case of archiving, the customizing of residence times for an archiving object is performed and the selection variables for the archive write job are defined.

Process Flow
There is no single general place to avoid the creation of data or to start and schedule deletion. How to proceed and which report or customizing option to use depends strongly on the business object or technical table. For data archiving, the central place for customizing and variant definition is the NetWeaver Archive Administration (SAP transaction SARA). For details see chapter 5.1 Methodology - Implementation. A full and detailed understanding of the customizing parameters and the report variant definition is required.

Result
The system is set up and prepared for functional and performance tests.


4.4.1.3 Data Volume Management Strategy - Implement - Deploy

4.4.1.3.1 Test

Purpose
After the business requirements of the preliminary DVM strategy have been transformed into customizing, the effects and results have to be tested.

Process Flow
Before starting the tests, the latest SAP Notes should be checked and implemented if necessary. The Notes search should cover the name of the archiving object, the name of the deletion report, or the customizing transaction for avoidance.

The test should cover two aspects:
- Functional correctness and business acceptance
  o Is the result as expected? Is the data after avoidance still sufficient for reporting and retrieval needs? Does the deleted data match the data selected in the deletion report?
  o Display functionality on archived data: Is the way of display, which may differ from the display of online data, acceptable and does it include all required details? Is the performance of accessing archived data sufficient? Are the search criteria offered on archived data sufficient? Is the display of linked data (e.g. business objects in the document flow, PDF documents, attachments) sufficient?
  o Effects on interfaces and other systems in the system landscape: Are there any systems that try to access archived data? E.g. a BW system may try to upload data that has already been archived in the ERP system; archiving in CRM may send out BDocs to update the corresponding documents in ERP.
- Performance of the deletion report and archiving process. This mass test is important to estimate the runtime in the production system. Based on the result, the data volume processed in one single job may have to be adjusted to ensure a reasonable runtime that should not exceed 6-8 hours.


If possible, this mass test should be performed on a system of comparable size to the production system. A test case with all detailed steps is prepared by the application management team. The functional test is then performed by the key users of the business unit based on the test case.

Result
The results are
- sign-off / confirmation of the planned activities by the business process champion,
- a tested setup of job variants and customizing that can be used in the production environment.

4.4.1.3.2 Go-Live and Catch-Up Phase

Purpose
The activities and measures defined in the DVM strategy should be implemented carefully and step by step. As archiving and deletion processes will put additional load on the system, the effects on the overall system performance have to be monitored carefully.

Process Flow
Start with only a very low number of jobs running in parallel. After gaining some experience, this number can be increased if necessary. It is not necessary to start many different archiving objects or deletion jobs at the same time. Instead, for 2-3 business objects the usually existing backlog of obsolete data should be archived or deleted first before the next jobs are scheduled. The removal of the backlog (catch-up phase) may require some time; depending on the overall system load and the resulting limitations in scheduling the jobs, this phase may take several months. In contrast to the following phase of standard operations, the catch-up phase requires manual job scheduling and monitoring. Usually, no automated or dynamic variant definitions can be used, and a different selection variant definition may be required for every job. Intensive monitoring is also required to find out the possible maximum of additional workload that the DVM jobs may cause without any negative side effects on standard operation.

Result
- A pre-defined list of jobs to be scheduled, as a result of the test phase.
- A troubleshooting guide explaining the most common errors and fixes or workarounds. It is especially important to distinguish between errors that can be handled by the IT unit and errors that need the involvement of the business unit.

After the go-live and catch-up phase, the DVM jobs can be moved to standard operation mode. During the catch-up phase, the business process operations team has gained enough experience in scheduling the jobs, their runtimes and their side effects on other processes, so it is well prepared for the definition of automated jobs that will run without manual interaction.

4.4.2 Data Volume Management Strategy - Operate

Purpose
The operation of a DVM strategy is handled quite differently among SAP customers and depends strongly on the number of jobs and the frequency defined for scheduling them.

Process Flow
Some customers schedule all archiving and deletion jobs manually with a low frequency, e.g. once a year after fiscal year closing. This approach is only feasible for systems with a medium data volume. There are two disadvantages to this approach:
- As with all manual tasks, manual scheduling bears the risk that the scheduling is simply forgotten or missed after an organizational change in the team (e.g. a change of the person responsible for scheduling the jobs).
- A complete year's data volume that has to be archived and deleted once a year may be hard to handle in just one single job.
Tight cooperation between the SAP technical operations team and the business process operations team is required to manage the additional system load that is caused by these additional jobs.

The recommended approach is to implement jobs that run on a regular basis (e.g. monthly or quarterly) without any manual interaction. The advantage is that these jobs can be included in the general monitoring concept and will no longer require manual effort for scheduling and monitoring. To achieve this, profound experience (gained in the go-live and catch-up phase) regarding the job runtimes is necessary. In addition, an optimal technical customizing is required to support this automated approach (see chapter 5.2 Methodology - Operations).

Result
The result is a set of well-defined jobs that run on a regular basis in the defined job windows with as little manual interaction as possible.

4.4.3 Data Volume Management Strategy - Optimize

The following list shows some of the indicators for the need to optimize the implemented DVM activities.


After the implementation of all DVM means and after the catch-up phase, there is still significant growth of the database. In this case a more detailed analysis is required to determine whether this growth is caused by
- tables and business objects that have not been included in the scope of the DVM implementation so far. These business objects may be caused by new business processes that have gone live. => New business objects should be added to the DVM strategy.
- new data on already covered business objects, e.g. a new company code or plant has been implemented. => The DVM strategy should be optimized: the deletion report's selection or the customizing and selection of the corresponding archiving object has to be adjusted.
- growing business data volume in the recent periods. Example: if 1 million documents were created in 2004 and 1.5 million documents in 2006, the growth cannot be stopped by just archiving the data of 2004. Growing business volume is the most common reason for questioning whether the DVM implementation has been successful. => No optimization of the DVM strategy is possible (besides shortening the residence times if the business volume has increased by a significant factor, e.g. doubled).
- data that does not meet the requirements for archiving or deletion on a large scale, e.g. purchase orders that have not been fully delivered. => Obsolete data has to be transferred into an archivable or deletable status (usually done by a Z-report). In addition, the business process should be reviewed and adjusted in order to avoid non-archivable data in the future.
- fragmentation of tables and indexes. If only smaller parts of tables are deleted, e.g. only data for a specific company code, the freed space may be too small and too fragmented to be re-used for inserts of new data. In this case, the tables just keep on growing as before despite data archiving or deletion. => Reorganization of those tables is required and the DVM strategy should be optimized by changing the customizing / selection variants to avoid fragmentation.

DVM reporting is the recommended tool to detect the need for an optimization step. A history of the growth rate of the largest tables is required to check the effect of the DVM measures taken on the monthly growth rate. In addition, an analysis of the table content with SAP transaction TAANA should be performed, especially to check whether and which data exceeds the defined residence time. This information is provided by DVM reporting.

Result
The result is an adjusted DVM strategy that helps to handle the additional data growth.

4.5 Data Volume Management Reporting

Purpose
The DVM reporting process evaluates the effectiveness of the strategy implemented by the DVM strategy process. The focus is on the reporting of the data archiving and data deletion activities. Additionally, the remaining reduction potential is listed. In the case of an already implemented strategy to manage data volumes, customers are interested in the effectiveness of their efforts. DVM reporting reports the archiving and deletion activities as well as the reduction potential and possibilities. Besides, the report of DVM reporting can be used to ease the creation of a condensed management summary of the data archiving and data deletion activities. Used on a regular basis, DVM reporting also reports the data growth for systems which go live in the near future and for which strong data growth is expected, so that adequate actions for data reduction can be scheduled proactively.

Scheduling DVM reporting on a regular basis will show the growth of the database and point out how the growth could be minimized. It can be used
- as a reporting tool for performed data archiving and deletion activities,
- as a reporting tool for observing the data growth,
- as a reviewing tool for the DVM strategy, for the purpose of adapting the strategy to new business requirements.

Process Flow
In the DVM reporting process, the following steps must be performed to get the necessary information for evaluating the implemented strategy:
- Evaluation of DVM relevant system information like solution component, release and support package level, date of production start, productive clients (multi-client system), current data volume size (data and indexes) and monthly data growth (default: last 12 months).
- Evaluation of the top largest and fastest growing tables (default: 20 tables, customizable) with categorization regarding their affiliation to a business area and business object. In the case that the content of a top table is assigned to several business objects, the most important (e.g. top 3) business objects are identified, listed and considered further. The reduction possibilities of each business object and the corresponding deletion and archiving objects are identified. From the technical point of view, the reduction potential for the identified business object is evaluated with regard to the yearly distribution of the business object data, the compliance with the archiving and deletion criteria (e.g. deletion indicator, deletion flag, overall status) and the residence time, with default values as proposal that can be changed to customer needs.
- Evaluation of the archiving history regarding the archived objects (with the status "completed archiving runs"), their execution period (date of first and last run), the number of archived objects and the size of the archived files. Regarding the execution period, archiving objects with a date of last archive run older than 2 years are considered as not active.
- Evaluation of the efficiency of the implemented strategy by assessing the categorization and archiving history information:
  o whether the performed archiving objects correspond with the archiving objects of the identified top business objects (low or high identity),
  o the amount of data exceeding the business object specific residence time (default or customer value),
  o identification of the business objects for which the (customer defined) retention period for the archive files has expired and which are candidates for final deletion, as a proposal,
  o collecting the data of a DVM reporting run (only a snapshot) for long-term analysis,
  o the effort and time schedule estimation and the scope for the implementation or assessment of a DVM strategy.

Result
The DVM reporting reports on
- size and growth of the database,
- top tables of the database,
- performed data archiving and data deletion activities,
- the potential for additional reduction as a result of the efficiency analysis,
- recommendations of follow-up activities like continuous monitoring or improvement of the used strategy caused by new business requirements.

In addition, the report of DVM reporting can be used to ease the creation of a condensed management summary of the data archiving and data deletion activities, and as a basis for setting up an assessment of the existing strategy.
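At its core, such a report compares table growth over time with the amount of data that has already exceeded its residence time. The Python sketch below illustrates that calculation on collected snapshot data; the table names, snapshot format, residence times and numbers are hypothetical examples, not output of an SAP tool.

```python
from datetime import date

# Illustrative sketch: estimate monthly growth and archiving potential per table
# from collected snapshots (monthly size measurements plus record counts per
# creation year from a TAANA-style analysis). All input data below is hypothetical.

size_snapshots_gb = {            # table -> [oldest ... newest] monthly sizes in GB
    "BKPF": [310, 318, 327, 340],
    "VBFA": [150, 151, 153, 154],
}
records_per_year = {             # table -> {creation year: record count}
    "BKPF": {2003: 4_000_000, 2004: 5_500_000, 2005: 6_000_000, 2006: 7_000_000},
    "VBFA": {2005: 2_000_000, 2006: 2_500_000},
}
residence_time_years = {"BKPF": 2, "VBFA": 1}   # assumed business residence times

def report(today: date) -> None:
    for table, sizes in size_snapshots_gb.items():
        monthly_growth = (sizes[-1] - sizes[0]) / (len(sizes) - 1)
        cutoff_year = today.year - residence_time_years[table]
        counts = records_per_year[table]
        total = sum(counts.values())
        # Records created before the cutoff year have exceeded their residence time
        # and represent the technical archiving/deletion potential of this table.
        reducible = sum(n for year, n in counts.items() if year < cutoff_year)
        print(f"{table}: ~{monthly_growth:.1f} GB/month growth, "
              f"{100 * reducible / total:.0f}% of records beyond residence time")

report(date(2007, 4, 1))
```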

4.6 Data Volume Management for Upgrade

Purpose
DVM for upgrade describes how the components of the DVM standard can be used in the area of release upgrades.

A critical success factor for release upgrades is the system downtime, which should be as low as possible. The system downtime is influenced, among other things, by the conversion of database tables, which in turn depends on the size and content of those tables. Data archiving and data deletion can help to decrease the system downtime by reducing table size and content. Under the criterion of TCO, data archiving and data deletion should only be used in an upgrade project for upgrade relevant tables and not in a general approach. Relevant for the upgrade are tables with structural changes which make conversions necessary and contribute to a higher system downtime.

Process Flow
DVM can be used as follows:
1. DVM scoping
   - identifying the upgrade relevant tables
   - identifying and recommending the corresponding reduction possibilities
2. DVM strategy
   - implementing the most convenient reduction possibility considering the business requirements

Using DVM scoping, the upgrade relevant tables and the corresponding reduction possibilities can be identified. With DVM scoping, the following information is identified:
- the release and support package level of the source system (the release which will be upgraded)
- the release of the target system (the release to which will be upgraded)
- the type and release of the used database system
- the productive start date of the source system
- the database size and growth of the last 12 months
- the top 20 tables with their size (GB) and their corresponding business data objects
- the distribution over the years of the documents for the top 20 tables / business objects
- the reduction possibilities for these tables (avoidance, summarization, deletion, archiving)
- the data archiving and/or data deletion objects for the top 20 tables
- the current data archiving and data deletion activities
- the upgrade relevant tables of the top 20 tables
  o Using the corresponding DVM Best Practice document you can check whether one of the top 20 tables is listed for the source-target release combination.
  o If a top table is listed as an upgrade relevant table, the following steps should be performed (a sketch of this decision flow is given after this list):
    - If the reduction possibility data deletion or data archiving is listed for this table, the corresponding data archiving or data deletion objects are identified.
    - If a reduction by deletion or archiving is not available for this table, this table cannot be reduced by DVM.
    - If a reduction by deletion or archiving is technically possible and the corresponding data deletion or data archiving object has been actively used for this table in the last 18 months, the necessity of additional reduction activities should be checked by using the distribution over the years and checking whether there is data older than an assumed residence time of 1 year; for these additional reduction activities, recommendations for the corresponding reduction possibilities are defined for the follow-up DVM process.
    - If the corresponding data deletion or data archiving objects are not actively used, recommendations for the corresponding reduction possibilities are defined for the follow-up DVM process.
  o If a table is not listed, no further DVM actions are mandatory.
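The decision flow above lends itself to a simple rule evaluation. The Python sketch below encodes it for one table purely as an illustration; the data structure, function name and example table are assumptions, and only the 18-month activity window and the assumed 1-year residence time are taken from the text.

```python
from dataclasses import dataclass

# Illustrative sketch of the per-table decision flow described above: a table
# from the Top 20 is checked against the upgrade relevant tables of the DVM
# Best Practice document for the source-target release combination.

@dataclass
class TableFacts:
    name: str
    upgrade_relevant: bool                 # listed in the Best Practice document?
    reduction_available: bool              # deletion or archiving object exists?
    actively_used_last_18_months: bool     # deletion/archiving object run recently?
    data_older_than_1_year: bool           # from the distribution over the years

def recommendation(t: TableFacts) -> str:
    if not t.upgrade_relevant:
        return f"{t.name}: not upgrade relevant - no further DVM action required"
    if not t.reduction_available:
        return f"{t.name}: cannot be reduced by DVM (no deletion/archiving object)"
    if not t.actively_used_last_18_months:
        return f"{t.name}: recommend implementing the available reduction possibility"
    if t.data_older_than_1_year:
        return f"{t.name}: object in use, but additional reduction activities recommended"
    return f"{t.name}: object in use and no data beyond residence time - no further action"

# Example table (hypothetical facts): reduction possible but object not actively used.
print(recommendation(TableFacts("CDCLS", True, True, False, True)))
```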

Using the DVM strategy, the given recommendations are implemented in consideration of the customer's business and legal requirements.

Result
The result of the DVM scoping process in the case of an upgrade is a list of the identified upgrade relevant tables with their business objects and recommendations for reducing the data volume by deletion or archiving, with a rating of the implementation complexity, as a preventive measure to decrease the upgrade system downtime. The DVM strategy process is the follow-up process for the implementation of these recommendations.


5 How to Implement the DVM Standard?

5.1 Methodology - Implementation

5.1.1 Data Analysis

A proper data analysis is the key to a successful DVM implementation. SAP transaction TAANA is the central tool for table analysis. It can be used to analyze how the entries of a table are distributed across selected fields. Table analysis counts the table entries and assigns the number of entries found to the selected field values (such as organizational units or periods). Fields that should be used repeatedly for table analysis should be grouped and saved as an analysis variant. Besides the analysis variants, which allow running the same analysis from time to time and comparing the results, the feature of so-called virtual fields is especially beneficial to support the data analysis by a time criterion. Many tables include a date field (e.g. posting date) but not a field indicating the year of creation, and an analysis by single day is much too detailed. Therefore TAANA offers the possibility to define a virtual field YEAR on a date field, so that the distribution of data can be analyzed directly by the YEAR criterion. The question of which fields should be chosen for the analysis has to be answered by the application management team. All data should be analyzed that is required to prepare the blueprint workshops with the business process champion (see chapter 4.4.1 Data Volume Management Strategy - Implement).
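TAANA itself runs inside the SAP system, but the idea of a virtual YEAR field is easy to illustrate outside of it. The Python sketch below groups record counts by the year derived from a date field; the sample data, column names and layout are hypothetical stand-ins for an exported document header table, not TAANA output.

```python
import csv
import io
from collections import Counter

# Illustrative sketch of a TAANA-style analysis with a virtual field YEAR:
# count records per derived year (and company code) instead of per single day.
# The CSV content below stands in for an export with columns BUKRS (company code)
# and BUDAT (posting date, YYYYMMDD) - purely hypothetical data.

sample_export = io.StringIO(
    "BUKRS,BUDAT\n"
    "1000,20040317\n"
    "1000,20041102\n"
    "2000,20051201\n"
    "1000,20060522\n"
)

def year_distribution(export) -> Counter:
    counts: Counter = Counter()
    for row in csv.DictReader(export):
        year = row["BUDAT"][:4]        # virtual field YEAR derived from the date field
        counts[(row["BUKRS"], year)] += 1
    return counts

for (company_code, year), n in sorted(year_distribution(sample_export).items()):
    print(f"company code {company_code}, year {year}: {n} record(s)")
```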

5.1.2 Technical Archiving Customizing

5.1.2.1 Cross-archiving object customizing

There are general technical settings that apply to all archiving objects (SAP transaction SARA -> Button: Customizing -> Cross-Archiving Object Customizing -> Technical Settings):
- The archive write job can be interrupted after either a specified maximum runtime or a maximum volume of created archive files. In general the runtime of the write job should be limited by limiting the processed data volume through appropriate selection criteria. But where this is not possible, or where the available scheduling window is very strictly limited, this customizing setting offers additional security.
- A server group used for the archiving jobs can be defined, e.g. in order to keep the archiving workload off the application servers used by the dialog users.


5.1.2.2 Archiving object-specific customizing

The technical customizing settings that only apply to a specific archiving object are set via SAP transaction SARA -> Button: Customizing -> Archiving Object-Specific Customizing -> Technical Settings. A detailed description of the parameters can be found in the SAP Library.
- A file size of 100-200 MB per archive file is a proven value to start with.
- In case of storing the archive files on an external storage system, you should be aware that activating the switch "Delete program reads from storage system" offers additional security but slows down the performance of the delete job significantly. Therefore the default recommendation is: not activated.
- Archive File Routing: The lifetime (retention time) may not be the same for all archive files of a single archiving object (e.g. material documents). Therefore it may be necessary to store e.g. material documents from Austria in a content repository with a defined lifetime of 7 years, whereas archive files containing documents from Germany have to be stored for 10 years in a different content repository. For this reason, archive file routing offers new flexibility by making it possible to determine the appropriate content repository on the level of an organizational unit (for example company code) or time-based criteria (such as fiscal or business year).
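A routing rule of this kind is essentially a lookup from organizational unit (and possibly fiscal year) to a content repository with the appropriate retention time. The Python sketch below only illustrates that mapping; the repository names, company codes and retention periods are invented examples, not actual customizing, and the real rules are maintained in SARA customizing.

```python
# Illustrative sketch of archive file routing: pick a content repository (and its
# retention period) per company code. Repository IDs, company codes and retention
# times below are invented examples.

ROUTING_RULES = {
    # company code -> (content repository, retention in years)
    "AT01": ("Z_REPO_AT_7Y", 7),    # e.g. Austrian material documents, 7 years
    "DE01": ("Z_REPO_DE_10Y", 10),  # e.g. German documents, 10 years
}
DEFAULT_RULE = ("Z_REPO_DEFAULT_10Y", 10)

def route_archive_file(company_code: str) -> tuple[str, int]:
    """Return (content repository, retention years) for an archive file."""
    return ROUTING_RULES.get(company_code, DEFAULT_RULE)

repo, years = route_archive_file("AT01")
print(f"store in {repo}, keep for {years} years")
```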

5.1.3 Setup of SAP AS Archive Information Structures

Archive information structures are transparent DB tables that index archive files for faster access based on customer-defined selection criteria. Important things to know about archive information structures:
- The archive information structures shipped by SAP are only suggestions and examples. In case they do not exactly fit the retrieval requirements, a customer-specific archive information structure should be defined in the customer namespace. It is not recommended to adjust the default SAP archive information structures.
- Use as few fields as possible in an archive information structure to avoid unnecessary growth of the archive information structure table. Only those fields that are required as selection criteria should be included. Independent of the fields selected to be part of the archive information structure, ALL archived fields can be displayed.
- If possible, the archive information structure should be based on a field catalog defined on document header level. When using line item information, the number of records multiplies compared to an archive information structure based on header data.


- The number of activated archive information structures should be kept to a minimum. As an archive information structure can very easily be activated and populated on demand, it may make sense to first learn the real users' needs before offering a tool that is not really required.
- Depending on the search criteria, it may be required to define additional secondary indexes on the table of the archive information structure. This can be done proactively via the customizing table AIND_STR8 or, on already populated archive information structures, with report ASCORRINDX.
- By default, SAP AS archive information structure tables are created in the POOL tablespace. As these tables may show significant growth, this POOL tablespace may not be the right place for managing these tables properly. If a table should be migrated to a different tablespace, this should be done very soon after the creation of the table, i.e. as long as the table contains no or only a few records.
- A retention time should be agreed on and documented that defines after which timeframe outdated data should be removed from the archive information structure. Example: A fast, index-supported access based on an archive information structure may only be required for the first 2 years after archiving. After this period, a sequential read in the background may be sufficient, or alternatively the archive information structures are populated again on demand in case a fast access is required.

5.1.4 Document Management
For some application documents, e.g. billing documents, there are legal requirements in many countries to store or be able to reproduce the print-out of the billing document. After data archiving, a print-out based on the archived database records is no longer possible. The recommended alternative is to do the re-print based on a PDF file containing an image of the billing document. This PDF file is stored in a content server attached to the SAP system via ArchiveLink. This implementation step has to be considered and planned carefully. Especially the retrospective creation of PDF files for all already existing billing documents can be a time-consuming process.

5.1.5 End-user Training
Depending on the DVM measure taken (customizing changes to avoid the future creation of data, deletion, or data archiving), it may be necessary to inform the end users about the new system behavior, for example, how long log files can be accessed before they are deleted or how to display data after it has been archived. For smaller changes, a roll-out e.g. by a newsletter is sufficient. For changes with a bigger effect on the business process, a roll-out by e-learning or key-user classroom training may be necessary.

5.2 Methodology - Operations

5.2.1 Job Scheduling

5.2.1.1 Scheduling Write Jobs
Usually, it is required to schedule archive runs not only a few times a year but on a regular basis. To avoid having to adjust archive write job variants manually, SAP recommends the definition of dynamic variants. Based on a dynamic variant, it is possible to schedule the write job periodically using the same variant, while the content of the variant is adjusted automatically. Especially for date fields, a dynamic variant can be defined very easily. For other fields (e.g. fiscal year, fiscal period, archiving session note), the variant definition based on table TVARV should be used, as sketched below.
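One way to drive a TVARV-based variant is to refresh the table value in a job step that runs before the periodic write job. The following is a minimal sketch only; the variable name Z_DVM_ARCH_DATE_TO, the 400-day residence time and the report name are hypothetical examples.

```abap
REPORT z_dvm_set_variant_date.
* Minimal sketch (assumption): refresh the date value of a TVARV variable
* that the dynamic write variant refers to as a selection value. The
* variable name and the residence time are illustrative only.
DATA: ls_tvarv  TYPE tvarv,
      lv_cutoff TYPE d.

" Archive only documents older than the assumed residence time
lv_cutoff = sy-datum - 400.

ls_tvarv-name = 'Z_DVM_ARCH_DATE_TO'.
ls_tvarv-type = 'P'.                 " single parameter value
ls_tvarv-low  = lv_cutoff.           " stored in internal format YYYYMMDD
MODIFY tvarv FROM ls_tvarv.          " insert or update the variable
COMMIT WORK.
```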

5.2.1.2 Scheduling Delete Jobs
For every created archive file, a separate delete job is required. Delete jobs should not be started automatically, as the number of created archive files cannot always be determined beforehand, so the number of delete jobs running in parallel may be higher than expected. On the other hand, it is not feasible to start every delete job manually. Program RSARCHD solves this dilemma and should be used for scheduling the delete jobs. For further information please see SAP Notes 205585, 447921 and 820023. If there is no content server attached via ArchiveLink, a back-up of the archive files should be made before (!) the delete job is scheduled. A possible solution is to define a job with two steps that combines the start of RSARCHD with a copy job. The copy job can be a shell script that is defined as an external command in SAP transaction SM69 (see the sketch below). Scheduling delete jobs in parallel to long-running write jobs raises the risk on an Oracle database that the write job is cancelled because of an ORA-1555 (snapshot too old) error. Delete jobs running in parallel to your DB online backup will most likely increase the runtime of your backup.
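A minimal sketch of the backup step, assuming an external command Z_COPY_ARCHIVE has been defined in SM69 and that source and target paths are passed as parameters; the command name, the paths and the report name are hypothetical. The second job step would then run RSARCHD as described above.

```abap
REPORT z_dvm_backup_archive_files.
* Minimal sketch (assumption): step 1 of a two-step job that backs up the
* newly written archive files before RSARCHD (step 2) schedules the delete
* jobs. The external command Z_COPY_ARCHIVE must exist in SM69.
DATA: lv_status TYPE c LENGTH 1,
      lt_log    TYPE STANDARD TABLE OF btcxpm,
      ls_log    TYPE btcxpm.

CALL FUNCTION 'SXPG_COMMAND_EXECUTE'
  EXPORTING
    commandname           = 'Z_COPY_ARCHIVE'
    additional_parameters = '/sapmnt/ARC/MM_MATBEL /backup/archive'
  IMPORTING
    status                = lv_status
  TABLES
    exec_protocol         = lt_log
  EXCEPTIONS
    no_permission         = 1
    command_not_found     = 2
    parameters_too_long   = 3
    security_risk         = 4
    OTHERS                = 5.

IF sy-subrc <> 0 OR lv_status <> 'O'.   " 'O' = command finished successfully
  " A message of type E cancels the background job, so RSARCHD in step 2
  " does not run without a valid backup.
  MESSAGE 'Archive file backup failed - delete step not released' TYPE 'E'.
ENDIF.

" Write the OS command log (structure BTCXPM) to the job output
LOOP AT lt_log INTO ls_log.
  WRITE: / ls_log-message.
ENDLOOP.
```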

5.2.2 Documentation
It is important to keep track of which data was archived in which archiving session. Especially for sequential file scans, the user should be able to decide which files should be chosen for the sequential read.

Possibilities for documentation:
- Short text of the archiving session: The short text is displayed in the management overview in SAP transaction SARA and in any other pop-up where the user can choose an archive file for sequential read. The short text should be maintained in any case. In most cases this can be done in the variant for the archiving object. In some cases the selection screen for the archiving object does not offer a field "Archiving Note"; in those cases, the short text has to be maintained afterwards manually in the SAP transaction SARA Management view. The short text should include the most important selection criteria, e.g. period, fiscal year, document type.
- Long text for the archiving session: When you double-click a single archiving session in the SARA Management view, you get a pop-up where you can choose to maintain a long text that describes the content and selection of the archiving session. As this always requires manual interaction, this method of documentation is not very widespread in SAP's customer base.
- Used selection variants of an archiving session: The values of the selection variables of an archiving session are automatically stored and can be displayed in Archive Administration (= SAP transaction SARA) -> Management -> right-click on an archiving session -> User Input.
- Save selection variants and spool output of an archiving session as a print list: When you tick the option "Selection Cover Page" in the print parameters, the values of the selection screen are included in the top part of the spool file that is created for the write job. In addition, you have to choose Archiving mode "Print and archive" instead of the default "Print only". This solution gives you the highest level of detail with the lowest manual interaction, as all necessary information is stored in the print list automatically. However, it requires a 3rd-party solution in place to store the created print lists.

Recommendation: Maintain the short text for every archiving session in all cases. If you have a 3rd-party solution implemented as a content server, use print lists to document the details of your archiving sessions.

5.2.3 Database Reorganization
Index reorganization: It is recommended to do an index reorganization after the first massive initial archiving and deletion (catch-up phase) and from time to time in the operation phase with regular data archiving, because:
- It is not very resource intensive
- It can be done online
- Performance advantages usually outweigh the disadvantages:
  o Important for chronologically sorted indexes
  o Less important for non-chronologically sorted indexes
  o Without index reorganization there is no performance benefit after archiving

Reorganization of tables: A table reorganization should mainly be performed if a significant and permanent reduction of the table is expected; this means that you do not expect the table to ever grow to such an extent that it reaches its current size again. Reorganization of database tables is resource intensive and often requires downtime, depending on the available features for online reorganization of the database system.
How to avoid the need for table reorganization? Please consider the following points during data archiving in order to avoid data fragmentation and consequently decrease the necessity of table reorganization:
- Use appropriate selection criteria for data archiving to avoid fragmentation (e.g. use a minimum of selection criteria - only residence time, if possible)
- Avoid archiving for periods in which data records are not yet business complete
- Use test mode first

5.2.4 House-Keeping
Possibilities of reduction: data deletion. There are a number of jobs that must run periodically in a live SAP installation, for example, to delete outdated jobs or spool objects. As of R/3 release 4.6C, you can easily schedule these jobs as follows: SAP transaction SM36, press button 'Standard jobs'. Please review SAP Note 16083 "Standard jobs, reorganization jobs" to get detailed information regarding the jobs to be considered.

5.2.5 Troubleshooting

5.2.5.1 Write Job Terminates
The most common reasons for a terminated write job are:
- Database error ORA-1555 (snapshot too old). Recommendations:
  o Try to reduce the amount of selected data to reduce the runtime of the write job and to increase the chance of finishing properly.
  o Run your archiving session when few updates/inserts are being made to the database. Especially avoid scheduling archiving delete jobs in parallel to a still running archiving write job. Do not start delete jobs automatically, but via scheduling SAP report RSARCHD after the write job has finished.
  o For SD objects, select alternative DB access, or
  o Extend the rollback segments of the database.
- No disk space available for the creation of the archive files.

In the event of an error, the ADK declares the last created file to be invalid. As a consequence, it will not show up in SARA and it will not be possible to schedule a delete job for this last file, so there is no risk of deleting data that is in fact corrupted in the archive file. In addition, before the delete job starts, the archive file is scanned and its correctness is verified.
How to handle cancelled archive write jobs:
Step 1: Schedule the delete jobs for all valid archive files that are displayed in SAP transaction SARA.
Step 2: Check whether there is a remaining archive file you did not schedule a delete job for, and remove this last and probably defective file from the file system.
Step 3: Re-run the archiving session using the same selection criteria to archive the remaining data.
Attention: if you re-start the write job before running all delete jobs for the existing archive files, there is a risk of archiving the same data twice in two separate archive write jobs.

5.2.5.2 Delete Job Termination
Two cases have to be distinguished:
- Termination because of a system error (e.g. lack of resources, shutdown)
- Verification errors
In the case of system errors, you only have to restart the delete job until it finishes successfully. No further steps are necessary. In the case of verification errors during deletion, the data in the corrupt file will not be deleted from the database. The data from the corrupted files must be archived again in a new archiving session and must then be deleted. Finally, the corrupted archive file must be removed manually from the management data of the Archive Development Kit (ADK) and from the file system. If verification errors occur when archive files are being read or reloaded, the data in the corrupted files is not read or reloaded. Because the corrupted files must have passed the verification during the delete phase, you can assume that the archive files were corrupted after that phase, for example, while they were being recopied. It may be possible to repair the defective file using the SAP remote consulting service; a fee will be charged for this.

5.2.6 Monitoring
In general, all jobs for deletion, archiving, and storing of archive files to a content server should be included in the overall job monitoring concept. If possible, the spool outputs should be parsed automatically, for example, to ensure that the delete job really deleted data and did not just end (e.g. because of missing authorizations); a sketch of such a check follows below. Use SAP transaction RZ20 -> Monitor Templates -> Data Archiving Monitor to get a cross-archiving-object overview. Use SAP transaction SARA for an archiving-object-specific and more detailed view.
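A minimal sketch of such an automated spool check, assuming the spool request number of the delete job is known (it would typically be determined from the job step data); the report name, the search string and the use of structure SOLI for the spool lines are assumptions.

```abap
REPORT z_dvm_check_delete_spool.
* Minimal sketch (assumption): read the spool output of a finished delete
* job and check that it actually reports deleted records.
PARAMETERS p_spool TYPE tsp01-rqident.   " spool request number of the delete job

DATA: lt_lines TYPE STANDARD TABLE OF soli,
      ls_line  TYPE soli,
      lv_found TYPE c LENGTH 1.

CALL FUNCTION 'RSPO_RETURN_ABAP_SPOOLJOB'
  EXPORTING
    rqident              = p_spool
  TABLES
    buffer               = lt_lines
  EXCEPTIONS
    no_such_job          = 1
    job_contains_no_data = 2
    OTHERS               = 3.

IF sy-subrc <> 0.
  WRITE: / 'Spool request could not be read.'.
ELSE.
  LOOP AT lt_lines INTO ls_line.
    " Hypothetical check: the delete job log is expected to mention deletions
    IF ls_line-line CS 'deleted'.
      lv_found = 'X'.
      EXIT.
    ENDIF.
  ENDLOOP.
  IF lv_found IS INITIAL.
    WRITE: / 'Warning: delete job spool contains no deletion confirmation.'.
  ENDIF.
ENDIF.
```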

5.3 Tools
The following tools will be used for DVM:
- SAP Solution Manager EarlyWatch Alert
- SAP Solution Manager Data Volume Scoping
- SAP Solution Manager Data Volume Strategy
- SAP Solution Manager Data Volume Reporting
- SAP Solution Support Enablement Package, an enhancement for SAP Solution Manager
- SAP Service Tools for Applications
- SAP NetWeaver Operations

SAP Solution Manager EarlyWatch Alert covers, among others, the functionality of the process Monitor Data Growth for DVM: it identifies the size of the data volume and the monthly data growth and evaluates this information with regard to the DVM KPIs. Depending on this, the following issue is created and recommended as an action: "We have identified that your Database is large and growing significantly each month. We would recommend that you consider the possibility of performing Data Volume Management."
SAP Solution Manager Data Volume Scoping is used as the next step in the case of the recommendation above. This tool covers the functionality of the process DVM Scoping and recommends follow-up actions such as implementing a DVM strategy, performing a business process analysis, or monitoring the data volume and data growth on a regular basis.
SAP Solution Manager Data Volume Strategy is used if the implementation or optimization of a DVM strategy is recommended. This tool supports, as its focal point, the data analysis and the blueprint phase of the implementation.
SAP Solution Manager Data Volume Reporting is used for monitoring an implemented DVM strategy on a regular basis with regard to data growth, data archiving activities and their efficiency. Depending on the result of these evaluations, the following follow-up activities could be recommended:

- Perform DVM reporting on a regular basis
- Optimize the existing DVM strategy
- Extend the existing DVM strategy by newly identified functions

SAP Solution Support Enablement Package provides, among other things, the DVM-related best practices and empowering information.
SAP Service Tools for Applications is used as the data collector by the SAP Solution Manager based tools. The content of the DVM analyses is captured and evaluated automatically by this tool set.
SAP NetWeaver Operations provides the basic functionality of system monitoring (e.g. database monitor), of technology and administration for data archiving (ADK, TAANA and so on) and of legal compliance (e.g. DART). This functionality is used by the tools above.
SAP transaction ARCHGUIDE and the pre-defined role for archiving administrators, SAP_BC_CCM_DATA_ARCHIVING, provide a good overview of the SAP transactions that are useful for the implementation and operation of DVM. A more detailed description of the data archiving relevant SAP transactions can be found in chapter 5.1 Methodology - Implementation.

5.4 People
The following groups are involved in the DVM process:
- Business process champion
- Business process operations
- Application management

The DVM strategy is not only the concern of the IT department, since such a strategy depends on customer-specific business processes and country-specific legal regulations. Therefore a team is necessary, consisting of:
- A team lead (mostly located in the IT department)
- Business process champions
- Business process operations persons

Business Process Champion
The business process champion is another key role in the organization of the customer. As the business units and the IT organization implement the business processes, the champion will be the expert on the process requirements, implementation and continuous improvement. Often, execution exceptions and potentially resulting data inconsistencies can only be resolved through the champion, who has profound knowledge of these processes.

In the case of DVM, the business process champions are responsible for the definition of the business requirements (e.g. business dependencies, legal aspects, residence times and so on) and the confirmation of the DVM strategy. They should involve internal or external auditors with regard to the different legal requirements. They are also included in the business process analysis activities.
Business Process Operations
The business process operations team is largely concerned with the management of business-critical processes. Standards and guidelines cover interface monitoring, managing an ever-increasing amount of business data, and the dependencies of jobs and their correct execution, finally resulting in the end-to-end transactional consistency of the entire business process execution. In the case of DVM, this group is responsible for monitoring the data growth and for initiating and executing the scoping process.

5.5 Available sources of information
DVM is part of the SAP E2E Solution Support Curriculum Integration & Automation. The following SAP standard trainings are recommended as prerequisites:
- BIT 660 Data Archiving (overview of Data Archiving and its tools)
- BIT 670 Data Archiving Retrieval (especially applicable for integrating archive access in Z-Reports)
- BC/BIT 680 DART Data Retention Tool (how to handle the extraction of tax audit related data, especially required for compliance in the US and Germany)

For details, see www.sap.com/education
Additional information:
- Book from SAP Press: Archiving Your SAP Data, ISBN 1-59229-008-6 (English); for details see www.sappress.de
- Quick links in the SAP Service Marketplace:
  o http://service.sap.com/data-archiving
  o http://service.sap.com/dart
  o http://service.sap.com/gdpdu (especially for German tax laws)
  o http://service.sap.com/dvm

6 How to Measure the Success of the Implementation?
What are the benefits of the standard process? The benefit of a successful DVM implementation is a database that is as slim as possible and can therefore offer the best performance results. In addition, hardware investments for additional disks can be postponed. And most important, the system management activities (e.g. DB backup, recovery, system copy) can be handled much more easily if only the data that is really required for the standard business processes has to be handled.
How to measure the benefits (KPIs)? As already described above, the measurement of the success may be difficult, as there are some factors (fragmented DB tables, changed business processes, additional business volume, access to archived data via archive indexes) that will still cause DB growth even while a perfect DVM strategy is operated. For details on the reasons why a DB may still grow despite a proper DVM strategy, see chapter 4.4.3 Data Volume Management Strategy - Optimize. In addition, it has to be considered that in many cases a DB index reorganization is required to achieve the performance gains, and a DB table reorganization to really regain disk space.
How to calculate the KPIs? For the business objects related to the largest and fastest-growing tables, the corresponding data archiving object is run regularly, and for these objects no data that is outside the defined residence time range remains available online; a check of this condition is sketched below.
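A minimal sketch of such a KPI check for one business object, assuming material document headers (table MKPF, posting date BUDAT) and an illustrative residence time of roughly 24 months; table, field and residence time are examples and would differ per archiving object.

```abap
REPORT z_dvm_kpi_residence_check.
* Minimal sketch (assumption): count how many material document headers
* are still online although their posting date lies outside the assumed
* residence time. A result of zero means the KPI is met for this object.
DATA: lv_cutoff TYPE d,
      lv_count  TYPE i.

lv_cutoff = sy-datum - 730.              " approx. 24 months residence time

SELECT COUNT(*) FROM mkpf
  INTO lv_count
  WHERE budat < lv_cutoff.

WRITE: / 'Records older than the residence time still online:', lv_count.
```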

Copyright 2007 SAP AG. All Rights ReservedNo part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice. Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors. Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation. IBM, DB2, DB2 Universal Database, OS/2, Parallel Sysplex, MVS/ESA, AIX, S/390, AS/400, OS/390, OS/400, iSeries, pSeries, xSeries, zSeries, System i, System i5, System p, System p5, System x, System z, System z9, z/OS, AFP, Intelligent Miner, WebSphere, Netfinity, Tivoli, Informix, i5/OS, POWER, POWER5, POWER5+, OpenPower and PowerPC are trademarks or registered trademarks of IBM Corporation. Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries. Oracle is a registered trademark of Oracle Corporation. UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group. Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc. HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C, World Wide Web Consortium, Massachusetts Institute of Technology. Java is a registered trademark of Sun Microsystems, Inc. JavaScript is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape. MaxDB is a trademark of MySQL AB, Sweden. SAP, R/3, mySAP, mySAP.com, xApps, xApp, SAP NetWeaver, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and in several other countries all over the world. All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.

The information in this document is proprietary to SAP. No part of this document may be reproduced, copied, or transmitted in any form or for any purpose without the express prior written permission of SAP AG. This document is a preliminary version and not subject to your license agreement or any other agreement with SAP. This document contains only intended strategies, developments, and functionalities of the SAP product and is not intended to be binding upon SAP to any particular course of business, product strategy, and/or development. Please note that this document is subject to change and may be changed by SAP at any time without notice. SAP assumes no responsibility for errors or omissions in this document. SAP does not warrant the accuracy or completeness of the information, text, graphics, links, or other items contained within this material. This document is provided without a warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability, fitness for a particular purpose, or non-infringement. SAP shall have no liability for damages of any kind including without limitation direct, special, indirect, or consequential damages that may result from the use of these materials. This limitation shall not apply in cases of intent or gross negligence. The statutory liability for personal injury and defective products is not affected. SAP has no control over the information that you may access through the use of hot links contained in these materials and does not endorse your use of third-party Web pages nor provide any warranty whatsoever relating to third-party Web pages.
