
Large-scale DSS Implementation

Decision Support Planning and Analysis: The Problems of Getting Large-Scale DSS Started

By: C. Lawrence Meador

Martin J. Guyote
Decision Support Technology, Inc.
18 Coltsway
Wayland, Massachusetts

By: William L. Rosenfeld
Research & Planning, Inc.
222 Third Street
Cambridge, Massachusetts

Abstract

Developing a large-scale institutional DSS designed to serve multiple managers in different business functions can be a more challenging task than developing the much more common one-user, one-function DSS that have evolved over the past few years. In this article we review some of the evidence suggesting that extra effort and rigor in the early planning and analysis stage of large-scale DSS development is worthwhile. We attempt to identify those characteristics of DSS that require different treatment than those available in traditional structured techniques. We then present, in the form of a case study, a hybrid technique which we refer to as DSA (Decision Support Analysis) which has been used effectively in developing large-scale institutional DSS. Finally, we discuss some of the positive and negative experiences that have emerged from using DSA.

Keywords: Decision support system, end user computing, user needs assessment, development methodology, architecture

ACM Categories: D.2.1, H.1.2, H.4.2, J.1, K.4.3, K.6.1

Introduction

Decision support systems (DSS) are computer-based information systems designed to help managers solve problems in semi-structured decision-making areas. Successful DSS applications have addressed problems and decisions in a broad range of managerial and policy environments (see [1, 2, 9, 11, 15, 26]). By definition, semi-structured decision-making environments are those not well enough understood to permit complete analytical description. This implies the need (and opportunity) to combine managerial experience and judgement with quantitative computer-based approaches.

Planning and analysis are critical tasks in the development of large, complex DSS environments, especially those designed to support several different business functions. Differences between DSS and traditional MIS and DP applications, however, oblige developers to use different analytic methods for DSS. In this article we approach these differences from an applications perspective rather than from a theoretical perspective, though in many regards DSS lacks both theoretical and empirical underpinnings.

The early stage of large-scale institutional DSS development, here called the Decision Support Analysis (DSA) stage, includes planning, end user needs assessment, problem diagnosis, management orientation, and priority setting.

Five elements are incorporated into the DSA approach: structured interviews, decision analysis, data analysis, technical analysis, and management orientation. The use of this approach is illustrated by a case study that analyzes the decision support needs of multiple business functions within the marketing organization of a large manufacturing firm.

Development of DSS vs. Traditional MIS

Systems Analysis in Traditional Systems Development

The importance of analysis in the development of traditional MIS and DP applications

MIS Quarterly/June 1986 159


is well established. For example, Boehm [4] has shown that more errors are introduced into a new system through failures in analysis than through failures in design, construction, or implementation. Shooman and Bolsky [23] found that errors in analysis are more costly to correct and have greater impact upon system effectiveness than errors in design and construction. McKeen [12] found that additional time spent in front-end analysis led to less overall development time and cost, and greater user satisfaction with the delivered system. Developers of traditional systems -- recognizing the importance of analysis -- now give increased time and effort to front-end analysis. Their objective is a detailed prespecification of the full system. However, DSS development typically follows a different approach.

Analysis in DSS Development

DSS development follows a plan that lays out specific tasks to be performed, and the proper order of performance. Recent research on DSS applications suggests that planning is perceived by DSS users and developers to be a very important activity. However, it is often performed less effectively than is desired [13]. Figure 1 shows a typical DSS development plan that might be suitable for large-scale institutional DSS (sometimes referred to as an organizational support system). The first step in this process, Decision Support Analysis, involves the identification of:

1. high priority applications,
2. high level function requirements for those applications,
3. information characteristics and requirements,
4. appropriate fundamental approaches to addressing user needs, including system architecture and detailed technical requirements, and
5. orientation of users to DSS concepts and their relevance to supporting users’ jobs.

The first four areas are used to guide software evaluation and selection, prototype design, and prototype construction. The decision support analysis stage provides initial direction to the entire DSS development process. In addition, management orientation to DSS that occurs during this stage helps to avoid organizational problems during implementation. It does this by fostering realistic expectations and generating commitment from users.

Traditional Systems Analysis Methodologies

Many methodologies for analyzing requirements of data processing applications have been developed during the last several years. They include IBM’s Business Systems Planning [8], Yourdon’s Structured Analysis Techniques [5], SofTech’s Structured Analysis and Design Techniques [21, 22], and others.

These methodologies generally share the following characteristics.

1. They are typically used to analyze information flows and data structures for large, structured applications where the scope of the application is fairly well defined in advance.

2. The primary objective is to provide a detailed specification of data flows and structures that can be directly translated into system designs. Designs are then frozen prior to the construction phase. This construction method subdivides large projects among many programmers, thereby requiring greatly detailed specifications in order to avoid massive coordination problems.

3. Given the size of these applications, a second basic objective is to facilitate the translation of the detailed specifications into efficiently designed systems.

4. These methodologies require extensive investments in time and resources to achieve the level of detail required.

Requirements analysis is costly and may be hard to justify for DSS. Thus traditional methodologies may have to be scaled down and modified to be acceptable to DSS users for the following reasons:

1. Many DSS applications are smaller in scope than traditional MIS or DP applications.


Figure 1. A Tactical Plan for DSS Development

MAJOR ACTIVITIES (with detailed tasks):

DECISION SUPPORT ANALYSIS: Structured Interviews; Decision Analysis; Data Analysis; Technical Analysis; Conceptual DSS Orientation; Plans and Prioritization.

DSS SOFTWARE EVALUATION & SELECTION: Identification of Candidate Vendors; Feature Analysis; Benchmarks; External Site Surveys.

PROTOTYPE DEVELOPMENT: Scoping of Prototype; Project Evaluation Criteria; Detailed Design; System Construction; Testing; Demonstration; Evaluation.

OPERATIONAL DEPLOYMENT & SUPPORT: Functional Orientation; Operational Training; Deployment; Maintenance.

TYPICAL FEEDBACK LOOPS: Adaptation; Operational System Development; Revision & Enhancements.


2. The benefits of many DSS applications are hard to quantify.

3. DSS users have difficulty prespecifying their decision support needs without a concrete system to which they can react.

4. The decision support needs of DSS users change frequently.

5. It is more important that DSS be effective than efficient (although efficiency is also, of course, an important goal).

6. Problems to which DSS are applied must often be addressed quickly relative to traditional system development timeframes.

These factors have led many authors to argue for an evolutionary approach to DSS development using prototypes and rapid development tools such as fourth generation languages [3, 17]. However, it is often still taken for granted that evolutionary application development must be preceded by a precise and detailed requirements analysis. This article supports an evolutionary approach for requirements analysis as well.

DSS Analysis Methodology

Procedures for DSS analysis and design should exhibit the following characteristics.

1. Minimal elapsed time prior to prototype development. Users need a concrete system to which they can react.

2. Robustness -- given the preceding constraint, analysis will often be incomplete and fragmentary. The analytic method must compensate for less than perfect data by quickly focusing on the highest priority applications, and on those functional requirements for each application that warrant more detailed analysis.

3. Ability to evolve along with the DSS -- the same methodology should be capable of being used to discover initial DSS opportunities, establish initial functional requirements, and evaluate existing systems to identify directions for further growth.

4. The analytic methodology must have user involvement as an important by-product.

5. There should be an orientation toward managerial users and their decision making activity. This implies designing the systems with special emphasis on the user interface(s), and providing procedures and data representations that fit well with specific managers’ established activities.

At the same time, there should be an emphasis on prescription as well as description. The methodology should capture managers’ decision processes and should establish priorities to improve these processes. This includes pinpointing potential applications with the biggest impact on managerial effectiveness and the highest priority functions needed for these applications. To do this well, the analysis of managers’ activities should be tied to individual goals plus the overall goals of the organization, that is, to managers’ critical success factors [20]. The analyses should explicitly surface information on improving specific business activities, satisfaction with current performance, perceived costs of improvement, and the amount of technological and organizational risk associated with proposed alternatives.

Decision Support Analysis Approach

Decision Support Analysis is designed to get DSS started quickly and to achieve the results described previously. There are five basic components of the DSA approach: structured interviews with management, decision analysis, data analysis, technical analysis, and management orientation. These processes and their intended results are illustrated in Figure 2.

A case study is presented to show the overall flow of the process. The approach presented here is primarily oriented toward planning large-scale institutional DSS in settings where the user community includes multiple managers in different business functions. In other settings, however, it has been tailored to the needs of single business functions and smaller organizations, to the technological sophistication of the organization, and to the scope of the DSS. In such cases, specific tasks


Figure 2. Decision Support Needs Assessment Process and Results

".. Hardware/Software I Usable/Available ....’:/

"....... Architecture I Information ......"

q Plans and Priorities

may be de-emphasized or eliminated in order to proceed more quickly and at lower cost.

The case study organization, which we will refer to as Image Technology Corporation (Imagtec), is a Fortune 100 manufacturer of consumer and industrial products. The corporation has a 500-person sales force in its U.S. and overseas subsidiaries. This sales force is managed from a central line organization supported by staff functions that perform forecasting, sales analysis, promotion analysis, market research, end consumer sales audits, product development, strategic planning, and advertising. The marketing function was being supported by a marginally functional set of transaction processing systems but had very little decision support. A number of sophisticated software development tools had been acquired at various times, but at the time of this study they were poorly utilized, and the resulting applications suffered from poor integration. The planning effort described here was initiated to develop a "blueprint" for a

worldwide marketing information and decision support system (WMIS) and to obtain the necessary management support. The analysis and planning described was conducted by a team of five analysts over a three-month period.

Structured Interviews

The process begins with interviews that allow managers to identify their critical needs, objectives, and priorities. When possible, interviews are conducted with senior management, systems management, and staff analysts.

The interview process at Imagtec included one- to two-hour interviews with more than forty line executives and staff professionals in the above categories, all of whom were involved with the production and use of marketing information. Given the time constraints, two things are key to maximizing the usefulness of the interviews. The first is the use of interview checklists to focus interviews.


Interview checklists should be designed in consultation with one or two key user managers to ensure that all important areas of activity are covered. Broad areas of coverage should include:

1. a brief description of the objectives, scope, and plan of the project;
2. a description of the project methodology;
3. major business objectives/priorities/decisions;
4. areas for improved decision support (manual or automated);
5. interfaces with other groups and organizations (internal and external);
6. projections of future needs;
7. policy issues such as authority, accountability, and degree of direct use;
8. feedback on the interviews.

The checklists should be used in short preliminary sample interviews with a few selected managers, and then revised based on feedback received. By necessity, these interviews will gather impressionistic data, and techniques such as rating scales and semantic differentials are useful in structuring managers’ reports of their perceptions and preferences. Aside from making quantitative analyses possible, the use of rating scales facilitates comparison of results from different respondents.

A useful technique in helping to establish priorities is to have the respondents rate lists of issues and/or functional features on two separate dimensions: perceived importance and perceived performance or satisfaction. High priority issues and features are those that are given high importance but low performance/satisfaction ratings.
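This importance-versus-performance rating scheme can be made concrete with a short calculation. The sketch below is our illustration, not part of the original study; the feature names and rating values are hypothetical, and the gap-score formula is one simple way (among many) to combine the two dimensions:

```python
# Prioritize candidate DSS features from two ratings per feature:
# perceived importance (1-7) and current performance/satisfaction (1-7).
# High-priority items score high on importance but low on performance.

ratings = {
    # feature: (importance, performance) -- hypothetical averages across respondents
    "scenario analysis":     (6.5, 2.0),
    "flexible reporting":    (5.8, 3.1),
    "automated forecasting": (6.1, 5.9),
    "graphics":              (3.2, 2.5),
}

def priority(importance, performance):
    """Gap score: importance weighted by the shortfall in performance."""
    return importance * (7 - performance)

# Rank features from highest to lowest gap score.
ranked = sorted(ratings.items(),
                key=lambda item: priority(*item[1]),
                reverse=True)

for feature, (imp, perf) in ranked:
    print(f"{feature:22s} priority = {priority(imp, perf):5.1f}")
```

On these hypothetical ratings, "scenario analysis" ranks first (important but poorly served today) while "automated forecasting" ranks last despite its high importance, because respondents are already satisfied with it.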

We have found that "generic" lists of issues and features are often of limited usefulness, since such lists necessarily include many extraneous items and omit many important ones for any specific application. Further, they usually have to be reworded to conform to the internal terminology of the organization. However, there may be some classes of applications that lend themselves well to such checklists. One example would be organizational cost management and budget analysis systems, which for most large firms would have similar characteristics, including large, multi-user systems with substantial hierarchical consolidations, financial modeling, automated forecasting, alternative scenario analysis, and flexible reporting. In most cases, however, the issues and functional features must be developed by the team conducting the decision support analysis along with sponsoring senior managers.

The second key to the usefulness of the interview results is the experience and skill of the interviewer. The interviewer should go into the interview with a clear understanding of those characteristics that mark a particular task or decision making activity as a prime candidate for decision support. Examples of these characteristics include:

1. labor intensive calculations are involved;
2. frequent iterations of the calculations are needed to reach consensus on plans;
3. multiple scenarios are required to evaluate uncertainty and form contingency plans;
4. a process that is highly judgemental and cannot be completely programmed;
5. coordination among numerous individuals, so that the DSS can provide structure and a common language to enhance consistency and communication;
6. a task or decision process that has senior management involvement, and thus high visibility and potential impact;
7. a task or decision process that is compartmentalized, which facilitates a phased implementation plan using prototypes;
8. an environment that involves situations of clarification.

The ability to quickly spot high potential DSS applications is important in directing interviews along fruitful paths. This helps uncover as much useful information as possible in the allocated time. The interviewer must walk a fine line, however, between direction that facilitates the manager’s identification of needs and putting words into the manager’s mouth. The interviewee should be allowed to deviate from the interview checklist to discuss issues and features that may be of particular importance in his or her situation which were not anticipated.

In the early stages of interaction with Imagtec management, the interview process found that important decisions in advertising, promotion, pricing, and distribution were often made without determining the interrelationships among these areas. Such decisions were often made without an assessment of competitive actions along these same dimensions. Some of these deficiencies were apparent to the Group Vice President for Marketing. The senior manager who had approved the decision support planning and analysis activity initially was himself skeptical and unwilling to go through the structured interview process. Gentle but persistent pressure by the DSS development team eventually convinced him that he should participate in the process as well. Once he became committed through personal involvement in the process, his apparent understanding of it and appreciation for its value increased dramatically.

Significant deficiencies surfaced in the process by which market research impacted product development and strategic competitive positioning decisions. Five separate, independent, and typically disparate forecasting processes operated in parallel to, and in competition with, each other. No process existed to rationalize the discrepancies for senior management. These findings suggested that decision process integration was to be a major objective of the proposed DSS. Feedback on the overall interview results was provided to each participating manager as soon as analysis had been completed by the DSS development team. Each manager was asked to critique the assessment derived from the interview process for his or her area of responsibility.

Decision Analysis

At the conclusion of the structured interviews, we develop a conceptual framework to guide the identification of DSS opportunities, system design, project management, and communication of priorities between users and system developers. We call the development of this conceptual framework Decision Analysis, and it is the next step in Decision Support Analysis. (Please note that our use of the term "Decision Analysis" does not refer to the narrow -- though well entrenched -- view which applies Bayesian statistical theory to analysis of decision problems.) The conceptual framework consists of the output from three tasks which are described below.

Business Area Analysis

The first step in Decision Analysis is completion of a business area analysis. This process studies representative business units to determine their decision support functional requirements. The analysis is closely tied to the existing organizational structure (as is the definition of the term "business area"). In one client situation, it might refer to corporate divisions, and in another, functional departments or even offices. This lets managers appreciate the extent to which each business unit has unique needs. It also allows for different levels of sophistication in the use of information technology. The result is a set of business area specifications that identify each group’s mission, system objectives, basic functions, shared data, internal data, and reports/analysis needed. These specifications are critiqued and approved by each manager before proceeding to the next step.

Description of Logical Functional Flow

Once the business area specifications have been developed from an organizational perspective, they are converted to functional flow diagrams. This involves hierarchical decomposition of the decision making activities of the business areas. The purpose of the hierarchical decomposition is the description of the logical relationships among the business functions. Any of a number of analytic methodologies may be used for this purpose (the Structured Analysis and Design Technique (SADT) developed by SofTech is illustrative in this regard -- see [5, 21, 22]). The primary objective of the functional flow diagrams is to quickly provide a structure for the more detailed analyses that follow, so that only one or two levels of detail are necessary.

One of the functional flow diagrams developed for the Imagtec marketing function is shown in Figure 3. This is an SADT-style diagram. A business function is defined as a group of logically related activities (decisions or tasks) required to manage the resources of the business. Thus, at the highest level, four major business functions were identified for the


Figure 3

[Figure 3 diagrams (OVERALL CONTEXT and DETAILED FUNCTIONAL FLOW): the context diagram relates the planning, forecasting, sell-in, and sell-through functions to manufacturing, corporate assumptions, external factors, retail sales, supplies and materials, and consumer needs and preferences; the detailed flow diagram traces sales history, market research, plan revisions, and the budget to track against through to the final forecast shared among marketing, manufacturing, and finance.]


Imagtec marketing function: planning, forecasting, sell-in, and sell-through. Manufacturing is shown on the diagram because of its interface with the marketing function at a central point in the ongoing process. The planning function is responsible for directing all of the marketing functions. The forecasting function is responsible for developing the shorter term, more quantitative plans that direct current operations. Manufacturing converts the plan into product. The sell-in process is concerned with moving inventory from Imagtec warehouses to the dealers’ warehouses (i.e., "selling into" the dealer network). Finally, sell-through is concerned with moving the product from the dealer to the customer (i.e., "selling through" the dealer network to the consumer marketplace).

The logical relationships among functions may be of two kinds. One type of relationship is the subordinate relationship, which identifies the specific activities that make up a higher level function. In looking down the hierarchy from any given function, the subfunctions describe how to accomplish the function. On the other hand, looking up the hierarchy from any given function shows why that function is performed. Thus, in Figure 3, the overall forecasting function is broken down into three subprocesses: sales analysis, estimation of sales potential, and sales/production review.

The second type of relationship among functions is shown in Figure 3 by arrows denoting major information flows. These are used both to indicate the major information needs and outputs of any given function and also to establish sequential dependencies among functions.
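These two kinds of relationship -- subordinate (how/why) and information flow -- can be sketched as a small data structure. The sketch below is our illustration, not the authors' notation; the class design is an assumption, and only the function names come from the forecasting example in the text:

```python
# A minimal sketch of functional decomposition: each function knows its
# parent (looking up the hierarchy answers "why is this performed?") and
# its subfunctions (looking down answers "how is it accomplished?"),
# plus named information flows that create sequential dependencies.

class Function:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent          # looking up: "why"
        self.subfunctions = []        # looking down: "how"
        self.inputs = []              # information flows consumed
        self.outputs = []             # information flows produced
        if parent is not None:
            parent.subfunctions.append(self)

# The forecasting function and its three subprocesses from Figure 3.
forecasting = Function("Forecasting")
sales_analysis = Function("Sales Analysis", parent=forecasting)
sales_potential = Function("Estimation of Sales Potential", parent=forecasting)
review = Function("Sales/Production Review", parent=forecasting)

# A hypothetical information flow: the review depends on the analysis output.
sales_analysis.outputs.append("sales history analysis")
review.inputs.append("sales history analysis")

print([f.name for f in forecasting.subfunctions])  # "how" Forecasting is done
print(review.parent.name)                          # "why" the review is performed
```

A real DSA exercise would of course capture many more attributes per function, but even this skeleton shows how the how/why navigation and the flow dependencies fall directly out of the hierarchy.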

The discipline behind the techniques used to develop the diagram may have a further purpose. It may highlight development opportunities that will emerge in the future. Such hierarchical descriptions may eventually be the requisite inputs to computer aided software engineering systems (sometimes referred to as automatic programmers). These can produce at least "first draft" computer code directly from the graphical descriptions. The CAD/CAM (computer aided design/computer aided manufacturing) tradition is now spawning a whole new prototypical generation of these systems development capabilities.

The functional flow diagrams also provide a context for other types of analysis and user communication. In combination with priority ratings given by managers, they can be used to highlight the most important functions to support first. The diagrams are also used to classify and categorize information needs, and to guide system design and project management.

Specification of Detailed Decision Areas

The final step of Decision Analysis is decision identification and classification. Understanding decision domains allows us to effectively plan decision support. This simple observation is occasionally forgotten. A solid basis for prioritizing DSS development is built by undergoing a formal process to identify the organization’s major regular and ad hoc decisions. Decisions can be analyzed in terms of their complexity, frequency, level of detail, time horizon, accuracy requirements, information sources, and the scope of their information requirements.
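The classification dimensions listed above lend themselves to a simple record structure. The sketch below is our own hypothetical illustration of how one decision might be catalogued; the field names mirror the dimensions in the text, and the example decision and its values are invented:

```python
from dataclasses import dataclass, field

# One record per identified decision, classified along the dimensions
# named in the text: complexity, frequency, level of detail, time horizon,
# accuracy requirements, information sources, and scope.

@dataclass
class DecisionArea:
    name: str
    complexity: str               # e.g. "low", "medium", "high"
    frequency: str                # e.g. "weekly", "quarterly", "ad hoc"
    level_of_detail: str
    time_horizon: str
    accuracy: str
    information_sources: list = field(default_factory=list)
    scope: str = "single business function"

# A hypothetical catalogued decision (not from the Imagtec study).
promo_budget = DecisionArea(
    name="Promotion budget allocation",
    complexity="high",
    frequency="quarterly",
    level_of_detail="product line",
    time_horizon="1 year",
    accuracy="order of magnitude",
    information_sources=["sales history", "promotion history"],
)

print(promo_budget.name, "-", promo_budget.frequency)
```

Cataloguing each major decision this way gives the analysis team a uniform basis for comparing and prioritizing candidate decision areas.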

One result of the functional flow diagrams is the identification, at the lowest level of the hierarchy, of detailed business functions or decision areas that offer distinct opportunities for decision support. As mentioned above, these lowest level functions should be specified so that they can serve as potential candidates for prototype systems. This means that each should describe a relatively modular task of modest scale that nevertheless has significant impact upon the success of a higher level function.

For the marketing function at Imagtec, a list of 16 detailed functions was derived; three of these are shown in the bottom diagram in Figure 3. A list of sample questions is then derived for each decision area (see Figure 4) that reflects the highest priority information needs. These questions are later used as objectives to guide system design.

Data Analysis

The next step in DSA is the identification and description of the classes of data used by the

MIS Quarterly/June 1986 167


Figure 4. Examples of Key Questions for Sales Production Review

2.3 Hold Sales/Production Review (FORECAST)

1. Compare Alternative Forecasts
   • How does marketing's forecast compare to manufacturing's estimate of sales?
   • How does marketing's forecast compare to finance's estimate?
   • Is there a consensus on the underlying assumptions?
   • How does the current forecast compare to the annual unit plan?

2. Issue Identification/Resolution
   • Does manufacturing have the appropriate level and mix planned?
   • Is there a consensus on the forecast?
   • Are there products for which we have excessive or insufficient inventory to meet expected demand?
   • Are there capacity constraints that will limit sales? If so, what is the time/cost to increase capacity?

3. Risk Evaluation
   • What is the range of sales potential, and where do the marketing and manufacturing sales forecasts fall within that range?
   • What are the consequences of various levels of over/under forecasting?
   • What can be done to minimize the risk?

functions. This is done through analysis of the functional flow diagrams. The purpose of Data Analysis is to identify commonalities in information requirements and usage among the decision areas. It also allows us to derive design requirements for application databases. These purposes are accomplished through: (1) data classifications to categorize variables of interest for the managers' interviews (e.g., product pricing, inventory levels, competitors' market share), and (2) dimensional representations developed to structure different views of these variables by time, product, market, etc.

For the Imagtec marketing function, 31 different classes of data were identified and described. Detailed elements were documented for each class of data. One of the outputs from this stage was a chart detailing the relationship between the data classes and the decision areas. The Data and Function Usage Chart for the Imagtec marketing function is shown in Figure 5. For each data class and decision area the chart shows whether the function uses (U), creates (C), or both uses and creates (B) the data. This chart provides insight into the most frequently used data and the most data-intensive decision areas (e.g., estimation of sales potential in forecasting). In this case the chart clearly suggests the need for an integrated database: all of the data is used by more than one decision area. Thus, a central set of management data should be used to eliminate the need to re-enter or recreate data for different applications.
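A sketch of how such a usage chart can be checked mechanically for shared data classes. The matrix fragment below is invented for illustration; the full Imagtec chart covered 31 data classes and 16 decision areas:

```python
# Hypothetical fragment of a Data and Function Usage chart:
# rows are decision areas, columns are data classes, and each cell
# is "U" (uses), "C" (creates), or "B" (both uses and creates).
usage = {
    "Estimate Sales Potential":     {"ACTUALS": "U", "PRODUCT": "U", "PLANS": "B"},
    "Hold Sales/Production Review": {"ACTUALS": "U", "PLANS": "U"},
    "Develop Long Range Plans":     {"PRODUCT": "U", "PLANS": "C"},
}

def shared_data_classes(chart):
    """Data classes referenced by more than one decision area --
    the evidence that argues for an integrated, central database."""
    counts = {}
    for cells in chart.values():
        for data_class in cells:
            counts[data_class] = counts.get(data_class, 0) + 1
    return {dc for dc, n in counts.items() if n > 1}

print(sorted(shared_data_classes(usage)))  # → ['ACTUALS', 'PLANS', 'PRODUCT']
```

In this toy fragment every data class is shared, mirroring the Imagtec finding that all of the data was used by more than one decision area.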

At Imagtec, the marketing function managers had grown weary of a data processing system which overwhelmed them with their own data. The marketing managers participating in this study came to realize the magnitude and complexity of the data resources needed to perform their functions, and thus understood why they currently had little or no reasonable basis for effective quantitative analy-


Figure 5. Data and Function Usage

Legend: C = create, U = use, B = both

[The original chart is a matrix of the sixteen marketing decision areas (rows, grouped into the phases PLAN, FORECAST, SELL-IN, and SELL-THROUGH) against the classes of data (columns: GEOGRAPHY, ACTUALS, EXTERNAL PLANS AND DATA, PRODUCT, PREDICTIONS/DIRECTIONS, and PLANS), with each cell marked C, U, or B. The decision areas are: Assess Markets and Industries; Set Product/Marketing Direction; Research New Concepts/Products; Develop Long Range Plans; Sales Analysis; Estimate Sales Potential; Hold Sales/Production Review; Maximize Sales Effectiveness; Define Promotions and Programs; Sell to Dealer; Distribute Product; Track the Sales Force Program; National Dealer/Co-op Advertising; Sell to Consumers; Retail Market Analysis; Conduct Market Research. The individual cell entries are not recoverable from this copy.]


sis. The prior existence of sophisticated DSS software was of little help. A few well positioned prototypes demonstrating some interactive database retrieval capabilities, as well as analysis of sales accounts, resulted in a broad management/end-user support base.

One of the key elements in effective DSS is a set of well designed, multidimensional data structures that allow alternative views of important business variables. Research confirms that end users perceive the need for such capabilities [10]. Different responsibilities imply different views of strategic and operational information. At the same time, effective communication between managers requires consistent procedures for structuring and aggregating information across several dimensions. Some of the keys to effective design of the multidimensional data structure include:

1. Easy multidimensional access -- Each manager and analyst must be able to access information at different levels using consistent procedures and commands.

2. Easy restructuring of information -- Dimensions change over time. Product lines and markets are combined, added, and dropped, as are entire businesses and subsidiaries. The user should have the ability to restate and/or recalculate historical and forecasted information in terms of new dimensions or new values on existing dimensions.

3. Manageable dimensionality -- Humans can only handle a certain level of complexity in information. Although computers can be made to deal with unlimited dimensionality in a database, we find that for managers using the database, an upper limit on the number of usable dimensions may be in the range of five to seven in a single data structure. A typical example would include the dimensions of time, product, market, division, and geographical region for actual and forecast data. Managers sometimes find it useful to picture multidimensional databases in terms of a series of information "cubes."

4. Use of "information bases" rather than databases whenever possible -- We distinguish between information bases and transaction-processing databases in the following ways. (1) Information bases are relatively small, and usually highly aggregated, reflecting the fact that managers usually overestimate the amount of transaction detail required to satisfy their query and analysis needs. (2) Information bases have a much greater orientation toward future time periods than transaction-processing databases, which tend to focus on recent history. (3) Information bases provide value-added information through the addition of appropriate external data. (4) Information bases are optimized for efficient access and analysis, rather than for efficient updating and storage. (5) Information bases emphasize multiple scenarios and alternative views rather than consistency and completeness. (6) Information bases are constantly evolving.
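A minimal sketch of an information "cube" of the kind pictured above, stored sparsely and accessed with one consistent procedure regardless of which dimension is fixed. The three dimensions, member names, and figures are all invented for illustration:

```python
# Sparse "information cube": one value per coordinate on the
# (time, product, market) dimensions. All data is hypothetical.
cube = {
    ("1986-Q1", "Copier-A", "Northeast"): 1200,
    ("1986-Q1", "Copier-A", "Southeast"): 950,
    ("1986-Q2", "Copier-A", "Northeast"): 1340,
}

def slice_cube(cube, time=None, product=None, market=None):
    """Return the sub-cube matching the fixed dimension values --
    the same access procedure serves every alternative view."""
    return {
        k: v for k, v in cube.items()
        if (time is None or k[0] == time)
        and (product is None or k[1] == product)
        and (market is None or k[2] == market)
    }

# One view: total units across all periods for the Northeast market.
northeast = slice_cube(cube, market="Northeast")
print(sum(northeast.values()))
```

Restructuring (point 2 above) then amounts to remapping the coordinate tuples, and the five-to-seven dimension limit (point 3) is simply the length of the key.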

For most DSS applications it is more cost effective to provide managers with interactive access to several small, highly summarized extracts providing alternative views of the data, rather than to provide access to large transaction files. This also helps to ensure the security and integrity of transaction files and databases. The information bases will satisfy 90-95% of users' queries, and the remainder can be processed via batch procedures.

The ultimate architecture for the integrated marketing databases at Imagtec took on the following characteristics. Using "inverted list" technology, simple databases were constructed. A series of databases were implemented, the largest of which contained relevant sales information for every product/customer combination in the prior 24 to 26 months. This file was accessed only on an exception basis, after higher-level summary files (i.e., information bases) had enabled an area of interest to be identified.
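The exception-basis pattern described above, with a summary information base consulted first and an inverted list guiding any drill-down into the detail file, might be sketched as follows. The sample records and the threshold are invented, and this dict-based index only gestures at what a real inverted-list database does:

```python
from collections import defaultdict

# Detail file: one record per product/customer/month (hypothetical data).
transactions = [
    {"product": "Copier-A", "customer": "Acme", "month": "1986-05", "units": 40},
    {"product": "Copier-A", "customer": "Bolt", "month": "1986-05", "units": 15},
    {"product": "Copier-B", "customer": "Acme", "month": "1986-05", "units": 22},
]

# Inverted lists: for each attribute value, the positions of matching
# records, so a detail lookup touches only the relevant rows.
inverted = defaultdict(list)
for i, rec in enumerate(transactions):
    inverted[("product", rec["product"])].append(i)
    inverted[("customer", rec["customer"])].append(i)

# Summary "information base": monthly units by product, consulted first.
summary = defaultdict(int)
for rec in transactions:
    summary[(rec["product"], rec["month"])] += rec["units"]

# Exception-basis drill-down: only after the summary flags an area of
# interest (assumed threshold of 50) is the detail file consulted.
if summary[("Copier-A", "1986-05")] > 50:
    detail = [transactions[i] for i in inverted[("product", "Copier-A")]]
    print(len(detail), "detail records examined")
```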

The basis for constructing these critical summary files came from the decision and data analyses conducted with each marketing manager. Thus, each manager's own perspective on his or her business was exactly the view used to access the sales data. Finally, menu-based dialogues were constructed and data dictionaries were built to ensure


Figure 6. Imagtec DSS Architecture Summary

[The original figure is a block diagram of the Imagtec DSS architecture. Recoverable component labels include: relational management queries; restatement of historical data; extract, aggregate, and control functions; statistical analysis; presentation-quality graphics; a natural language interface; structured applications; end-user computing; a data dictionary; and interfaces to conventional languages (COBOL, FORTRAN, etc.). The block layout and connections are not recoverable from this copy.]


ease of use and flexibility in building and accessing these critical tools (see Figure 6).

Technical Analysis

Technical Analysis translates needs identified in the previous stages into a proposed system design with technical requirements for hardware and software. The results of this process will, of course, vary for each application. However, we have found that many issues appear time and again. These are described below in the context of the Imagtec environment.

There are a number of technical performance issues relative to the design, implementation, deployment, and operational use of the Imagtec systems. Some of the technical issues are functional in nature; many of them have important performance implications. The Imagtec marketing DSS had to provide considerably expanded functional capabilities over existing systems with acceptable performance parameters; achieving adequate performance turned out to be both complex and resource consuming.

A number of these technical requirements included:

• Increased access to information, including the ability for multiple users to remotely and interactively review and analyze information.

• More timely delivery of information from operational systems to DSS end users; this capability, of course, depends upon the originating source of the data and its ready availability to the DSS.

• Improved modeling and simulation capabilities to fully support expanded modeling, "what-if" and statistical analysis.

• Interactive development of ad hoc database and information base extracts and reporting.

• A high degree of overall systems reliability, including that of the host hardware, telecommunications, and support software, as well as that of the DSS support routines and application modules.

• Comprehensive security and integrity mechanisms which are effective to protect individual and group activities without being unduly awkward to use or generating excessive system overhead.

• Dual modes of DSS user interface, to enable novice users to operate in a fully prompted mode and expert users to bypass unneeded prompts and operate in a concise prompt or command driven mode.

• Display of data/information in a variety of formats: on screen, hard copy from screen, batch system or multi-copy, multi-destination routing, and routing to remote devices such as local area networks (LANs), "gateways," intelligent terminals, and personal computers.

• Choice of terminal devices to include online access via either a full-screen video/keyboard device or a line-by-line hardcopy device, at user option, with leased line and dial-up methods of access.

• Flexible and convenient access to personal databases for personal experimentation, with ease of creating, modifying, deleting, and/or sharing the personal database.

• Appropriate use of batch processing, rather than online interactive processing, where the amount of processing, cost factors, and user response requirements dictate.

Once an initial system design architecture has been chosen, it must be translated into a set of requirements to guide software evaluation and selection. A summary list of features to consider in a DSS development language, for example, is shown in Figure 7. The requirements drawn up must address technical issues relative to the design, implementation, deployment, and operational use of the DSS. Many of the technical issues are functional in nature, and have important performance implications. Some of those listed in Figure 7 for the Imagtec DSS explicitly appeared in the user needs assessment (e.g., "what if" analysis capabilities), whereas others had to be inferred (e.g., the need for multidimensional data access). In addition to functional features, the important issues of implementation and ongoing support of any DSS should be reflected in characteristics such as the syntactic complexity and readability of the command language (which affects end user orientation and programmer productivity), security considerations [16], and the types of training and support offered by the vendor. A fuller discussion of other software features relevant to Technical Analysis of DSS is given in [14].
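One common way to turn evaluation criteria such as those in Figure 7 into a selection decision is a weighted scoring model. The article does not prescribe one, so the criteria weights, candidate tools, and scores below are purely illustrative assumptions:

```python
# Hypothetical weighted-scoring sketch for comparing candidate DSS tools.
# Weights reflect the relative importance assigned to each criterion
# and are assumed, not drawn from the Imagtec case.
criteria_weights = {
    "modeling": 0.30,
    "data_management": 0.25,
    "ease_of_use": 0.25,
    "vendor_viability": 0.20,
}

# Each tool is scored 1 (poor) .. 10 (excellent) on each criterion.
tool_scores = {
    "Tool-X": {"modeling": 9, "data_management": 6,
               "ease_of_use": 5, "vendor_viability": 8},
    "Tool-Y": {"modeling": 6, "data_management": 8,
               "ease_of_use": 9, "vendor_viability": 7},
}

def weighted_score(scores, weights):
    """Sum of criterion scores, each scaled by its weight."""
    return sum(scores[c] * w for c, w in weights.items())

ranking = sorted(tool_scores,
                 key=lambda t: weighted_score(tool_scores[t], criteria_weights),
                 reverse=True)
print(ranking[0])
```

In practice a requirement like security or host-hardware support may be a hard constraint rather than a weighted criterion, so failing tools would be eliminated before scoring.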


Figure 7. Sample Criteria For Evaluating DSS Tools

A. FUNCTIONS AND FEATURES
1. MODELING -- able to calculate with the information in the system, do optimization, "what-if" analysis
2. PROCEDURALITY -- ability to solve equations independent of their ordering, symbolic reference of data
3. DATA MANAGEMENT -- number of dimensions, handling of sparse data, ad hoc inquiry
4. REPORT GENERATOR -- ability to produce high quality formal reports quickly and easily
5. GRAPHICS -- line, pie, bar, quality of output
6. STATISTICS & ANALYSIS -- descriptive statistics, regression, significance tests
7. PROJECT MANAGEMENT -- PERT/CPM, multi-level work breakdown structure
8. OPERATIONS RESEARCH -- linear, integer, dynamic programming
9. FORECASTING & ECONOMETRICS -- time series analysis, seasonalization, smoothing
10. EXTERNAL DATABASES & INTERFACES
11. SECURITY -- database, file, model, class of user

B. EASE OF USE
1. END USER -- analysis performed directly by the person who needs the information
2. PROGRAMMER/ANALYST -- interested in the quality of the editor, data management, report writer, etc.
3. AD HOC INQUIRY -- end user answering questions for which no standard report is available

C. FACILITIES
1. DOCUMENTATION -- for user, programmer, operations
2. TRAINING -- novice/advanced, systems/user
3. SUPPORT -- consultant, hot line
4. HOST HARDWARE -- computers supported
5. OPERATING ENVIRONMENT -- operating systems, disk requirements, etc.
6. AVAILABILITY IN-HOUSE & ON TIMESHARE

D. MARKET POSTURE
1. PRICING -- lease, rent, purchase
2. INSTALLATIONS -- number of users, length of use
3. TARGET MARKET -- type of business actively pursued by the vendor
4. PLANS -- commitment to DSS as a business area, amount of R&D
5. USER PERCEPTIONS -- degree of use and support, functions used
6. VENDOR VIABILITY -- size of company, revenues, etc.


Management Orientation

User needs assessment should serve to educate the DSS developer about the types of systems to build and how they will be used. The needs assessment process, along with Management Orientation, helps to educate potential users and other managers about the concept of DSS and what they can realistically expect the proposed system to do for them. Further, Management Orientation should indicate how the DSS should be effectively used. The educational needs of users depend upon their previous experience with computers and how they plan to use the proposed system (either directly or through intermediaries). Regardless of the exact topics covered, the basic objectives of the Management Orientation process are to promote:

1. information sharing among developers and users,
2. attitude change among users and developers,
3. skill building among the DSS development team members, and
4. actions motivated to ensure commitment to development and use.

The Management Orientation process at Imagtec took place in three steps. First, the managerial community (about 40 line managers and staff professionals) was briefed in detail on the specific design characteristics and utilization implications for each of their functional areas. Second, senior marketing management and information systems management were briefed on the organizational impacts and the benefits and costs of implementation of the system. These two sets of orientations were conducted by the development team. Finally, after an in-depth briefing, the Chief Operating Officer of Imagtec presented implementation recommendations for final approval to the Board of Directors. The Board accepted the plan and implementation was begun.

Discussion

By following an effective Decision Support Analysis process, the organization can more quickly and confidently proceed with DSS development. Management priorities are known.

User needs have been analyzed, documented, and structured in an overall framework. Data requirements are understood and integrated within that framework. Technical plans have been developed to create an appropriate environment for system development, and criteria have been established to guide appropriate technology selection. Most importantly, manager involvement and education provide direction, build support, encourage the acceptance of appropriate responsibilities, and create realistic expectations.

These results of Decision Support Analysis guide the DSS development process through detailed design, prototype development, and full-scale operational deployment. The plans and priorities established in these initial stages, together with careful project management during construction and implementation, help to guarantee a system that will effectively support users in high priority tasks.

In the Imagtec case study, a DSS development plan was created for 16 major business functions within 10 functional areas of the marketing organization in three months (less detailed planning, using the same DSA approach for smaller organizations, has been completed in six weeks). The identification of priority projects proved, in this case, to be critical because funding only allowed development to proceed in certain areas.

The DSS development manager at Imagtec used the overall plan to identify potential high-payoff projects for development, to place them in context, and to ensure integration of the separate efforts. The use of this particular approach to analysis and design at Imagtec also served other very practical purposes. The marketing information systems organization was initially viewed as a group of systems engineers whose job it was to analyze, design, and implement systems. DSA allowed that role to be expanded into a partnership with key marketing managers in the identification of critical areas of improvement in the decision making process. This was initiated at the middle management level, not as a top executive Critical Success Factors study, but simply as a user needs assessment aimed at producing more useful information for marketing managers.


The process was able to identify critical needs, achieve a consensus on priorities, and ultimately received funding from senior management. It was even able to withstand some fairly critical organizational changes. Shortly after the DSS planning process was completed, the senior manager who sponsored the study was reassigned to an entirely different project activity. This might have led to complete abandonment of the DSS development project were it not for the then-existing shared perspectives and goals of Imagtec marketing managers, as well as careful documentation of objectives, approach, benefits, and impacts.

The resulting architecture and individual project plans have continued with varying degrees of success. The implementations of prototypes for marketing managers most closely associated with the project have been by far the most successful. The overall objective of marketing data integration has suffered due to the discontinuity of management sponsorship. However, a marketing organization once overwhelmed with data has some useful information and increased confidence in an information systems organization and its new technologies and methodologies. Most importantly, it has a plan and an architecture that in time can bring more information to bear on its critical marketing decisions.

These benefits, of course, have their associated costs. The decision to perform Decision Support Analysis adds up-front costs to the development effort, and delays the development of prototypes. In the case of Imagtec, for example, four man-months of effort over three calendar months were required. This may seem antithetical to the DSS philosophy of quick prototype development. However, we observe in many cases that large-scale, institutional DSS such as the Imagtec application resemble large-scale transaction processing applications in that the costs of initial analysis are usually more than offset by a reduced risk of project failure, and by lower costs in all other phases of development. Large-scale, institutional DSS represent a hybrid case somewhere between personal DSS and large transaction processing systems, and the approaches to analysis and design should reflect this. In the Imagtec setting, it was possible to start implementation of prototypes in some areas before Decision Support Analysis was completed.

Conclusion

It has been suggested that there are natural stages of evolution and maturity in the use of organizational information systems. Some suggestive anecdotal evidence for such conclusions has been presented. It has often been observed (or at least argued) in biology that "ontogeny recapitulates phylogeny" -- that is, particular organisms, in their development, go through stages that resemble some of their ancestral forms [24]. Perhaps DSS development methodology will (or should) recapitulate some of its own predecessors by implementing some of the lessons learned and techniques invented by those innovators who have contributed to the improvement and relevance of information systems development and utilization for a wide range of earlier, larger, and more traditional application domains.

We note that many instances of important evolutionary progress in new fields have arisen from a combination of judicious use of good ideas from prior disciplines and a willingness to abandon those ideas that don't apply to the new endeavors. We don't necessarily advocate the particular approach to Decision Support Analysis that has been described in this article for all large-scale DSS development efforts, but we do believe that the use of an explicit, consistent, and repeatable methodology has value in structuring specific development projects as well as institutionalizing successful practices. Improvements from an established baseline may be easier and more relevant, as well as more effective, than always starting from scratch.

Acknowledgement

The methodology presented and the case cited are based on client activities at Research and Planning, Inc.

References

[1] Alter, S.L. Decision Support Systems: Current Practice and Continuing Challenge, Addison-Wesley, Reading, Massachusetts, 1980.


[2] Bennett, J.L. Building Decision Support Systems, Addison-Wesley, Reading, Massachusetts, 1983.

[3] Berrisford, T. and Wetherbe, J. "Heuristic Development: A Redesign of Systems Design," MIS Quarterly, Volume 3, Number 1, March 1979, pp. 11-19.

[4] Boehm, B. "Quantitative Assessment," Datamation, Volume 19, Number 5, May 1973, pp. 49-57.

[5] Constantine, L. and Yourdon, E. Structured Design, Prentice-Hall, Englewood Cliffs, New Jersey, 1979.

[6] Donovan, J.J. and Madnick, S.E. "Institutional and Ad Hoc Decision Support Systems and Their Effective Use," Working Paper CR14-27, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1977.

[7] Hamilton, M. and Zeldin, S. "The Functional Life Cycle Model and Its Automation: USE.IT," Journal of Systems and Software, Volume 3, Number 1, March 1983.

[8] International Business Machines Corporation. "Information Systems Planning Guide," White Plains, New York, 1978.

[9] Keen, P.G.W. and Scott Morton, M.S. Decision Support Systems: An Organizational Perspective, Addison-Wesley, Reading, Massachusetts, 1978.

[10] Kuhn, T.S. The Structure of Scientific Revolutions, The University of Chicago Press, Chicago, Illinois, 1962.

[11] Little, J.D.C. "Models and Managers: The Concept of a Decision Calculus," Management Science, Volume 16, Number 8, April 1970, pp. B466-B485.

[12] McKeen, J.D. "Successful Development Strategies for Business Applications Systems," MIS Quarterly, Volume 7, Number 2, September 1983, pp. 47-59.

[13] Meador, C.L., Guyote, M.J., and Keen, P.G.W. "Setting Priorities for DSS Development," MIS Quarterly, Volume 8, Number 2, June 1984, pp. 117-129.

[14] Meador, C.L. and Mezger, R.A. "Selecting an End User Programming Language for DSS Development," MIS Quarterly, Volume 8, Number 4, December 1984, pp. 267-281.

[15] Meador, C.L. and Ness, D.N. "Decision Support Systems: An Application to Corporate Planning," Sloan Management Review, Volume 14, Number 2, 1974, pp. 51-68.

[16] Meldman, J.A. "SMR Forum: Educating Toward Ethical Responsibility in MIS," Sloan Management Review, Volume 23, Number 2, Winter 1982, pp. 73-75.

[17] Ness, D.N. "Interactive Systems: Theories of Design," paper presented at the Wharton Office of Naval Research Conference on DSS, University of Pennsylvania, Philadelphia, Pennsylvania, November 4-7, 1975, pp. 1-24.

[18] Nolan, R.L. "Managing the Crisis in Data Processing," Harvard Business Review, Volume 57, Number 2, March-April 1979, pp. 116-118.

[19] Nolan, R.L. and Norton, D.P. "Recharter DP to the Advance Stages," Stage by Stage, Volume 2, Number 3, 1984, pp. 1-5.

[20] Rockart, J.F. "Chief Executives Define Their Own Data Needs," Harvard Business Review, Volume 57, Number 2, March-April 1979, pp. 81-93.

[21] Ross, D.T. and Schoman, K.E. "Structured Analysis for Requirements Definition," IEEE Transactions on Software Engineering, Volume SE-3, Number 1, January 1977, pp. 6-15.

[22] Rudkin, R.I. and Shere, K.D. "Structured Decomposition Diagram: A New Technique for System Analysis," Datamation, Volume 25, Number 11, October 1979, pp. 130-146.

[23] Shooman, M.L. and Bolsky, M.I. "Types, Distribution, and Test and Correction Times for Programming Errors," International Conference on Reliable Software Proceedings, Los Angeles, California, 1975, pp. 347-357.

[24] Simon, H.A. The Sciences of the Artificial, MIT Press, Cambridge, Massachusetts, 1969.

[25] Sprague, R.H., Jr. and Carlson, E.D. Building Effective Decision Support Systems, Prentice-Hall, Englewood Cliffs, New Jersey, 1982.

[26] Urban, G.L. "Building Models for Decision Makers," Interfaces, Volume 4, Number 3, May 1974, pp. 1-11.

[27] Zmud, R.W. "Management of Large Software Developments," MIS Quarterly, Volume 4, Number 2, June 1980, pp. 45-55.


About the Authors

C. Lawrence Meador is Chairman of the consulting firm Decision Support Technology, Inc. and serves on the Board of Directors of Software Productivity Research, Inc. He received his graduate degrees from the Sloan School of Management and the School of Engineering, MIT. He is also on the academic staff of MIT, where he has held various teaching and research appointments for the past twelve years. At MIT he earlier served as a founding member of the Clinical Decision Making Group at the Laboratory for Computer Science, and was Assistant Director of the Sloan School Center for Information Systems Research. His consulting experiences have focused on decision support and knowledge-based systems, with particular emphasis on strategic and tactical planning applications. His current research is concerned with future DSS technologies and their impacts, planning large-scale DSS environments, and software technology evaluation and development. He is an editor of the international journals Computer Communications and Comunicacion e Informatica.

Martin J. Guyote is a Senior Consultant and Product Manager with Decision Support Technology, Inc. He received his M.S. and Ph.D. degrees in Cognitive Science at Yale University and his M.S. degree in Management from the Alfred P. Sloan School of Management at MIT, and has been on the faculty of Boston University. His research and consulting has included the areas of decision support systems; software productivity, quality, and reliability; organizational decision making; and organizational psychology. His activities include hardware and software evaluation, decision analysis, user needs assessment, DSS design, development, and implementation, and research into the issues of DSS and artificial intelligence. He has contributed book chapters and articles to several industry and academic journals.

William L. Rosenfeld is Vice President of Research and Planning, Inc. He received his B.A. in Mathematics and his M.S. in Computer Systems from the State University of New York at Binghamton. He has managed and implemented a number of decision support systems, including a model used to evaluate the cash flow/profitability of R&D programs as they progress through the project life cycle, an operational expense planning model used to develop and analyze estimates of expenses along several dimensions, and the refinement of a large model that supports financial evaluation of strategic alternatives and scenarios. Prior to his current job, he was a consultant for SofTech, Inc., where he managed the specification, design, and implementation of their in-house labor accounting systems. He also planned and developed a procurement management system for the Department of Energy.
