
Distance Education, Vol. 24, No. 1, 2003

Towards Identifying Factors Underlying Readiness for Online Learning: An Exploratory Study

Peter J. Smith, Deakin University, Australia

Karen L. Murphy, Texas A&M University, USA

Sue E. Mahoney, University of Houston, USA

ABSTRACT To test the potential value of McVay’s (2000) Readiness for Online Learning questionnaire for research and practice, the instrument was administered to 107 undergraduate university students drawn from a range of courses in the United States and Australia. The questionnaire was subjected to a reliability analysis and a factor analysis. The instrument fared well in the reliability analysis, and yielded a two-factor structure that was readily interpretable in a framework of existing theory and research. Factors identified were “Comfort with e-learning” and “Self-management of learning.” It is suggested that the instrument is useful for both research and practice, but would be enhanced through further work on 5 of the 13 items. Additionally, further work is required to establish predictive validity.

Introduction

Previous Research Findings

The notion of readiness for online learning among students has been developed in some detail by Warner et al. (1998) in their study within the Australian vocational education and training sector. In that study Warner et al. defined the notion of readiness for online learning in terms of three aspects: (a) students’ preferences for that form of delivery as opposed to face-to-face classroom instruction, or the provision of print-based pre-packaged resource materials; (b) student confidence in using electronic communication for learning and, in particular, competence and confidence in the use of the Internet and computer-mediated communication; and (c) ability to engage in autonomous learning. The Warner et al. study provided some very sobering findings for educators who are committed to online learning, in so far as the large sample (542) of students in their study was typified by a low preference for online learning, low confidence, little experience with the Internet for learning or with computer-mediated communication (CMC), and poor skills for the sort of self-direction required for autonomous learning in any setting, not least an online learning setting.

ISSN 0158-7919 print; 1475-0198 online/03/010057-11 © 2003 Open and Distance Learning Association of Australia, Inc.

DOI: 10.1080/0158791032000066525

In working with vocational learners, Smith (2001a) identified a lack of readiness for flexible learning in its broader sense of including, but not being limited to, online learning, and specifically identified the need for learners to move towards a position where they could:

• use past experience to develop new learning;
• be motivated by intrinsic factors in favour of extrinsic motivation;
• set their own goals for learning;
• evaluate and monitor their own learning;
• adopt a problem-solving approach;
• select their own learning strategies and learning materials.

In a factor analytic study of the learning preferences of 1,252 vocational students, Smith (2000) showed that two major factors could be identified through the Canfield Learning Styles Inventory (Canfield, 1980). One of these factors identified the comfort with which students could engage with learning sequences that were presented verbally (text or listening) as opposed to sequences that were largely non-textual (direct experience, observation, practice). The second factor was identified as being associated with autonomous learning and self-direction. The vocational students in Smith’s (2000) large sample were typified as being non-verbal learners, and non-self-directed. He drew attention to the challenges that these findings provide for education and training designs that rely on resource-based learning, either online or in print and audio materials. In a later study Smith (2001b) showed that university undergraduate students exhibited much the same characteristics within the same identifiable two-factor space.

The identification of dimensions associated with comfort with the instructional delivery and resources, and with autonomy in learning, is also resonant with work by Riding and his colleagues in the United Kingdom. Riding and Cheema (1991) have attempted to integrate the many conceptualizations of “learning style” and “cognitive style,” and have developed a two-dimensional model of cognitive style. In that model, one dimension is conceptualized as Wholist–Analytic, and the other as Verbalizer–Imager. Riding and Cheema (1991) and Riding and Sadler-Smith (1992) have suggested that the Field-dependence/Field-independence dimension (Witkin et al., 1977) is a label used “within the Wholist–Analytic Cognitive Style family” (Riding & Sadler-Smith, 1992, p. 324), with Field-dependents lying within the Wholist category.

Riding and Sadler-Smith (1992) and Sadler-Smith and Riding (1999) have also begun to investigate the relationship between cognitive style and instructional preference in an attempt to develop predictions to assist the instructional design and delivery of learning programs to meet the needs of different groups of learners, or individuals. In particular, Sadler-Smith and Riding (1999) have identified a statistically significant preference among Wholists for non-print-based media of instruction, for collaborative learning methods, and for more informal types of assessment. These findings are consistent with Riding’s (1991) suggestion that Wholists are sociable and socially dependent, and have led to the suggestion (Sadler-Smith & Riding, 1999), along with Smith (2000), that self-direction-dependence represents a continuum of diversity among learners. Riding and Sadler-Smith (1997) have also suggested that Wholists process information simultaneously, while Analytics break it down into parts and process it sequentially, such that learner comfort with resources and their delivery is related to this dimension. This notion of the comfort of different learners with different resource-based learning models has also been identified and discussed by Riding and Rayner (1998).

These two dimensions, of learner skill, confidence and comfort with the learning resources and delivery mode, and the need for self-management of learning, are further identified by Smith (1999). Smith’s paper, though, focuses specifically on online learning, such that she recasts this first dimension in much the same way that Warner et al. (1998) have described the characteristics they identified for readiness for online learning.

McVay’s Readiness for Online Learning Questionnaire

McVay (2000, 2001) has developed and evaluated a student orientation course towards online learning in a bachelor’s degree within a university in the United States. Specifically, the course was designed to “incorporate elements that would provide the student with a robust experience of becoming familiar with the technology, the communication tools, and the learning process itself” (McVay, 2000, p. 39).

As part of the evaluation of the efficacy of the orientation program, McVay developed a readiness questionnaire that she administered pre- and post-orientation course, and which she later published in McVay (2001). The questionnaire comprises 13 items, rated by respondents on a 4-point Likert scale. The items in the questionnaire are shown in Table 1 in the Results section of this paper, which deals with the results of our analysis of the questionnaire.

What particularly interested us about this short test was that the items relate strongly to the characteristics of readiness for flexible learning identified by Smith (2001a), and described within his two-factor space of learner preferences for resource-based learning (Smith, 2000), and identifiable within the other two-dimensional models reviewed earlier in this paper. Specifically, the items relate to the issues identified by Smith as the use of previous learning and experience, the setting of goals for learning, the evaluation and monitoring of learning, and the selection of learning strategies and learning resources. Accordingly, the face validity of the questionnaire looked promising. The characteristics of intrinsic motivation and the adoption of a problem-based learning approach are not, though, clearly evident in the McVay questionnaire.

Also interesting in the McVay questionnaire was that a close, if surface-level, examination of the instrument suggested that factor analysis might identify factors similar to those identified by Smith (2000) and Sadler-Smith and Riding (1999): comfort with the learning resources available in a sequence of learning, and degree of self-direction.

The McVay instrument appeared to us to have considerable potential congruence with more broadly based previous work on the readiness of learners for resource-based learning. Establishment of that congruence strengthens the place of the instrument for research on online learning within the broader context of resource-based learning and flexible delivery. At the same time the instrument provided for a very clear focus on readiness for online learning as a particular form of resource-based learning, and may prove to be a useful instrument for research on online learning readiness among varied learner groups, as well as being a useful diagnostic tool for readiness.

Our research here is designed to test the instrument to identify whether or not data resulting from it can be effectively factor analysed, and whether a factor analysis provides results that are congruent with, and interpretable within, the broader research literature on resource-based flexible learning. Additionally, it is our intention to test the reliability of the questionnaire, as well as the reliability of each of the items that comprise it. Our view here is that if reliability can be established, and a factor solution identified that can be interpreted in a framework of existing theory and research, then the questionnaire shows promise as a tool for research and practice. If those features of the questionnaire can be established, there will be sufficient grounds to go ahead and test predictive validity as a further piece of research. Positive findings from such a study will also provide a justification for broader research, with larger samples, to further test reliability, factorability, and predictive validity.

Method

To test the McVay instrument with a broad selection of students, we sought participants from a range of undergraduate programs in the United States (Mahoney, 2002) and in Australia. That sample selection provided for participants with a diversity of educational contexts, and a diversity of courses. We also ensured that the sample had good representation from both genders, with 47 males and 60 females participating.

Students who participated in the research were drawn from the following range of undergraduate courses:

• Business;
• Education;
• Engineering;
• Nursing;
• Arts/Journalism/English;
• Veterinary/Science;
• Horticulture;
• Information Technology;
• Law;
• Social Science/Social Work/Psychology.

The age range of participating students was 18–24 years (with one 41-year-old). Forty-five of the students were from the United States and sixty-two were from Australia.

We chose to focus only on undergraduate students to control for the possibility that undergraduate and postgraduate students may behave differently in terms of their readiness for online learning. Macdonald et al. (2001) have shown considerable differences between undergraduate and postgraduate students, and we see that as a fruitful area for future research, should the McVay instrument prove a promising one for this form of investigation. Although we wanted a sample that was not just from one context, we also wanted to control for culture, at least to the extent that all participants were native English speakers, and came from similar cultures in the United States and Australia.

Students were provided with the short questionnaire in class, together with a short explanation of the nature and purpose of the research. They were then free to complete the questionnaire at a time of their choosing, and return it to the researchers. All testing was anonymous.


Results

Reliability Analysis

The reliability of the questionnaire is satisfactory, with a Cronbach alpha of 0.83. Both Coakes and Steed (1997) and Pallant (2001) suggest that alpha values above 0.7 are sufficient for reliability to be assumed. Item 9 showed a Corrected Item–Total correlation of less than 0.3, achieving a score of 0.25. That score indicates that the item may be testing something different from the rest of the questionnaire (Pallant, 2001, p. 87). All other items achieved Corrected Item–Total correlations of greater than 0.3, ranging from 0.33 to 0.59.
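Both statistics reported above can be computed from first principles. The sketch below, in Python with NumPy, computes Cronbach’s alpha and the corrected item–total correlations for an (n respondents × k items) response matrix; the Likert data it generates are synthetic, for illustration only, and are not the study’s responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Each item's correlation with the sum of the *other* items."""
    k = items.shape[1]
    r = np.empty(k)
    for j in range(k):
        rest = np.delete(items, j, axis=1).sum(axis=1)
        r[j] = np.corrcoef(items[:, j], rest)[0, 1]
    return r

# Hypothetical 4-point Likert data: 107 respondents, 13 items sharing one latent trait.
rng = np.random.default_rng(0)
latent = rng.normal(size=(107, 1))
responses = np.clip(np.round(2.5 + latent + rng.normal(scale=0.8, size=(107, 13))), 1, 4)

print("alpha =", round(cronbach_alpha(responses), 2))
print("item-total correlations:", corrected_item_total(responses).round(2))
```

The thresholds applied in the paper correspond to requiring `cronbach_alpha` above 0.7 and every `corrected_item_total` value above 0.3.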

Establishing the Factorability of the Matrix

As exploratory research on the use of McVay’s instrument for research purposes, it was important that we established the factorability of the data yielded by the survey. Coakes and Steed (1997) suggest a minimum of five subjects per variable, which has been achieved in this study. As also suggested by Coakes and Steed, we examined the factorability through an inspection of the correlation matrix, and through conducting the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy and Bartlett’s test of sphericity.

A sizeable number (25) of the correlations in the matrix were larger than 0.3. The KMO test yielded a measure of 0.815, and Coakes and Steed recommend that this measure should exceed 0.6 to proceed with factoring. Bartlett’s test of sphericity was significant well beyond the 0.001 level.
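As a sketch of how these two checks can be computed (again in Python with NumPy, on illustrative data rather than the study’s responses): the KMO statistic compares squared correlations with squared partial correlations derived from the inverse of the correlation matrix, and Bartlett’s statistic is a chi-square test of whether the correlation matrix differs from an identity matrix.

```python
import numpy as np

def kmo(data: np.ndarray) -> float:
    """Kaiser-Meyer-Olkin measure of sampling adequacy (0-1; > 0.6 to proceed)."""
    r = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(r)
    # Anti-image (partial) correlations from the inverse correlation matrix.
    scale = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / scale
    off = ~np.eye(r.shape[0], dtype=bool)
    r2, p2 = (r[off] ** 2).sum(), (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

def bartlett_sphericity(data: np.ndarray) -> tuple[float, int]:
    """Bartlett's test statistic and degrees of freedom for H0: R = identity."""
    n, p = data.shape
    r = np.corrcoef(data, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(r))
    return chi2, p * (p - 1) // 2

# Illustrative correlated data (hypothetical, not the study's responses).
rng = np.random.default_rng(1)
latent = rng.normal(size=(107, 2))
data = latent @ rng.normal(size=(2, 13)) + rng.normal(scale=0.7, size=(107, 13))

print("KMO =", round(kmo(data), 3))
chi2, df = bartlett_sphericity(data)
print("Bartlett chi2 =", round(chi2, 1), "df =", df)
```

The significance level quoted in the paper would come from referring the statistic to a chi-square distribution with the stated degrees of freedom.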

Accordingly, having established the factorability of the matrix, we proceeded with the factor analysis.

Factor Analysis

As an exploratory study we conducted several analyses to establish the one that yielded the most satisfactory results for interpretation. Each was a principal components analysis with varimax rotation. We adopted a factor loading criterion of 0.40 for inclusion of the item in the interpretation, more stringent than Tabachnik and Fidell (1996), who suggest 0.32, and consistent with Comrey and Lee (1992), who suggest that the criterion should be set a little higher than 0.32. Additionally, where an item loaded on more than one factor, we have followed the advice of Arrindell et al. (1983) and have included the item in the factor on which it scored highest, provided the difference between the two factor loadings was at least 0.2.
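To make the procedure concrete, the following Python/NumPy sketch runs a principal components analysis on a correlation matrix, retains components with eigenvalues greater than one, and applies a varimax rotation. The input is illustrative two-factor data, and the rotation is a standard textbook implementation, not the exact routine of any particular statistics package.

```python
import numpy as np

def pca_loadings(data: np.ndarray):
    """Principal-component loadings, retaining components with eigenvalue > 1."""
    r = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(r)
    order = np.argsort(eigvals)[::-1]        # sort eigenvalues in descending order
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > 1.0                     # Kaiser (eigenvalue > 1) criterion
    return eigvecs[:, keep] * np.sqrt(eigvals[keep]), eigvals

def varimax(loadings: np.ndarray, n_iter: int = 100, tol: float = 1e-8) -> np.ndarray:
    """Orthogonal varimax rotation (un-normalised Kaiser form, for brevity)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    obj = 0.0
    for _ in range(n_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(0)) / p))
        rotation = u @ vt
        new_obj = s.sum()
        if new_obj - obj < tol:              # objective is non-decreasing
            break
        obj = new_obj
    return loadings @ rotation

# Illustrative data with a planted two-factor structure (not the study's responses).
rng = np.random.default_rng(2)
f = rng.normal(size=(107, 2))
structure = np.vstack([np.repeat([[0.8, 0.1]], 7, axis=0),
                       np.repeat([[0.1, 0.8]], 6, axis=0)])
data = f @ structure.T + rng.normal(scale=0.5, size=(107, 13))

loadings, eigvals = pca_loadings(data)
rotated = varimax(loadings)
print("components retained:", loadings.shape[1])
```

Because varimax is an orthogonal rotation, each item’s communality (the row-wise sum of squared loadings) is unchanged; only the distribution of loading across factors becomes simpler to interpret.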

The first analysis, after varimax rotation, indicated three factors with eigenvalues greater than one. That solution accounted for 57.5% of the variance; however, applying Cattell’s (1966) scree test indicated that a two-factor solution was the most satisfactory. We settled on the two-factor solution, which provided a clear interpretation, and accounted for 48.5% of the variance. That solution is shown in Table 1.

TABLE 1. Questionnaire items and factor loadings

Questionnaire item                                                        Factor 1  Factor 2

Eigenvalue                                                                    4.37      1.97
% of variance                                                                33.35     15.16

 1. I am able to easily access the Internet as needed for my studies           .75       .03
 2. I am comfortable communicating electronically                              .87      -.04
 3. I am willing to actively communicate with my classmates and
    instructors electronically                                                 .79       .03
 4. I am willing to dedicate 8 to 10 hours per week for my studies             .29       .26
 5. I feel that online learning is of at least equal quality to
    traditional classroom learning                                             .71       .20
 6. I feel that my background and experience will be beneficial to
    my studies                                                                 .47       .37
 7. I am comfortable with written communication                                .48       .42
 8. When it comes to learning and studying, I am a self-directed person        .33       .70
 9. I believe looking back on what I have learned in a course will
    help me to remember it better                                             -.08       .59
10. In my studies, I am self-disciplined and find it easy to set aside
    reading and homework time                                                  .15       .77
11. I am able to manage my study time effectively and easily complete
    assignments on time                                                        .06       .79
12. As a student, I enjoy working independently                                .41       .35
13. In my studies, I set goals and have a high degree of initiative            .13       .64

Factor Interpretation

Items 1, 2, 3, 5, 6, 7 and 12 loaded highly on Factor 1, and we have interpreted that factor as “Comfort with e-learning.” However, for items 6, 7 and 12 the loading on Factor 1 was not at least 0.2 greater than the loading on Factor 2. Items 8, 9, 10, 11 and 13 loaded highly and distinctively on Factor 2, which we have interpreted as “Self-management of learning.” The factor may possibly also be interpreted as “Self-directed learning.”

In our discussion below we have adopted the name “Comfort with e-learning” for Factor 1, and “Self-management of learning” for Factor 2.

Discussion

The research undertaken in this study has been relatively small scale, and designed to provide an early assessment of the value of the McVay Readiness for Online Learning questionnaire as an instrument for further research. Early indications from this exploration of the McVay Readiness for Online Learning questionnaire are that it is a promising instrument for research and for practice. We suggest, however, that further work needs to be done with larger and more varied samples to further investigate the value of the tool, and its range of applications. It is our intention to undertake some of that research now that we are satisfied that the instrument has promise in terms of its reliability and its factorability.

The instrument has yielded a factor structure that is readily interpretable, and strongly resonant with other research findings from the broader flexible learning literature. The “Comfort with e-learning” factor is recognizable as an e-learning-focused dimension not unlike the Verbal–Nonverbal dimension identified by Smith (2000) in his work with a broader set of resource-based flexible learning materials. The McVay questionnaire describes a readiness for engagement with the particular form of resource-based learning delivery that is online. That dimension is also recognizable in the Sadler-Smith and Riding (1999) research that identified an association between cognitive style and learner comfort with different forms of resource-based learning materials.

The identification of a “self-management of learning” factor in the McVay instrument is clearly evident also in the broader research by Smith (2000), and by Sadler-Smith and Riding (1999). Indeed, the need for self-direction, or self-management of learning, runs clearly throughout the distance education and resource-based flexible learning literature, with Evans (2000) commenting that self-direction is a necessary prerequisite for effective resource-based learning in distance education and flexible delivery, while similar observations have been made by Calder (2000) and Warner et al. (1998).

The identification through the Readiness questionnaire of these two factors, congruent with broader resource-based distance education and flexible delivery research, indicates that the instrument can be effectively used for further research. Through the extraction of factor scores for individuals, using a regression technique, the instrument lends itself to research that compares different groups of learners, or compares the same learners at different points of development.
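A common regression technique for obtaining such individual factor scores is Thomson’s method, which weights the standardized item scores by the inverse correlation matrix times the loading matrix. A minimal sketch, assuming an (n × k) response matrix and a (k × 2) loading matrix; the data and loadings below are hypothetical placeholders:

```python
import numpy as np

def regression_factor_scores(data: np.ndarray, loadings: np.ndarray) -> np.ndarray:
    """Thomson (regression) factor-score estimates: Z @ R^-1 @ loadings."""
    z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)  # standardize items
    r = np.corrcoef(data, rowvar=False)
    return z @ np.linalg.solve(r, loadings)

# Hypothetical example: 107 respondents, 13 items, a 13 x 2 loading matrix.
rng = np.random.default_rng(3)
data = rng.normal(size=(107, 13))
loadings = rng.normal(scale=0.3, size=(13, 2))
scores = regression_factor_scores(data, loadings)
print(scores.shape)
```

Each respondent then receives one score per factor, which is what allows the group comparisons and repeated-measures designs described above.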

While it is suggested here that the McVay instrument could be profitably used in its current form for research and practice in online learning readiness, there are some issues that further work on the instrument could valuably focus upon and seek to improve.

First, the two-factor solution accounted for 48.5% of the variance which, while respectable enough, could be improved upon to increase the value of the instrument. There is still 51.5% of the variance left unexplained. However, given that the two factors identified have considerable support in the broader literature, it is sensible to suggest that further work on the questionnaire may be dedicated towards developing the measures of these two factors in such a way as to increase the amount of variance explained by them.

Five questions (4, 6, 7, 9 and 12) require further work to yield a better contribution to the reliability of the instrument, or to measurement of the two identified factors.

Question 4, which asks students to rate their willingness “to dedicate 8 to 10 hours per week for my studies,” has not loaded significantly on either factor, although the nature of the question would suggest it should be more related to the self-management dimension than to the comfort with e-learning factor. It may be that the problem with this question is that it is too contextualized within a particular institution’s expectations (in this case, perhaps the one in which McVay conducted her research). It may be better to broaden the question to ask about willingness to set aside an amount of time each week to effectively engage with study, without suggesting any particular time. That may move the question out of a particular context, and may also move it away from the sense of commitment that comes with the question as it stands. Broadening the question in this way may set it more in a context of study management rather than one of a particular commitment.

Question 6, which asks respondents to rate how well they think their background and experience will be beneficial to their studies, has loadings on both factors that are only 0.1 apart, with the higher loading of 0.47 on the comfort with e-learning factor. Although that result indicates that the item does measure that factor, it is also measuring the other factor, indicating that some work is required on the item to yield a distinctive loading. It is noteworthy here that the use of prior knowledge to assist with new learning was identified by Smith (2001a) as one of the learner characteristics for effective engagement with flexible learning. Although the current study provides evidence for the importance of that characteristic, it is unclear in the McVay instrument whether it is associated with e-learning comfort or with self-management.

Question 7, which asks students about their degree of comfort with written communication, loads on the comfort with e-learning factor, indicating that written communication forms part of e-learning comfort. However, the item also loaded on the self-management of learning factor, although at a lower factor loading. This result makes sense within our theoretical framework, since ability to work with textually based information also forms part of self-directed learning in a resource-based environment. There would be value in adjusting this item to load more clearly on one factor or the other, with preference for loading on the comfort with e-learning factor.

Question 9, which asks the degree to which looking back on previous learning will help with remembering it, displayed a lower than acceptable contribution to the reliability of the instrument, and may be measuring something else altogether, probably to do with revision as a learning strategy. The reliability analysis showed that the item missed being acceptable by a considerable margin, such that it is probably advisable in the first instance to delete the question and replace it with a new one that contributes to reliability and that loads on one of the identified factors.

Question 12 is an interesting result in that it loads on both factors, with very similar factor loadings. The question asks respondents to rate how much they “enjoy working independently.” It is quite possible that some respondents interpreted the word “independently” as meaning working by themselves without other support or interaction, while others may have interpreted “independently” to mean working in a self-directed way, including with others through electronic communication. That distinction in the uses of the word “independent” has been made by writers such as Brookfield (1986), Morgan (1993) and Paul (1990). If those different interpretations were made by respondents, it is easy to see why the question may have yielded a split loading on both identified factors. Had the word “independently” been replaced by “in a self-directed way,” it may have loaded on the second identified factor.

Finally, although the questionnaire items have face validity sufficient for the current researchers to see the possibilities of a readily interpretable factor analysis, work has yet to be conducted on predictive validity. The results of McVay’s own research in using this questionnaire provide some early evidence for validity, since she was able to establish a relationship between student responses and the features of her orientation program. Further research is necessary to establish more formal validity indicators, but we are encouraged by the results of the present investigation to suggest that the investment in that research to establish predictive validity with larger samples would be worthwhile.

At only 13 questions the instrument typically took respondents about 5–10 minutes to complete, which represents a good feature of its utility. However, there is scope here to increase the length of the questionnaire to further enhance its reliability and validity. Further research into the identification, construction and trialling of additional items that would load on the two identified factors would be valuable.

Conclusion

We suggest that the McVay questionnaire provides a useful tool for research and practice in the area of readiness for online learning, and can be usefully employed in its current form. We suggest, though, that the instrument is sufficiently promising to render worthwhile extra work on refining the measurement of the two factors to increase the amount of variance explained. Additionally, further work on some specific questions would be valuable to associate them more closely with the identified factors, as would the construction of some new questions to enhance reliability and possibly validity. Further work on establishing predictive validity would also be worth undertaking.

Leaving aside the characteristics of the instrument itself, this research has identified a useful factor structure and potential theoretical framework for further research in this area of readiness for online learning. However, the study reported here is quite small in scale, so further research on the instrument, and on other factor structures that might be identified, is suggested as a fruitful activity.

REFERENCES

Arrindell, W. A., Emmelkamp, P. M. G., Brilman, E., & Monsma, A. (1983). Psychometric evaluation of an inventory for assessment of parental rearing practices. Acta Psychiatrica Scandinavica, 67, 163–177.

Brookfield, S. (1986). Understanding and facilitating adult learning. San Francisco: Jossey-Bass.

Calder, J. (2000). Beauty lies in the eye of the beholder. International Review of Research in Open and Distance Learning, 1(1). Retrieved May 16, 2002, from http://www.irrodl.org/content/v1.1/judith.pdf

Canfield, A. A. (1980). Learning styles inventory manual. Ann Arbor: Humanics Media.

Cattell, R. B. (1966). The scree test for the number of factors. Multivariate Behavioral Research, 1, 245–276.

Coakes, S. J., & Steed, L. G. (1997). SPSS analysis without anguish. Brisbane: John Wiley.

Comrey, A. L., & Lee, H. B. (1992). A first course in factor analysis (2nd ed.). Hillsdale, NJ: Erlbaum.

Evans, T. (2000). Flexible delivery and flexible learning: Developing flexible learners? In V. Jakupec & J. Garrick (Eds.), Flexible learning, human resource and organisational development (pp. 211–224). London: Routledge.

Macdonald, J., Heap, N., & Mason, R. (2001). “Have I learnt it?” Evaluating skills for resource-based study using electronic resources. British Journal of Educational Technology, 32(4), 419–433.

Mahoney, S. E. (2002). Mindset change: Influences on student buy-in to online classes. Unpublished doctoral dissertation, Texas A&M University, College Station.

McVay, M. (2000). Developing a Web-based distance student orientation to enhance student success in an online bachelor’s degree completion program. Unpublished practicum report presented to the Ed.D. Program, Nova Southeastern University, Florida.

McVay, M. (2001). How to be a successful distance learning student: Learning on the Internet. New York: Prentice Hall.

Morgan, A. R. (1993). Improving your students’ learning: Reflections on the experience of study. London: Kogan Page.

Pallant, J. (2001). SPSS survival manual. Crows Nest, NSW: Allen & Unwin.

Paul, R. (1990). Open learning and open management. London: Kogan Page.

Riding, R. J. (1991). Cognitive styles analysis. Birmingham: Learning and Training Technology.

Riding, R. J., & Cheema, I. (1991). Cognitive styles: An overview and integration. Educational Psychology, 11, 193–215.

Riding, R. J., & Rayner, S. (1998). Cognitive styles and learning strategies: Understanding style and differences in learning and behaviour. London: Fulton.

Riding, R. J., & Sadler-Smith, E. (1992). Type of instructional material, cognitive style and learning performance. Educational Studies, 18(3), 323–339.

Riding, R. J., & Sadler-Smith, E. (1997). Cognitive style and learning strategies: Some implications for training design. International Journal of Training and Development, 1, 199–208.

Sadler-Smith, E., & Riding, R. (1999). Cognitive style and instructional preferences. Instructional Science, 27, 355–371.

Smith, E. (1999, December). Learning to learn online. Paper presented at the annual meeting of the Australasian Society for Computers in Learning in Tertiary Education, Brisbane, Australia. Retrieved May 16, 2002, from http://www.ascilite.org.au/conferences/brisbane99/papers/smith.pdf

Smith, P. J. (2000). Preparedness for flexible delivery among vocational learners. Distance Education, 21(1), 29–48.

Smith, P. J. (2001a). Learners and their workplaces: Towards a strategic model of flexible delivery of training in the workplace. Journal of Vocational Education and Training, 53(4), 609–628.

Smith, P. J. (2001b). Learning preferences of TAFE and university students. Australian and New Zealand Journal of Vocational Education Research, 9(2), 87–109.

Tabachnik, B. S., & Fidell, L. S. (1996). Using multivariate statistics. New York: HarperCollins.

Warner, D., Christie, G., & Choy, S. (1998). The readiness of the VET sector for flexible delivery including on-line learning. Brisbane: Australian National Training Authority.

Witkin, H. A., Moore, C. A., Goodenough, D. R., & Cox, P. W. (1977). Field-dependent and field-independent cognitive styles and their educational implications. Review of Educational Research, 47, 1–64.


Correspondence. Peter Smith, Senior Lecturer, Faculty of Education, Deakin University, Waterfront Campus, Geelong, Victoria 3217, Australia. E-mail: [email protected]

Peter Smith is Senior Lecturer in the Faculty of Education at Deakin University, Australia.

Karen Murphy is Associate Professor in the College of Education at Texas A&M University, USA.

Sue Mahoney is Visiting Assistant Professor in the Department of Urban Education at the University of Houston-Downtown, USA.
