Research Article

Development of the Meta-Cognitive Awareness Scale (MAS-EVA) for In-Class Teaching and Assessment of Students and Determination of Its Psychometric Properties

Year 2022, Volume: 11, Issue: 1, 60-74, 28.03.2022
https://doi.org/10.30703/cije.899403

Abstract

The purpose of this study was to develop a scale capable of describing teachers' metacognitive awareness of the real-time monitoring and evaluation of their own in-class teaching and of learners' cognitive outcomes. The study was carried out through a survey design and is also a scale development study. A total of 356 teachers from different branches and seniority levels, working in institutions affiliated with the Ministry of National Education (MNE) during the 2019-2020 academic year, participated. In this five-step study, the existing literature was first reviewed and a conceptual and thematic framework was established to allow the evaluation of the learning-teaching process and metacognitive awareness to be integrated in a pedagogy-laden way. An item pool of 45 items consistent with this theoretical framework was then created and submitted for expert opinion. Expert opinions were evaluated using the Lawshe technique. Following the expert review, the final 36-item form of the scale was administered to the participant group and its psychometric properties were explored. To this end, exploratory and confirmatory factor analyses were performed on the data and the Cronbach's alpha internal consistency coefficient was calculated. The analyses showed that the final form of the MAS-EVA has a three-factor structure and consists of 34 items. The sub-factors of the scale were named as follows: monitoring and evaluating in-class instructional activities and their effectiveness, evaluating the procedures followed for the summative assessment of learner outcomes, and metacognitive colleague interaction and exchange of ideas. With a Cronbach's alpha internal consistency coefficient of 0.96, the MAS-EVA was also found to be a reliable scale. The results regarding the psychometric properties of the scale are discussed on the basis of the existing literature and relevant recommendations are presented.
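As a point of reference for the two indices named in the abstract, the sketch below illustrates how Lawshe's content validity ratio (CVR) and Cronbach's alpha internal consistency coefficient are typically computed. It is a minimal illustration using hypothetical expert ratings and item responses, not the authors' data or analysis code.

```python
import numpy as np

def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of experts
    rating an item 'essential' and N is the total number of experts."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix:
    alpha = k / (k - 1) * (1 - sum of item variances / variance of total scores)."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Hypothetical panel: 10 experts, 7 of whom rate an item "essential".
print(round(content_validity_ratio(7, 10), 2))  # 0.40

# Hypothetical 5-point Likert data: 20 respondents x 4 items.
rng = np.random.default_rng(42)
base = rng.integers(1, 6, size=(20, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(20, 4)), 1, 5).astype(float)
print(round(cronbach_alpha(scores), 2))
```

Critical CVR values depend on panel size; Ayre and Scally (2014), cited in the reference list below, tabulate the cutoffs used to decide which items to retain.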

References

  • Allen, M. J., and Yen, W. M. (2002). Introduction to measurement theory (2nd ed.). Waveland Press.
  • Ayre, C., and Scally, A. J. (2014). Critical values for Lawshe’s content validity ratio: revisiting the original methods of calculation. Measurement and Evaluation in Counseling and Development, 47(1), 79-86.
  • Berliner, D. C. (2001). Learning about and learning from expert teachers. International Journal of Educational Research, 35(5), 463–482.
  • Bransford, J. D., Brown, A. L., and Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school. National Academy Press.
  • Bonett, D. G., and Wright, T. A. (2015). Cronbach’s alpha reliability: Interval estimation, hypothesis testing, and sample size planning. Journal of organizational behavior, 36(1), 3-15.
  • Brown, T. A. (2015). Confirmatory factor analysis for applied research. Guilford Publications.
  • Büyüköztürk, Ş., Çakmak, E. K., Akgün, Ö. E., Karadeniz, Ş., ve Demirel, F. (2017). Bilimsel araştırma yöntemleri. Pegem Atıf İndeksi, 1-360.
  • Büyüköztürk, Ş. (2018). Veri analizi el kitabı. Pegem Akademi.
  • Carpenter, S. (2018). Ten steps in scale development and reporting: A guide for researchers. Communication Methods and Measures, 12(1), 25-44.
  • Child, D. (2006). The essentials of factor analysis (3rd ed.). Continuum.
  • Cochran-Smith, M. (2005). Teacher educators as researchers: multiple perspectives. Teaching and Teacher Education, 21(2), 219–225.
  • Cochran-Smith, M. (2006). Policy, practice, and politics in teacher education. Corwin Press.
  • Çeliker, G. (2016). Öğretmen adayları için sınıf-ı̇çi değerlendirme öz-yeterlik algısı ölçeği: Geçerlik ve güvenirlik çalışması. Eğitim ve İnsani Bilimler Dergisi: Teori ve Uygulama, 7(14), 3-18.
  • Crawford, A. V., Green, S. B., Levy, R., Lo, W. J., Scott, L., Svetina, D., and Thompson, M. S. (2010). Evaluation of parallel analysis methods for determining the number of factors. Educational and Psychological Measurement, 70(6), 885-901.
  • Çokluk, Ö., Şekercioğlu, G. ve Büyüköztürk, Ş. (2018). Sosyal bilimler için çok değişkenli istatistik: SPSS ve LISREL uygulamaları. Pegem Akademi.
  • DeVellis, R. F. (2016). Scale development: Theory and applications. Sage.
  • Doğan, N., Soysal, S., ve Karaman, H. (2017). Aynı örnekleme açımlayıcı ve doğrulayıcı faktör analizi uygulanabilir mi?. Pegem Atıf İndeksi, 373-400.
  • Efklides, A. (2001). Metacognitive experiences in problem solving: Metacognition, motivation and self-regulation. In A. Efklides, J. Kuhl, and M. Sorrentino (Eds.), Trends and prospects in motivation research (pp. 297–323). Kluwer.
  • Efklides, A. (2006). Metacognition and affect: What can metacognitive experiences tell us about the learning process? Educational Research Review, 1, 3–14.
  • Efklides, A. (2008). Metacognition: Defining its facets and levels of functioning in relation to self-regulation and co-regulation. European Psychologist, 13, 277–287.
  • Erford, B. T., Lowe, S., and Chang, C. Y. (2011). Technical analysis of teacher responses to the self-evaluation scale-teacher (SES-T) version. Measurement and Evaluation in Counseling and Development, 44(3), 151-158.
  • Field, A. (2013). Discovering Statistics Using SPSS: Introducing Statistical Method (4th ed.). Sage.
  • Flavell, J. H. (1979). Metacognition and cognitive monitoring. American Psychologist, 34, 906–911.
  • Flavell, J. H. (1987). Speculations about the nature and development of metacognition. In F. E. Weinert and R. H. Kluwe (Eds.), Metacognition, motivation and understanding (pp. 21–29). Lawrence Erlbaum Associates.
  • Flavell, J. H., Miller, P. H., and Miller, S. A. (2002). Cognitive development (4th ed.). Prentice Hall.
  • Fraenkel, J. R., and Wallen, N. (2006). How to design and evaluate research in education. McGraw-Hill.
  • Fullan, M. G. (1993). Why teachers must become change agents. Educational Leadership, 50, 12-22.
  • Guskey, T. R. (2002). Professional development and teacher change. Teachers and Teaching: Theory and Practice, 8(3), 381–391.
  • Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., and Tatham, R. L. (2010). SEM: An introduction. Multivariate data analysis: A global perspective, 5(6), 629-686.
  • Hacker, D. J., Dunlosky, J., and Graesser, A. C. (Eds.). (2009). Handbook of Metacognition in Education. Routledge.
  • Hammerness, K., Darling-Hammond, L., Bransford, J. D., Berliner, D., Cochran-Smith, M., McDonald, M., . . . Zeichner, K. (2005). How teachers learn and develop. In L. Darling-Hammond and J. D. Bransford (Eds.), Preparing teachers for a changing world: What teachers should learn and be able to do (pp. 358–389). Jossey-Bass.
  • Hatano, G., and Inagaki, K. (1986). Two courses of expertise. In H. Stevenson, K. Azama, and K. Hakuta (Eds.), Child development and education in Japan (pp. 262–272). Freeman.
  • Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning. Routledge.
  • Hooper, D., Coughlan, J., and Mullen, M. (2008, June). Evaluating model fit: a synthesis of the structural equation modelling literature. In 7th European Conference on research methodology for business and management studies (pp. 195-200).
  • Jacobs, V. R., Franke, M. L., Carpenter, T. P., Levi, L., and Battey, D. (2007). Professional development focused on children’s algebraic reasoning in elementary school. Journal for Research in Mathematics Education, 38(3), 258–288.
  • Jacobs, V. R., Lamb, L. L. C., and Philipp, R. A. (2010). Professional noticing of children’s mathematical thinking. Journal for Research in Mathematics Education, 41(2), 169–202.
  • Jacobs, V. R., and Spangler, D. A. (2018). Research on core practices in K–12 mathematics teaching. In J. Cai (Ed.), Compendium for research in mathematics education (pp. 766–792). National Council of Teachers of Mathematics.
  • Jiang, Y., Ma, L., and Gao, L. (2016). Assessing teachers’ metacognition in teaching: The teacher metacognition inventory. Teaching and Teacher Education, 59, 403-413.
  • Kline, R. B. (2013). Exploratory and confirmatory factor analysis. In Y. Petscher, C. Schatschneider, and D. L. Compton (Eds.), Applied quantitative analysis education and the social sciences (pp. 171–207). Routledge.
  • Kline, R. B. (2015). Principles and practice of structural equation modeling. Guilford publications.
  • Kuhn, D. (1999). Metacognitive development. In L. Balter and C. S. Tamis-LeMonda (Eds.), Child psychology: A handbook of contemporary issues (pp. 259–286). Psychology Press.
  • Kuhn, D. (2000). Metacognitive development. Psychological Science, 9, 178–181.
  • Ledesma, R. D., Valero-Mora, P., and Macbeth, G. (2015). The scree test and the number of factors: a dynamic graphics approach. The Spanish Journal of Psychology, 18, 1-10.
  • Lingel, K., Lenhart, J., and Schneider, W. (2019). Metacognition in mathematics: do different metacognitive monitoring measures make a difference? ZDM, 51(4), 587-600.
  • Martinez, M. E. (2006). What is metacognition? Phi Delta Kappan, 87(9), 696-699.
  • Mason, J. (2002). Researching your own practice: The discipline of noticing. Routledge.
  • Meijer, J., Veenman, M. V. J., and van Hout-Wolters, B. H. A. M. (2006). Metacognitive activities in text-studying and problem-solving: Development of a taxonomy. Educational Research and Evaluation, 12, 209–237.
  • Morrison, J. T. (2009). Evaluating factor analysis decisions for scale design in communication research. Communication Methods and Measures, 3(4), 195–215.
  • Mundfrom, D. J., Shaw, D. G., and Ke, T. L. (2005). Minimum sample size recommendations for conducting factor analyses. International Journal of Testing, 5(2), 159-168.
  • Myhill, D., and Warren, P. (2005). Scaffolds or straitjackets? Critical moments in classroom discourse. Educational Review, 57(1), 55–69.
  • Noar, S. M. (2009). The role of structural equation modeling in scale development. Structural Equation Modeling: A Multidisciplinary Journal, 10(4), 622-647.
  • Norris, M., and Lecavalier, L. (2010). Evaluating the use of exploratory factor analysis in developmental disability psychological research. Journal of Autism Development and Disorders, 40, 8–20.
  • Osborne, J. W. (2014). Best practices in exploratory factor analysis. Create Space Independent Publishing.
  • Pallant, J. (2017). SPSS survival manual: a step by step guide to data analysis using SPSS for windows (version 15). Open University Press.
  • Parsons, S. A., Vaughn, M., Scales, R. Q., Gallagher, M. A., Parsons, A. W., Davis, S. G., Pierczynski, M., and Allen, M. (2018). Teachers’ instructional adaptations: A research synthesis. Review of Educational Research, 88(2), 205–242.
  • Piaget, J. (1976). The grasp of consciousness. Harvard University Press.
  • Pintrich, P. R., Wolters, C., and Baxter, G. (2000). Assessing metacognition and self-regulated learning. In G. Schraw and J. Impara (Eds.), Issues in the measurement of metacognition (pp. 43–97). Buros Institute of Mental Measurement.
  • Polya, G. (1957). How to solve it: A new aspect of mathematical method (2nd ed.). Princeton University Press.
  • Robertson, A. D., Scherr, R., and Hammer, D. (2015). Responsive teaching in science and mathematics. Routledge.
  • Reynolds, C. R., Livingston, R. B., and Willson, V. (2009). Measurement and assessment in education. Merrill.
  • Rhodes, M. G. (2019). Metacognition. Teaching of Psychology, 46(2), 168-175.
  • Rubinstein, R. Y., and Kroese, D. P. (2016). Simulation and the Monte Carlo method. John Wiley and Sons.
  • Samuels, P. (2016). Advice on exploratory factor analysis. Birmingham City University Press. Doi: 10.13140/RG.2.1.5013.9766.
  • Sawyer, R. K. (2004). Creative teaching: Collaborative discussion as disciplined improvisation. Educational Researcher, 33(2), 12–20.
  • Sherin, M. G. (2001). Developing a professional vision of classroom events. In T. Wood, B. S. Nelson, and J. E. Warfield (Eds.), Beyond classical pedagogy: Teaching elementary school mathematics (pp. 75–93). Lawrence Erlbaum.
  • Sherin, M. G. (2007). The development of teachers’ professional vision in video clubs. In R. Goldman, R. Pea, B. Barron, and S. J. Derry (Eds.), Video research in the learning sciences (pp. 383–395). Lawrence Erlbaum.
  • Sherin, M. G., Jacobs, V. R., and Philipp, R. A. (2011). Mathematics teacher noticing: Seeing through teachers’ eyes. Routledge.
  • Sherin, M. G. (2017). Exploring the boundaries of teacher noticing: Commentary. In E. O. Schack, M. H. Fisher, and J. A. Wilhelm (Eds.), Teacher noticing: Bridging and broadening perspectives, contexts and frameworks (pp. 401–408). Springer.
  • Schon, D. (1983). The reflective practitioner: how professionals think in action. Basic Books.
  • Schon, D. A. (1987). Educating the reflective practitioner: toward a new design for teaching and learning in the professions. Jossey-Bass.
  • Schraw, G., and Moshman, D. (1995). Metacognitive theories. Educational Psychology Review, 7, 351–371.
  • Schraw, G., and Dennison, R. S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19(4), 460-475.
  • Schreiber, J. B., Nora, A., Stage, F. K., Barlow, E. A., and King, J. (2006). Reporting structural equation modeling and confirmatory factor analysis results: A review. The Journal of educational research, 99(6), 323-338.
  • Star, J. R., and Strickland, S. K. (2008). Learning to observe: Using video to improve preservice mathematics teachers’ ability to notice. Journal of Mathematics Teacher Education, 11(2), 107-125.
  • Seçer, İ. (2017). SPSS ve LISREL ile pratik veri analizi [Practical data analysis through SPSS and LISREL]. Anı Yayıncılık.
  • Şendurur, E., and Yildirim, Z. (2019). Web-based metacognitive scaffolding for internet search. Journal of Educational Technology Systems, 47(3), 433-455.
  • Tabachnick, B. G., and Fidell, L. S. (2015). Using multivariate statistics. California State University Northridge: Harper Collins College Publishers.
  • Taylor, M., Yates, A., Meyer, L. H., and Kinsella, P. (2011). Teacher professional leadership in support of teacher professional development. Teaching and teacher education, 27(1), 85-94.
  • Thomas, J. N. (2017). The ascendance of noticing: Connections, challenges, and questions. In E. O. Schack, M. H. Fisher, and J. A. Wilhelm (Eds.), Teacher noticing: Bridging and broadening perspectives, contexts and frameworks (pp. 507–514). Springer.
  • Topcu, A., and Ubuz, B. (2008). The effects of meta-cognitive knowledge on the preservice teachers’ participation in the asynchronous online forum. Educational Technology and Society, 11(3), 1−12.
  • Tripp, D. (1993). Critical incidents in teaching. Routledge.
  • Thomopoulos, N. T. (2012). Essentials of Monte Carlo simulation: Statistical methods for building simulation models. Springer Science and Business Media.
  • Van der Heijden, H. R. M. A., Geldens, J. J., Beijaard, D., and Popeijus, H. L. (2015). Characteristics of teachers as change agents. Teachers and Teaching, 21(6), 681-699.
  • van Es, E. A., and Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571-596.
  • van Es, E. A., and Sherin, M. G. (2008). Mathematics teachers’ “learning to notice” in the context of a video club. Teaching and Teacher Education, 24(2), 244-276.
  • van Velzen, J. H. (2012). Teaching metacognitive knowledge and developing expertise. Teachers and Teaching, 18(3), 365-380.
  • Veenman, M. V. J., Kok, R., and Blote, A. W. (2005). The relation between intellectual and metacognitive skills in early adolescence. Instructional Science, 33(3), 193–211.
  • Veenman, M. V. J. (2011). Learning to self-monitor and self-regulate. In R. Mayer and P. Alexander (Eds.), Handbook of research on learning and instruction. Routledge.
  • Woods, P., and Jeffrey, B. (1996). Teachable moments: The art of creative teaching in primary schools. Open University Press.
  • Whitebread, D., Coltman, P., Pasternak, D., Sangster, C., Grau, V., Bingham, S., … Demetriou, D. (2009). The development of two observational tools for assessing metacognition and self-regulated learning in young children. Metacognition and Learning, 4, 63–85.
  • Zhang, L. J., and Zhang, D. (2013). Thinking metacognitively about metacognition in second and foreign language learning, teaching, and research: Toward a dynamic metacognitive systems perspective. Contemporary Foreign Languages Studies, 396(12), 111-121.
  • Zohar, A., and Barzilai, S. (2013). A review of research on metacognition in science education: Current and future directions. Studies in Science education, 49(2), 121-169.

There are 91 citations in total.

Details

Primary Language Turkish
Journal Section Research Article
Authors

Ali Yiğit Kutluca 0000-0002-1341-3432

Somayyeh Radmard 0000-0002-9431-8081

Yilmaz Soysal 0000-0003-1352-8421

Publication Date March 28, 2022
Published in Issue: Year 2022, Volume: 11, Issue: 1

Cite

APA Kutluca, A. Y., Radmard, S., & Soysal, Y. (2022). Sınıf İçi Öğretimin ve Öğrencinin Değerlendirilmesine İlişkin Meta-Bilişsel Farkındalıklar Ölçeğinin (MFÖ-DEĞ) Geliştirilmesi ve Psikometrik Özelliklerinin Belirlenmesi. Cumhuriyet Uluslararası Eğitim Dergisi, 11(1), 60-74. https://doi.org/10.30703/cije.899403


© Cumhuriyet University, Faculty of Education