
VNU Journal of Science: Education Research, Vol. 36, No. 1 (2020) 1-12

Original Article

Assessing Institutional Learning Outcomes: Implications for Vietnam Higher Education Institutions

Pham Thi Tuyet Nhung*

College of Foreign Languages - Hue University, 57 Nguyen Khoa Chiem, Hue City, Vietnam

Received 22 May 2019; Revised 07 June 2019; Accepted 08 July 2019

Abstract: Institutional learning outcomes describe the knowledge and skills that all students of a given university, regardless of discipline, are expected to demonstrate. Some research has examined the assessment of learning outcomes at the program level in Vietnam, but none has addressed learning outcomes at the institutional level. This case study shares the experience of a U.S. comprehensive university in conducting assessment of institutional learning outcomes. The paper discusses the achievements, including a successful two-year institutional assessment implementation, effective use of a national Valid Assessment of Learning in Undergraduate Education (VALUE) rubric to assess student performance, the use of technology in data analysis, and best practices for communicating assessment results to multiple stakeholders to support leadership decision making; the challenges, including technology, faculty engagement, participation rates, validity, and reliability; and improvement plans. The researcher also makes recommendations for Vietnamese HEIs to strengthen internal quality assurance for both quality improvement and accountability purposes.

Keywords: Institutional learning outcomes, achievements, challenges, quality improvement, accountability.

1. Introduction

Over the past several years, various individuals, organizations, and legislators have continued to express concerns about the quality of higher education. Those concerns have triggered legislation and requirements at the federal and state levels, and by regional accreditors, to assess and report on student learning (Bassis, 2015 [1]; Jones, 2009 [2]; Nelson, 2014 [3]). The regional accrediting organizations identified and recognized by the Council for Higher Education Accreditation (CHEA) all include requirements related to assessing student learning outcomes for general education. The accreditors have requirements for articulating the outcomes as well as for measuring and documenting student success ("Council for Higher Education Accreditation", n.d.) [4].

* Corresponding author. E-mail address: nhungptt48@gmail.com. https://doi.org/10.25073/2588-1159/vnuer.4265
Assessment of general education has been going on for years. According to Penn (2011) [5], one of the first comprehensive assessments of general education took place in the late 1920s. Major initiatives in higher education assessment were undertaken from the mid-1980s to the early 1990s to assess general education, and universities are again seeing demand for detailed, comprehensive assessment. With all of these requirements, it is easy to lose sight of the reason for assessment: universities collect data, enter them into databases, and generate reports so that they can improve the learning and performance of students. Fletcher, Meyer, Anderson, Johnston, and Rees (2012) [6] stated that universities conduct assessment to provide information about student learning, student progress, teaching quality, and program and institutional accountability.

There are numerous ways of conducting effective general education assessment. The Association of American Colleges & Universities (AAC&U) Valid Assessment of Learning in Undergraduate Education (VALUE) project and the resulting rubrics have been implemented by many universities. The VALUE rubrics were developed as part of AAC&U's Liberal Education and America's Promise (LEAP) initiative ("About LEAP," n.d.) [7]. One advantage of implementing the VALUE rubrics is that studies such as the Multi-State Collaborative to Advance Quality Student Learning (MSC) and the Great Lakes College Association Project to Advance Learning, to name a few, report their findings and share the lessons learned through implementation. A recent report, On Solid Ground (McConnell & Rhodes, 2017) [8], provides detailed information from a large number of institutions. The VALUE rubrics were piloted and are used by a diverse range of post-secondary institutions, including community colleges, regional comprehensives, and R1 institutions. These data sets allow an institution to benchmark its students' performance against that of collaborating universities. Brown, McGreevy, and Berigan (2018) [9] point out that higher education institutions have typically functioned in an autonomous and siloed culture when implementing changes. Various programs and offices have operated independently of one another. The concept of holistic, institution-wide assessment can therefore be somewhat of a challenge given past practices and that autonomous nature. A cohesive framework and cooperation across campus are critical for effective implementation of general education assessment.

Similarly, accreditation is a major driver for Vietnamese higher education institutions (HEIs) to provide evidence of student learning. The new standards of higher education accreditation at both the institutional and program levels focus on assessment of student learning following the Plan-Do-Check-Act (PDCA) cycle to drive quality improvement (MOET, 2017; MOET, 2016) [10, 11]. Therefore, there is a need to create an internal quality assurance (IQA) system to meet such requirements from external stakeholders. Still, IQA remains a challenge for many Vietnamese HEIs (Nguyen, 2018) [12] and quality assurance offices (Pham, 2019) [13]. One study from Hue University shares the experience of implementing IQA under the ASEAN University Network - Quality Assurance (AUN-QA) framework to assess learning outcomes at the program level (Nguyen and Nguyen, 2017) [14], but no research has shared experience in assessing learning outcomes at the institutional level in the Vietnamese context. This case study shares the experience of a comprehensive university in the United States in conducting assessment of student learning at the institutional level, in order to help Vietnamese HEIs improve the quality of student learning and provide evidence of accountability for external stakeholders such as accreditors.

2. Method

This research used a case study as the major method to provide a rich description of the phenomenon (Yin, 1994) [15]. A case can be a person, a small group, a program, or an institution. As stated by Merriam (1998) [16], a case study provides an in-depth description of a single instance, phenomenon, or social unit.
Creswell (2014) [17] also stated that a case has a clear boundary and can provide an in-depth understanding of the case. The first step in conducting a case study is to define the case. The assessment process explained here is from a regional comprehensive university in the Midwest of the United States. Its Carnegie classification is Comprehensive University, offering both undergraduate and graduate programs, and its enrollment is just over 12,000 undergraduate and graduate students. The general education program has always had the mission of providing students with foundational knowledge and skills, primarily in the liberal arts and sciences, that encompass all baccalaureate programs. A frequent observation made by faculty and students alike was that the previous general education program did not appear to be a program at all, but rather a collection of unconnected courses. The academic programs and the general education program were operating in that siloed type of environment and not functioning cohesively, particularly with regard to assessment. For those reasons, the university sought a framework for a holistic assessment approach that would allow it to assess the impact of its general education.

Like many universities, the previous general education program focused on inputs, in the form of courses and their specific competencies, rather than on an outcomes-related perspective (Bruce, 2018) [18]. Courses were selected strictly by their alignment with the selected general education topic areas. Under the current general education program, courses must show how they align with and will meet the specific outcomes of the university general education program. Programs on campus can submit courses to the faculty senate general education committee for consideration for inclusion in the general education program. As part of that submission, they must include information on how they will meet and assess the prescribed outcomes. Courses are also reviewed by the general education committee for recertification, to ensure they are following the assessment plan and that student artifacts align with the desired outcomes.

This research tried to answer the following questions:

1. What is the assessment process for institutional learning outcomes?
2. What challenges did the university encounter, and what improvements has it made?
3. What key achievements has the university made?
4. What strategies does the university use to sustain the institutional learning outcomes system?

3. Findings

3.1. Assessment process of institutional learning outcomes

Assessment measures. In 2014, the university updated its general education curriculum to include areas of understanding comprising four key outcomes, which together include a total of ten competencies. To assess these competencies, the Valid Assessment of Learning in Undergraduate Education (VALUE) rubric (Rhodes, 2009) [19] was modified and applied across campus. This activity demonstrated the institution's commitment to ensuring that learning outcomes are achieved and that a degree reflects high quality, a goal of the Multi-State Collaborative (MSC). This effort also responded to the widespread push for standardized testing in higher education. Most importantly, the assessment of student learning using a modified VALUE rubric provided the opportunity for faculty to have conversations about improving student learning outcomes (Wehlburg, Carnahan & Rhodes, 2017) [20].

Assessment process. The university assessment system follows the six phases of the assessment cycle: (1) plan and identify outcomes, (2) collect data, (3) analyze data, (4) share results, (5) identify and implement changes, and (6) assess the impact of change (Kuh, Ikenberry, Jankowski, Cain, Ewell, Hutchings and Kinzie, 2015) [21].
The revised general education program serves student needs and the public interest by ensuring students have strong foundational skills, providing a broad, enriched academic experience that both complements and supports their study within specialized disciplines. To capture student learning of the ten general education competencies, the university has used three major assessment measures: the General Education Assessment (GEA) Exam, the modified VALUE rubrics, and the National Survey of Student Engagement (NSSE). The GEA and the modified VALUE rubrics serve as direct measures of student learning outcomes, and the NSSE serves as an indirect measure. This paper discusses only the newly implemented direct measure, the modified VALUE rubric.

In an effort to determine whether the teaching of the GE courses met the requirements of the new general education competencies, the university began working on an assessment plan and timeline for data collection. In 2015-2016, the university conducted a series of planning meetings with faculty teaching in the general education program to collectively define the process for data collection. In the Fall 2016 semester, the institution provided face-to-face as well as online training for all instructors on how to use the modified rubrics. It was determined that pilot data would be collected in the Spring 2017 semester. Student artifacts would be collected for five competencies: written communication, oral communication, quantitative literacy, critical/creative thinking, and managing information. As this was the first time the university had conducted an institution-wide general education assessment, instructors of all courses aligned to a specific competency were asked to voluntarily provide student artifacts for institutional assessment. Data for four competencies (Oral Communication, Quantitative Literacy, Creative/Critical Thinking, and Managing Information) were gathered in an Excel template, and the Written Communication competency was collected through an assessment management software (AMS) system. The purpose of this pilot was to ensure the assessment process was appropriate before collecting artifacts for the five competencies from all courses.

Two-Year Timeline. The data collection pilot was successful; therefore, from 2017-2018 the university implemented a two-year assessment plan for general education assessment (Table 1), using the course-embedded assessment (CBA) function in the AMS. Data are collected during the Fall semester, and in the Spring semester the results and opportunities for teaching and learning improvement are discussed and documented.

Table 1. Two-year general education assessment timeline

| Assessment and evaluation activity | Fall 2017-2018 | Spring 2017-2018 | Fall 2018-2019 | Spring 2018-2019 |
| Collect and evaluate data, including the processes | Competencies 1, 2, 3 & 5 | | Competency 4 | |
| Deliver report findings to constituents | | x | | x |
| Take actions where necessary | | x | | x |
| Review the competency if necessary | | x | | x |

Human Resources. To support the assessment of the general education program, additional resources were needed and had to be devoted to the process. The structure included administrative support and faculty input. The Vice Provost of Academic Programs and Services oversees the assessment activities. The university assessment coordinator is in charge of implementing the assessment process. The General Education Coordinator, a full-time faculty member with course release, supports communication of the purpose of assessment and the assessment process, and facilitates course-embedded assessment (CBA) training with the university assessment coordinator to streamline the process and to increase artifact submissions in the AMS.
Both the assessment coordinator and the general education coordinator are non-voting members of the faculty senate general education committee.

Data Collection. The alignment of the general education courses, the assessment process, and data collection is very intentional. The goal is to ensure courses maintain alignment with the competencies and that faculty can collect and report data with a minimal amount of additional workload. Any GE course going through the recertification process needs to demonstrate that its course learning outcomes and course assignments align with a specific GE competency. This ensures courses continue to align with the general education competencies and goals. All courses aligned to a skill-based competency are required to provide student artifacts from one assignment in their class. Faculty choose an assignment that meets all the dimensions of the modified VALUE rubric for university data collection. The intent is for faculty to utilize a normal or typical assignment that they are currently implementing in their course and to use that for the institutional assessment. This authentic assessment does not create much additional workload for faculty, as opposed to using an assignment designed solely for institutional assessment rather than as a component of student learning in their course. Since assessment is embedded within all sections of the courses and is evaluated by the faculty member teaching each section, the assessment process has been streamlined.

Advantages of Technology in Data Collection. In addition to the faculty-centered and authentic assessment process, data collection and data analysis through the AMS also streamlined the assessment process. The first advantage was that it integrated with the existing learning management system (LMS) and enabled a relatively automated transfer of information into the AMS. Faculty could therefore submit and grade student artifacts using the LMS they were already familiar with, which helped to encourage their participation. The second advantage of technology is the protection of confidential information: all data were loaded directly into the AMS, and only people with specific privileges were able to access the data. The third advantage was efficiency (e.g., time savings) in data analysis, as the assessment software could run various reports. Consequently, the university could collect a large sample of student artifacts across multiple competencies in a year. This comprehensive data collection enabled the university to capture a more accurate and complete picture of student learning and facilitated actions for improvement when examining the assessment results in the later steps. The fourth advantage of using technology for data collection was the ability to provide both faculty and the institution with individualized assessment reports based on their needs.

Assessment Results. In AY 2017-2018, faculty collected student artifacts from 230 sections aligned with Competency 1 (Written Communication), Competency 2 (Oral Communication), Competency 3 (Quantitative Literacy), and Competency 5 (Managing Information). Fifty-seven percent (2,858) of the artifacts had been assessed by the instructors and loaded into the AMS. For the remaining 43%, in some cases faculty did not collect the data, and in others, improvements in the assignments are needed before faculty can independently score the artifacts. The goal is to have 100% of the artifacts scored. In the future, to continue to ensure the sustainability of the assessment process, the university will likely implement sampling for larger sections. Of the four competencies, Competency 3 received the highest response rate (76%) and Competency 2 received the lowest response rate (42%).
Table 2. Modified VALUE rubric response rates, 2017-2018

| | Written Communication | Oral Communication | Quantitative Literacy | Managing Information | Total |
| Total students | 1,610 | 828 | 1,218 | 1,330 | 4,986 |
| Total responses | 752 | 350 | 924 | 832 | 2,858 |
| % of response | 47% | 42% | 76% | 63% | 57% |

On average, 98% of freshmen met the requirement, scoring one or above on the modified VALUE rubric. Of the four competencies, Oral Communication and Quantitative Literacy had the higher average scores (2.4). Figure 1 shows the distribution of rubric ratings for each competency.

Figure 1. Assessment results by competency (percentage of artifacts at each rubric rating)

| Rating | Written Communication (N=534) | Oral Communication (N=297) | Quantitative Literacy (N=603) | Managing Information (N=494) |
| 0 | 1% | 2% | 3% | 1% |
| 1 | 47% | 13% | 14% | 22% |
| 2 | 36% | 43% | 36% | 53% |
| 3 | 11% | 21% | 37% | 15% |
| 4 | 5% | 19% | 9% | 10% |
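To illustrate how summary figures of this kind can be reproduced outside the AMS, the following Python sketch recomputes the response rates in Table 2 and the share of artifacts at or above the benchmark (a rubric rating of 1 or higher) from the distributions in Figure 1. The data values are copied from the tables above; the script itself is only an illustrative example and is not part of the university's assessment software.

```python
# Illustrative sketch: recompute response rates (Table 2) and the share of
# artifacts meeting the benchmark (rating >= 1, Figure 1) from the published
# figures. Data values are copied from the article; the code is an example only.

artifacts = {
    # competency: (total students, artifacts scored)
    "Written Communication": (1610, 752),
    "Oral Communication": (828, 350),
    "Quantitative Literacy": (1218, 924),
    "Managing Information": (1330, 832),
}

# Percentage of scored artifacts at each rubric rating (0-4), from Figure 1.
rating_distribution = {
    "Written Communication": {0: 1, 1: 47, 2: 36, 3: 11, 4: 5},
    "Oral Communication": {0: 2, 1: 13, 2: 43, 3: 21, 4: 19},
    "Quantitative Literacy": {0: 3, 1: 14, 2: 36, 3: 37, 4: 9},
    "Managing Information": {0: 1, 1: 22, 2: 53, 3: 15, 4: 10},
}

total_students = sum(total for total, _ in artifacts.values())
total_scored = sum(scored for _, scored in artifacts.values())
print(f"Overall response rate: {total_scored / total_students:.0%}")

for competency, dist in rating_distribution.items():
    students, scored = artifacts[competency]
    response_rate = scored / students
    met_benchmark = sum(pct for rating, pct in dist.items() if rating >= 1)
    mean_rating = sum(rating * pct for rating, pct in dist.items()) / 100
    print(f"{competency}: response rate {response_rate:.0%}, "
          f"{met_benchmark}% at or above benchmark, mean rating {mean_rating:.1f}")
```

Run against the published figures, this reproduces the overall 57% response rate and benchmark attainment of roughly 96-100% per competency, consistent with the 98% figure reported above; small discrepancies arise because Figure 1 reports rounded percentages rather than raw counts.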
In Spring 2018, the University Assessment Coordinator prepared the university GE assessment report and shared it with several groups and committees across campus, including the Academic Council, department chairs, the General Education Committee, the Faculty Senate University Assessment Council (FSUAC), and the faculty group that had been involved in the data collection with the modified VALUE rubrics. The purpose of the meeting with the Academic Council was to provide the assessment results and discuss strategies to improve the next year's response rates using the modified VALUE rubrics. The discussion with the GE Committee was intended to facilitate their use of assessment results in the recertification process. In addition to aggregated assessment results for the whole university, the assessment coordinator also provided the assessment report by competency. The faculty meetings were set up by the Vice Provost, the university assessment coordinator, and the GE coordinator to share the results and ask for feedback about the assessment process. One of the key and critical components of the assessment process remains a challenge: documenting actions for improvement for each competency.

3.2. Challenges encountered and improvements

Challenges encountered. After two years of implementation, the university still has some challenges to overcome. The first challenge the university encountered is technology. Although it provides the ability to collect and analyze a great deal of information, some faculty had implementation issues, such as being unable to create a link in the LMS, inappropriate data display, or problems with artifact submission by students. The second challenge is faculty interpretation of the modified VALUE rubrics. Although training on the modified VALUE rubrics was provided before data collection, some faculty still had a hard time determining and assigning rubric scores to their own assignments, especially when a freshman who scored one on the rubric still earned an A in the course. The third challenge is the participation rate across the institution. Although more than two thousand artifacts were collected, they accounted for only 57% of the population. Some faculty decided not to submit any artifacts from their courses into the system, and some had difficulty separating out the individual artifacts. The fourth challenge is the lack of infrastructure to engage the faculty who are directly involved in the assessment process in effectively discussing student learning results and identifying changes for quality improvement. Finally, the university assessment results relied on one artifact from one assignment; therefore, the reliability of the results was sometimes questioned, a barrier to making appropriate changes for improvement.

Improvements. Based on the challenges encountered, in AY 2018-2019 the university prioritized three solutions to facilitate closing the loop in the assessment process. Acknowledging the value of faculty coming together to discuss student learning and pedagogy in order to identify opportunities to better support teaching and learning in GE courses is critical. The first improvement is to create a time and place for faculty to engage in deep, meaningful conversations about student learning and effective teaching. To facilitate this strategy, the university established lead faculty for each competency. The major responsibilities of these faculty are to lead the discussion of assessment results within their group and to document feedback, recommendations to improve the assessment process, and possible actions for improvement. The university provides a template with the key components of the assessment cycle to facilitate the documentation of meeting minutes. The second priority is to improve the validity and reliability of student artifacts. The university is currently providing training and workshops on "assignment design" and a "norming" workshop series facilitated by the university assessment coordinator and external presenters. In the following semesters, lead GE faculty in each competency will facilitate these trainings for their own groups annually. These lead faculty will serve as facilitators to promote professional development opportunities and to coordinate faculty meetings to discuss and review actions taken in response to learning outcomes data. The third improvement the university is working on is the additional requirement of utilizing assessment data in GE recertification. Previously, the GE committee ensured that course learning outcomes and course assignments aligned with GE competencies; the current practice is also to ensure that student performance meets the expectations of the course learning outcomes and the course assignment.

3.3. Key achievements
The first advantage of this assessment process is a consistent assessment process for all GE competencies, which benefits accreditation-related efforts. The goal is to create processes and strategies that make assessment practice and assessment results visible to all faculty. This is the first time the university has conducted an institution-wide authentic assessment following the national authentic assessment instrument, the VALUE rubric. The intent is to capture the 21st-century skills that all graduates need to demonstrate by graduation. To facilitate the implementation, the university set up GE assessment plans and a two-year timeline to collect data, provides multiple assessment-related trainings for faculty throughout the academic year, and utilizes a central AMS to store and analyze assessment data.

The second advantage of this process is the widespread faculty engagement in the assessment process, from assignment design to pedagogy, data collection, and discussion of assessment results. Two features of this process, personnel structure and technological tools, distribute the responsibility for assessment of student learning outcomes so that no one person is solely responsible for the assessment. Multiple coordinators at different levels (university, college, department, and competency) facilitate faculty engagement in meaningful discussion of assessment findings and regular conversations about teaching practices. Most importantly, faculty can experience assessment activities as opportunities for their own learning and professional growth when attending the annual training on teaching and learning improvement. At the same time, lead faculty serve as the leaders of their groups to facilitate closing-the-loop discussions.

The third advantage of this assessment process is that it also allows individual faculty to evaluate their own practice. After attending meetings with their group to discuss assessment results within their competency, faculty are encouraged to run the CBA report, watch a video on the assessment website about strategies for interpreting assessment data, and then fill in the GE Assessment Self-reflection sheet (Appendix A). This is a meaningful process that allows faculty to determine the strengths and weaknesses of student learning in their own course and then decide what actions they can take for improvement. The goal is not to evaluate faculty assessment efforts but to assist them in using assessment results to evaluate their own practices. It is hoped that multiple minor changes systematically implemented over time can produce a substantive impact on teaching and learning (Stanny, Gonzalez and McGowan, 2015) [22].

3.4. Sustainable strategies

As short-term goals, the university has three plans to improve the assessment of the GE program. The first plan is to improve the alignment of student learning outcomes at different levels (university, GE, and academic programs) to facilitate skill-based assessment at the senior level. Senior-level data not only ensure that students have had opportunities to practice, develop, and improve skills related to the competencies, but also allow the university to provide evidence of student growth over time. The University Assessment Committee will work with programs to ensure appropriate skills are embedded in their program learning outcomes. A pilot will be implemented in Spring 2019 in which faculty teaching capstone courses will use the modified VALUE rubric to assess student performance. A single capstone assignment can be used to assess multiple skills; faculty will decide which skills the capstone aligns with and select the appropriate rubric(s). The capstone assessment pilot will support the university's plan to fully implement assessment across students' entire academic timeframe.

The second plan is to improve the validity and reliability of assessment results so as to encourage more meaningful actions for improvement. The university will build an inter-rater reliability system in which a second faculty member assesses a sample of artifacts for the five competencies. Statistical power will be tested to ensure a representative and sufficiently large sample.
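One common way to operationalize such an inter-rater reliability check is to have the second rater score the same sample of artifacts and then compute an agreement statistic such as Cohen's kappa. The minimal Python sketch below illustrates that idea using hypothetical rubric scores; the article does not prescribe a particular statistic or tool, so this is an assumption for illustration only.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same artifacts on a 0-4 rubric."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired ratings"
    n = len(rater_a)
    # Observed proportion of artifacts on which the two raters agree exactly.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal rating frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[r] * counts_b[r] for r in set(counts_a) | set(counts_b)) / (n * n)
    if expected == 1:  # both raters used a single identical category throughout
        return 1.0
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (0-4) given by the course instructor and a second
# rater to the same sample of artifacts; real data would come from the AMS sample.
instructor = [2, 3, 1, 2, 4, 2, 3, 1, 2, 2]
second_rater = [2, 3, 2, 2, 4, 1, 3, 1, 2, 3]

print(f"Cohen's kappa: {cohen_kappa(instructor, second_rater):.2f}")
```

A kappa value well above chance agreement would support the reliability of instructor-assigned scores, while a low value would point back to the "norming" workshops described above.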
Finally, the university will consider creating a GE Assessment Committee to discuss and continue to improve the GE assessment process. Right now, the bulk of the GE assessment activities are still initiated and overseen at the academic administrative level. Transitioning the assessment functions to the GE committee, or forming a committee specifically addressing GE assessment, will transfer some of the ownership to faculty and help with the dissemination of information. This committee could also support inter-rater reliability as well as the documentation of discussions and recommendations for annual assessment reports.

To sustain the culture of continuous improvement, the university needs to maintain some long-term strategies. The first strategy is to provide continuous professional development opportunities for GE faculty, especially adjuncts. The university continues to have faculty who seek to determine whether the pedagogical changes they make in a course will produce improvement in student learning, and who wish to pursue research and scholarship opportunities related to assessment based on those findings. These efforts can lead to the creation of an assessment network in which faculty design and develop a common course-based assignment for courses. The second strategy to build the culture of assessment is to hold annual teaching and learning fairs, poster sessions, workshops, or think tanks where faculty facilitate sessions on assessment results and their implications. The major goal of these events is to enhance faculty understanding of the assessment process, facilitate the use of data, evaluate the entire assessment cycle, and determine whether the assessment process leads to real changes in student learning. The final strategy is to engage students in the GE assessment process. Although the university administers the NSSE, it is not administered annually. To triangulate assessment data from both direct and indirect measures, instructors can ask students to reflect in class and use that feedback as indirect, authentic assessment evidence in addition to the student assignment artifacts (Hutchings, 2018) [23]. That feedback could include qualitative data, which the process has not yet formally included.

4. Conclusion

As discussed in the literature review, there is limited research about the implementation of IQA in the Vietnamese context, and there is no specific research about the assessment of institutional learning outcomes. This case study provided detailed steps, from choosing the assessment measure to analyzing the data, to facilitate implementation at other institutions. In addition, the challenges this case encountered, the achievements it made, and the strategies the university uses to sustain the IQA system can serve as useful examples for other institutions. Vietnamese HEIs can implement this assessment process for quality improvement and accountability, especially since the current accreditation standards encourage institutions to provide evidence of the quality of student learning.

First, Vietnamese HEIs should look at the institutional mission to set up appropriate institutional learning outcomes (ILOs) for the sixty credits of the first two years. A good practice for defining ILOs is to review the list of 21st-century skills that AAC&U developed and choose the skills necessary for the Vietnamese context. Second, institutions should require courses in the first two-year curriculum to align with the appropriate ILOs. To ensure this alignment, the course learning outcomes need to address the ILO language in the course objectives. Third, Vietnamese HEIs should choose reliable assessment measures to collect data. The VALUE rubric is an initiative in U.S. assessment practice to move away from standardized exams toward authentic assessment, using authentic student artifacts to improve student learning. Some U.S. HEIs simply used the available assessment rubric to collect data, some adopted the language of the rubric, and others used the VALUE rubric as a framework to build their own rubric. Vietnamese HEIs can choose the appropriate practice to implement. The researcher recommends using the available rubric and then making changes later if any issues arise.
Fourth, one of the keys to engaging faculty is to provide guidance on and understanding of the entire assessment process, why it is being undertaken, and what the outcomes of the process will be used for. Vietnamese HEIs should provide professional development opportunities for faculty teaching the courses on how to design assessments that align with the rubric, how to read, integrate, and use the rubric to score student assignments, and how to provide consistent scoring across courses. This is a significant step toward avoiding the challenges in validity and reliability of data collection that this case study encountered. Figure 2 provides additional information on how Vietnamese HEIs can share assessment results with multiple committees to close the assessment loop for quality improvement of student learning. Lastly, Vietnamese HEIs should have a meta-assessment, that is, an assessment of the assessment process itself, such as peer review of assignment design to ensure the validity of the assignments, calibration to ensure the reliability of student scores across multiple courses, and surveys of faculty perceptions of the assessment process.

These practices will help institutions identify the strengths and weaknesses in the process so that they can make improvements and, most importantly, provide evidence for institutions to allocate appropriate resources to address the weaknesses. The implementation in this case study aligns with the suggestions from eight case studies supported by UNESCO, namely that IQA should be based on national accreditation requirements and international best practice (Martin, 2017) [24]. This case study of ILO assessment demonstrated the four key components of PDCA required by Vietnamese national accreditation in higher education as well as current assessment initiatives in the U.S. Further research could examine how a Vietnamese university learns this process and implements it successfully in the Vietnamese context.

Figure 2. Institutional learning outcomes assessment process.

References

[1] M. Bassis, A primer on the transformation of higher education in America. http://www.learningoutcomeassessment.org/documents/BassisPrimer.pdf/, 2015 (accessed 1st April 2019).
[2] D.A. Jones, Higher education assessment - Who are we assessing, and for what purpose? https://www.aacu.org/publications-research/periodicals/higher-education-assessment%E2%80%94who-are-we-assessing-and-what-purpose/, 2009 (accessed 5th March 2019).
[3] C. Nelson, Assessing assessment. https://www.insidehighered.com/views/2014/11/24/essay-criticizes-state-assessment-movement-higher-education/, 2014 (accessed 4th April 2019).
[4] Council for Higher Education Accreditation (n.d.). https://www.chea.org/regional-accrediting-organizations/ (accessed 10th April 2019).
[5] J.D. Penn, The case for assessing complex general education student learning outcomes, New Directions for Institutional Research 149 (2011) 5-14. https://doi.org/10.1002/ir.376.
[6] R. Fletcher, L. Meyer, H. Anderson, P. Johnston, M. Rees, Faculty and students' conceptions of assessment in higher education, Higher Education 64 (1) (2012) 119-133. http://www.jstor.org/stable/41477923.
[7] About LEAP (n.d.). https://www.aacu.org/leap/ (accessed September 01, 2018).
[8] K.D. McConnell, T.L. Rhodes, On solid ground. https://www.aacu.org/OnSolidGroundVALUE/, 2017 (accessed 10th April 2019).
[9] S. Brown, J. McGrevy, N. Berigan, Evidence-informed improvement through collaborative professional integration, New Directions for Teaching and Learning 155 (2018) 55-64. https://doi.org/10.1002/tl.20303.
[10] MOET, Circular 12/2017/TT-BGDĐT promulgating regulations on accreditation for higher education institutions, Hanoi, Vietnam: The Author, 2017.
[11] MOET, Circular 03/2017/TT-BGDĐT promulgating regulations on accreditation for higher education programs, Hanoi, Vietnam: The Author, 2016.
[12] CEA-HCM, Vietnamese accreditation system: achievements, challenges and lessons learned from international accreditation models, Paper presented at a conference on Vietnamese higher education, 2018.
[13] Pham Thi Huong, Limited legitimacy among academics of centrally driven approaches to internal quality assurance in Vietnam, Journal of Higher Education Policy and Management 42 (2) (2019) 172-185. https://doi.org/10.1080/1360080X.2019.1565298.
[14] Nguyen Hong Giang, Nguyen Hong Son, Quality assurance procedure for training programs of Hue University in accordance with AUN-QA, VNU Journal of Science: Education Research 33 (1) (2017) 47-57.
[15] R.K. Yin, Case study research: Design and methods, Sage Publications, Thousand Oaks, CA, 1994.
[16] S. Merriam, Qualitative research and case study applications in education, Jossey-Bass Publications, San Francisco, CA, 1998.
[17] J.W. Creswell, Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.), Sage Publications, Thousand Oaks, CA, 2014.
[18] R.T. Bruce, Assessment in action: Evidence-based discussions about teaching, learning, and curriculum, New Directions for Teaching and Learning 10 (2) (2018) 1-7. https://doi.org/10.1002/tl.20260.
[19] T. Rhodes, Assessing outcomes and improving achievement: Tips and tools for using the rubrics, Washington, DC: Association of American Colleges and Universities, 2009.
[20] C. Wehlburg, J. Carnahan, T. Rhodes, Multi-State Collaborative to advance quality student learning. https://www.aacu.org/sites/default/files/MSC_Demonstration_Year.pdf/, 2017 (accessed 10th April 2019).
[21] G.D. Kuh, S.O. Ikenberry, N.A. Jankowski, T.R. Cain, P.T. Ewell, P. Hutchings, J. Kinzie, Using evidence of student learning to improve higher education, San Francisco, CA: Jossey-Bass, 2015.
[22] C. Stanny, M. Gonzalez, B. McGowan, Assessing the culture of teaching and learning through a syllabus review, Assessment & Evaluation in Higher Education 40 (7) (2015) 898-913.
[23] P. Hutchings, Helping students develop habits of reflection: What we can learn from the NILOA Assignment Library, Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA), 2018.
[24] M. Martin, Internal quality assurance: Enhancing higher education quality and graduate employability, UNESCO Publishing, 2017.
Appendix A. General Education Assessment Self-reflection

Competency:

Note: Please do not provide individual information in the self-reflection.

- How does the student learning in your course, based on the CBA data, compare with the institutional assessment results? (Benchmark)
- What did you learn from the individual course assessment result? Did you find any common patterns occurring in your courses?
- What are the strategies you implement in class in order to maintain and support student learning?
- If possible, what new strategies, materials, or pedagogy will you implement in this section to better support student learning?

Thank you for your feedback!