1. Steps to examine and eliminate bias and to establish fairness, accuracy, and consistency. How are the identified procedures being implemented by the unit?
Level I Field Experiences: Faculty are trained to identify Level I courses as those that require observation and minimal participation. Faculty review their syllabi with candidates, emphasizing the purpose and expectations of the required field experience. Candidates are also provided the evaluation form, with an explanation of the rubric and how it is used by the cooperating teacher. The course instructor is in direct contact with the cooperating teacher, providing him/her with the evaluation form and rubric along with information regarding the expectations of the field experience. Once completed evaluations are received by the university instructor, the results are shared with the candidate during a face-to-face conference. A dispositions survey is implemented in the same manner.
Level II Field Experience: These experiences are part of the methods courses prior to student teaching and require participation in the teaching process. Course instructors review the evaluation check sheets and rubrics with candidates, emphasizing this level of participation. The same implementation process occurs, except that no dispositions survey is used.
TEP Interview: Standard interview questions were selected from educational resources to ensure consistency and fairness. A minimum of two interviewers is required, with each interviewer completing an individual evaluation using precisely worded rubrics. Interviewers then complete a summary evaluation in which they reach consensus regarding strengths, weaknesses, and final recommendations. Interviews are conducted in separate areas free of distractions. Candidates may ask for further explanation of interview questions if necessary. All results are returned to the Office of Education Student Services, where they are entered into a database; candidates may request their results there. Copies of the results are sent to the candidate's advisor and are also placed in each candidate's advising folder. Data reports are generated to examine consistency between interviewers and rates of acceptance/denial into the program.
Written Comprehensive Exams: Comprehensive exam questions are derived from published test banks and are aligned to national standards across all programs, and internally written questions have been eliminated. The Counseling program faculty, for example, conduct an item analysis to eliminate poor questions. Candidates receive information regarding the dates and times of exams; specific information is available online through Bb and in the student handbook.
Portfolios: Instructions for compilation are provided through Bb and the student handbook. Candidates also receive the scoring rubric online and from instructors of the assigned checkpoint courses. Faculty and advisors are trained in the purpose and use of the portfolio and its evaluation.
Research Projects: This requirement is listed on advanced program check sheets and described in the catalog. Specific details and the grading rubric are part of the course syllabus (TCED 791). Instructors of this course review the expectations of this requirement with all enrolled candidates; information is also available on Bb. An explicit timetable for submission of the research project components is listed in the syllabus and reviewed by the instructor. An oral presentation is also required, with specific information and rubrics available in the syllabus and on Bb. At the end of each semester, candidate data are entered electronically into the UAS.
2. Compilation of data for key assessments of the initial and advanced level programs. How are data compiled utilizing the UAS?
Below are links to an explanation of the
that was posted on UT Martin's NCATE website for the Continuous Improvement Institutional Report submitted in June.
There are two other items:
A Field Experience course listing, which is a communication sent to faculty (full-time, part-time, and adjunct) by the Director of Education Student Services notifying them of which courses fall into which category. The Field Experience Level I and Level II forms are also sent via the same e-mail.
A Key Assessment Data Collection deadline chart indicating when forms are due and which courses require which forms.
3. Summarized data for initial programs. The exhibit data tables provided individual programs with missing data sets. What summarized data exists?
Each semester the Director of Assessment presents summary data at the Unit NCATE Retreat as evidenced by agendas and minutes.
Some data are unavailable due to a software glitch and/or low-enrollment programs. One year ago we discovered that the program that downloaded information from Banner was date-specific. For example, when the Dean's office entered performance assessment data into Banner, the date of entry was used. If that date fell between semesters, the information was not downloaded at all. To correct the problem, all data entry assistants have been trained to enter the date of the assessment, which allows the information to be downloaded from the appropriate semester.
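The download problem described above can be illustrated with a short sketch (the semester date ranges and function name here are hypothetical, for illustration only): a record keyed on its entry date can fall into no semester window when entry happens during a break, so it is never picked up by a semester-based download, whereas keying on the assessment date always maps the record to the term in which the assessment occurred.

```python
from datetime import date

def semester_for(d):
    """Return the semester window containing date d, or None during breaks.

    The term boundaries below are hypothetical, not UT Martin's actual
    academic calendar; they only illustrate the gap between semesters.
    """
    if date(d.year, 1, 10) <= d <= date(d.year, 5, 5):
        return f"Spring {d.year}"
    if date(d.year, 6, 1) <= d <= date(d.year, 8, 5):
        return f"Summer {d.year}"
    if date(d.year, 8, 20) <= d <= date(d.year, 12, 15):
        return f"Fall {d.year}"
    return None  # between semesters: a semester-based download skips these

# An assessment completed during Spring term but entered during the May break:
assessment_date = date(2010, 4, 28)  # keyed on this after the fix
entry_date = date(2010, 5, 20)       # keyed on this before the fix

print(semester_for(assessment_date))  # Spring 2010 -- downloaded correctly
print(semester_for(entry_date))       # None -- record never downloaded
```

Training data entry assistants to record the assessment date is therefore the minimal fix: no change to the download program itself is needed, because every assessment date falls inside some term window.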
4. Use of assessment data for improvement. Over the past three years, how many unsuccessful candidates were advised into a non-licensure program that leads to graduation with a Bachelor of Science in Education (non-teaching)?
5. Function of the Teacher Education Effectiveness Committee. Who are the members? What is their involvement in the assessment system? How do they interact with the Director of Assessment?
The TEEC is part of the assessment system's feedback loop; the Director of Assessment delivers reports to the TEEC.
6. The operation of the unit assessment system. What data are being collected and when? What data are being collected, aggregated, summarized, and analyzed? How are the data being used and by whom? Why are data cells empty on some of the reports in the exhibits?
The Assessments with Due Dates chart represents a timeline of when specific data are collected by the UAS. Faculty have been asked to contribute examples of how data are being used, and by whom, with regard to their programs. One issue currently being resolved is entering data into the system within a shorter time frame, which will assist in the timely summary and analysis of data.
An example of using data to make program changes comes from the Department of HHP. After close examination of candidate and program data, it was determined that candidates' poor performance on the PRAXIS content-area exam was related in part to the timing of the TEP Interview and admission to the Teacher Education Program. Specific problem areas were examined by the pedagogy faculty and the department curriculum committee. Course syllabi and program course requirements were also compared to the knowledge and skills required for passing the licensure exam. As a result, curriculum and syllabus changes are in process, with others currently being reviewed. Meeting minutes are available. Examples:
Add TCED 211 to HHP program requirements
New class: HPED 410 Teaching Models in Physical Education (taken concurrently with HPED 309)
New class: HPED 321 Individual Lifetime Activity, added as a complement to HPED 320 Foundation and Sport Skills (team sports)
HPED 411 Secondary Instruction Strategies and Curriculum Development: will require the PRAXIS Study Guide in addition to the textbook
HPED 309 Methods of Teaching Elementary Health and Physical Education: added prerequisite of admission to TEP
HPED 350 Motor Development and Learning Across the Life Span: added comprehensive final exam to course assessment
See Item #3 above regarding empty data cells. See the IR Standard 2 for other examples of data driven changes.