Saturday, October 31, 2015

Software to Indict Plagiarists and the edTPA

Remember that memo that Deputy Commissioner John D’Agati of New York State’s Education Department wrote back in July about what would happen to candidates of questionable moral character caught cheating on the edTPA? Here’s a piece to refresh your memory:
“As part of the edTPA scoring process, originality detection software is employed to compare all edTPA submissions nationwide against all other edTPA submissions received, including outside written sources and other sources of material. The software reports any substantial degree of matching between submitted edTPA portfolios. In cases where there is substantial matching, a specially trained portfolio reviewer may then elect to seek enforcement action against the candidate(s) involved and/or refer the candidate(s) to the State Education Department for enforcement action.” (7-23-15)
Now there’s another memo, released a few days ago, and the originality detection software used at my college, called SafeAssign, picked up a 46% match! See if you can figure out why:

“It is important that candidates be made aware that, as part of the edTPA submission process, originality detection software is employed to compare all edTPA submissions against all new submissions, including outside written sources and other sources of material. In cases where there is substantial matching, a candidate’s edTPA score may be voided, and the candidate may not be eligible for the edTPA Safety Net. In cases when candidates have already received their teaching certificate, the Department may seek revocation of the certificate. Candidates will be given the opportunity to appeal a decision to void their scores, and that process will be explained in any communication they would receive if their score is voided.” (10-28-15)
All that’s really new in the latest memo is an acknowledgement that candidates should not be prevented from collaborating with each other pursuant to SCALE’s guidelines regarding acceptable forms of support, which were revised in April of 2014 after some questioned the extent to which peers were permitted to help each other through the edTPA process (I wrote about that too). In that document, SCALE stated the obvious: “within their coursework and key program assignments and activities, candidates receive feedback from instructors and fellow candidates.” Professor Laura Davies, in a thoughtful September 23rd essay on the first D’Agati memo and the questioning of originality in the edTPA process, raises important issues about the harmful unintended consequences of generating so much fear around collaboration, a prominent feature of the work of teaching in a profession that already suffers from teachers’ isolation from one another.
Recently the first significant empirical study of the edTPA rollout in New York and Washington was published, and among the findings of Meuwissen and Choppin is confirmation that candidates, desperate for clarity and help navigating the complex handbook instructions and rubric guides to ensure a passing score on the edTPA, have found clever, secretive ways to give and get support, mostly through social networking. Candidates interviewed in the study also described navigating tensions over support in their school placements, which can put considerable constraints on the teaching they do for the edTPA. In my own experience with student teachers, just working out the logistics of which students will be videotaped, what curriculum will be used, and when and where to do the taping is, more often than not, a source of stress and despair.
Now imagine the candidate who finally submits an edTPA portfolio, believing he or she has properly cited the use of the school’s curriculum in the lesson plans and commentary, only to turn up a high percentage of matching in the originality detection software used by scorers at Pearson. Theoretically such a portfolio is flagged and sent to a specialist to determine the degree of culpable intent in the presumed crime. Or imagine a candidate flagged because the common phrases and routine descriptions in the commentaries happen to match text in other submissions. Alan Singer addressed this problem here. The software doesn’t judge; that’s up to a specialist at Pearson. But how exactly does that specialist make a judgment, one that could jeopardize employment, either through delays in receiving a score or through a guilty verdict where none is merited?
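To see why innocent overlap happens, consider how text-matching generally works. SafeAssign's and Pearson's actual algorithms are proprietary; the sketch below is only a toy version of the general technique (shared word n-grams between two texts), showing why formulaic teacher language can produce a high match percentage between two honest, independent authors.

```python
# Toy n-gram overlap matcher. This is NOT any vendor's real algorithm,
# just an illustration of why routine phrasing inflates match scores.

def ngrams(text, n=3):
    """All consecutive n-word sequences in a text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def match_percentage(submission, other_text, n=3):
    """Percent of the submission's n-grams that also appear in other_text."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return 100 * len(sub & ngrams(other_text, n)) / len(sub)

# Two candidates independently describing routine lesson logistics:
a = "students will work in small groups to complete the graphic organizer"
b = "students will work in small groups to discuss the reading"
print(round(match_percentage(a, b)))  # → 56
```

Neither sentence is plagiarized from the other, yet more than half the trigrams overlap, because lesson-plan boilerplate is boilerplate.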
What’s more, why would ANY candidate agree to allow the edTPA portfolio to be used by SCALE, Pearson, or the college or university, when there is a risk that at some point some stranger could plagiarize from that portfolio without the knowledge of the author, and that author could be subjected to having his or her license revoked?
While we’re considering what is legally unfair about all of this, have a look at the fine print regarding candidates’ compliance with edTPA rules (italics are mine):
From edTPA.com site:
“edTPA Rules of Assessment Participation
COMPLIANCE WITH ASSESSMENT RULES
By registering for edTPA, you are agreeing to abide by the current Rules of Assessment Participation for edTPA and all rules, procedures, and policies contained on the current edTPA.com website and/or in the current edTPA Handbook for the content area for which you have registered.
For the purpose of these Rules of Assessment Participation, the following definitions apply:
       "Evaluation Systems." Evaluation Systems, a business of NCS Pearson, Inc. (referred to throughout as Evaluation Systems).
       "Program." The educator preparation program in which an edTPA candidate is enrolled as indicated by the candidate at the time of registration.
       "State Agency." The educator certification agency in any state in which an edTPA candidate is pursuing certification with this assessment.
Rules of Assessment Participation for edTPA
1     PURPOSE OF ASSESSMENT: I understand that this assessment is administered for the purpose of fulfilling a program requirement and/or a state teacher licensure requirement and is only to be taken by individuals to fulfill such requirement(s). I certify that I am taking this assessment for which I have registered, to fulfill a program requirement or for the purpose of teacher licensure.
2     ORIGINALITY OF SUBMISSION: I understand that by submitting my edTPA materials, I am confirming that I am the person who has completed the assessment, that I have primary responsibility for teaching the students/class during the learning segment profiled in this assessment, that the video clip(s) submitted show me teaching the students/class profiled, that the student work included in the documentation is that of my students and completed during the learning segment documented in this assessment, that I am the sole author of the commentaries and other written responses to prompts and other requests for information in this assessment, and that I have cited all materials in the assessment whose sources are from published text, the internet, or other educators.
3     PERMISSIONS AND CONFIDENTIALITY: I understand that I am responsible for obtaining appropriate permissions from the parents/guardians of my students and from adults that appear on the video clip(s) I submit. I agree to produce such permissions if requested after I submit my assessment. I have ensured confidentiality of individuals appearing in the video clip(s) I submit by uploading the video only to the designated Pearson ePortfolio system, an integrated third party edTPA Platform Provider system, or other secure system designated by my program. I understand that I may use my assessment materials according to the parameters of the release forms obtained for children and/or adults who appear in the video. Because parents/guardians and/or adults have not typically granted permission for public use of the videos in which they or their children appear, I will not display videos publicly (i.e., personal websites, YouTube, Facebook) without expressed permission for this purpose from those featured in the video.
4     ASSESSMENT MATERIALS: I acknowledge that I am not permitted to reproduce or share any of the information or materials from edTPA handbooks or support materials (Making Good Choices or other materials with Stanford copyright) for commercial purposes. If I do reproduce information or materials from the edTPA handbooks or related materials for personal use, I will properly attribute the copyright of such materials to Stanford University.
5     USE OF ASSESSMENT: I agree that my edTPA submission, including text, graphics, digital files and video or audio recordings, without the use of my name or other identifying information, may be used by Stanford University and/or Evaluation Systems edTPA program development and implementation, including scorer training associated with the program. If I provided consent as part of my response to registration questions, my submission, without use of my name or other identifying information, may also be used for continued edTPA program activities conducted by Stanford University and/or Evaluation Systems such as future validity and reliability studies of the edTPA. Stanford University and Evaluation Systems will not show candidate materials publicly, make them available in a non-secure way, or use them as exemplars for marketing purposes.
6     SCORE REPORTING AND CANCELLATION: I understand that my results will be reported to me; to the program authorized by me during registration and/or State Agency, if applicable; and to any other institution, entity, or person authorized or required by law to receive this information. edTPA results are anonymously provided to SCALE with candidate responses to registration background questions which address edTPA placement context and demographics of edTPA candidates, including educator preparation program and state affiliation, for the purpose of edTPA assessment analyses and assessment development. edTPA results received by SCALE will not include candidate name or other personally identifying information such as date of birth or partial social security number. I understand that any information provided as part of registration may be used to report scores or to contact me regarding assessment- or program-related issues. Once I submit my assessment, I cannot cancel the scoring or score reporting.
7     CONFORMITY WITH PROCEDURES: I understand that if my submitted artifacts, videos, and/or related documentation do not conform to the current rules, requirements and polices as specified in the edTPA Assessment Handbook, and the edTPA website, my submission or portions thereof may not be scored, my score may be voided and other actions as described in Rule 11 may be taken as deemed appropriate by Evaluation Systems, my program, and/or State Agency. If my complete submission or portions thereof cannot be scored because it does not conform to requirements, no refund of my fee will be issued, and no portion of my fee can be applied to the cost of any future edTPA registration or associated services. If my submission cannot be scored due to a system error occurring after submission, I will have the opportunity to resubmit my portfolio without paying additional fees. I understand that my submitted assessment materials, or a portion thereof, may be reviewed by authorized individuals at the program in which I am enrolled, as indicated at the time of registration, and/or the relevant State Agency responsible for educator certification in order to investigate compliance with the Rules of Assessment Participation, as needed.
8     RIGHTS AND OBLIGATIONS REGARDING edTPA: I understand and agree that liability for assessment activities, including but not limited to the adequacy or accuracy of assessment materials, of the registration processes, of scoring, of score reports, of information provided to me in connection with edTPA and the adequacy of protection of candidate information, will be limited to score correction or edTPA retake at no additional fee. I understand and agree that liability for data loss or file corruption associated with my edTPA submission will be limited to an additional edTPA submission at no additional fee. I waive any and all rights to all other claims, specifically including but not limited to claims for negligence arising out of any acts or omissions of Stanford University, Evaluation Systems, and/or the state or program which is requiring completion of the edTPA (including the agents, employees, contractors, or professional advisors of Stanford University, Evaluation Systems, or such entity).
9     PROGRAM CHANGES: I understand that the edTPA assessment and associated policies and procedures are subject to change at the sole discretion of Stanford University and Evaluation Systems. State Agencies or individual programs may make changes to their policies and requirements related to the edTPA at their discretion.
10    OBJECTION TO PROCEDURES: If, for any reason, I object to the procedures presented in these Rules of Assessment Participation, I must advise Evaluation Systems, in writing, of the basis of my objection at least six (6) weeks before I plan to register for edTPA for my objection to be taken under consideration. If my objection is not honored, I will not be registered for edTPA.
11     COMPLIANCE: I understand that if I fail to comply with the rules, requirements, and policies specified or referenced on the current edTPA website, including these Rules of Assessment Participation, or if I take any prohibited actions, my results may be voided, no refund will be issued, no portion of the assessment fee can be applied toward the cost of any future assessment fees, my registration may be canceled, I may be prohibited in the future from registering for edTPA. Legal proceedings and actions may be pursued as well as other remedies deemed appropriate by Evaluation Systems, my program or State Agency, as appropriate. In addition, I understand that assessment fraud may be grounds for denial, revocation and/or suspension of a teaching license.
RULES: I understand that should any of these rules or any other requirement or provision contained on the current edTPA.com website be declared or determined by any court to be illegal or invalid, the remaining rules, requirements, and provisions will not be affected and the illegal or invalid rule, requirement, or provision shall not be deemed a part of the current edTPA website. The headings of each of the Rules of Assessment Participation for edTPA are for convenient reference only. They are not a part of the rules themselves; they do not necessarily reflect the entire subject matter of each rule; and they are not intended to be used for the purpose of modifying, interpreting, or construing any of these Rules of Assessment Participation for edTPA. I agree that any legal action arising in connection with my registration for or participation in edTPA shall be brought in the state and federal courts governing St. Paul, Minnesota, and I consent to the personal jurisdiction of such courts.

Voiding of Scores:
If you violate one of the Rules of Assessment Participation or if doubts are raised about the validity or legitimacy of your registration or your scores, Evaluation Systems may notify the educator preparation program and/or State Agency you identified during the registration process, as applicable.
Evaluation Systems reserves the right to void your scores if in their sole opinion, or after consultation with the educator preparation program and/or State Agency you identified during the registration process (as appropriate), there is adequate reason to question your scores' validity or legitimacy, due to misconduct including, but not limited to, a violation of the rules set forth on the current edTPA website, including the Rules of Assessment Participation.
Further action may be taken, including remedies deemed appropriate by Evaluation Systems, your educator preparation program or State Agency, as applicable.
Please note that software may be employed to screen submissions for originality of content. Submissions determined to violate edTPA rules regarding the originality of the submitted material will be subject to actions described above.”


We’re always warned to read the fine print, but that doesn’t really mean we have much power to do anything about it, especially when it is a requirement of employment. Now, imagine that candidate I described above, waiting since last spring to receive an edTPA score, getting nowhere with Pearson, SCALE, or the program, and knowing only that the portfolio is in a limbo of administrative review. Maybe that’s not hypothetical. Maybe that is reality.

Saturday, October 17, 2015

Teaching Theatre: Karen Sklaire's Solo Show

Yeah. He don’t care about what we think- he just wants to shove this testing bullshit down our throats.  Yo, I’m sick of sitting all day and doing nothing.  They take away our programs and don’t ask us how we feel. This school used to be fun- why did he even bring you here if you’re gonna be like the rest of the teachers? What’s the point? 

Karen Sklaire is an idealistic teacher who leaves an acting career to teach drama in the South Bronx. She recounts her experiences in New York City schools in her one-woman show, Ripple of Hope: One Teacher’s Journey to Make an Impact. The show was a sold-out hit at the fringe theatre festivals in Washington D.C. and New York this year, and she is giving an encore performance on Thursday, October 29th at 6pm in Long Island City for the Flying Solo Festival. If you loved Nilaja Sun’s 2006 play No Child as much as I did, you will find that Sklaire’s show takes an equally heartbreaking, funny, sometimes cynical look at trying to teach young people drama even in this theatre mecca of a city. In the decade since Sun was a teacher, budget cuts, high stakes testing, and mayoral control of the schools have made the conditions Sklaire faces more daunting: disillusioned and defiant students, burned-out colleagues, and a freaky, faceless bureaucracy. Like her show, it’s a solo affair.
Starting out, Sklaire recounts briefly what brought her to teaching, including the events of 9/11, a 1966 speech by Robert F. Kennedy, and Hilary Swank’s performance in Freedom Writers. What could possibly go wrong? Everything. Her first day is a nightmare, and the disconnect between her childhood experiences in Connecticut and those of her students is painfully obvious, but there is no help or support, and even the principal tells her that if she has a problem she has to solve it herself. In desperation, when a particularly out-of-control boy is about to start a fight, she grabs her iPod and picks Michael Jackson (“who doesn’t love him?”). This provides the first breakthrough moment, a glimpse of the joy that is possible, as Le Jean gets everyone dancing and smiling while he shows off his MJ moves.
Along the way, Sklaire learns that life isn’t like the movies, where heroic teachers overcome the odds. Trapped in a job that is increasingly stressful, and an abusive relationship with her principal, Sklaire hits a wall, broken. Although she doesn’t reveal it in the play, it’s likely that Sklaire found her way back by writing, and seeing the potential for redemption and meeting her original goals by turning her stories into theatre. At its heart, Ripple of Hope is more about the possibilities afforded us through creative expression than it is about teaching per se. Sklaire, like any good teacher, wants to pass on her passion for performance to her students, see them proudly on stage in the spotlight, instead of bent over a test or worksheet, trapped in a chair and desk. 

Get tickets for Ripple of Hope here

Thursday, August 13, 2015

It’s All About the Bell Curve: Sheri Lederman’s Day in Court

I traveled up to Albany this morning to hear the oral arguments in the Lederman v. King case, presented to Acting Supreme Court Justice Roger McDonough by Bruce Lederman and by Colleen Galligan, representing the State Education Department. This is the first time in my life I have sat through a courtroom proceeding. I don’t even watch Law and Order. Let’s just say I was most definitely not in my element. But I’m a pretty good observer of human behavior, a decent note-taker, and I had personal reasons for caring deeply about the outcome of this case, above and beyond all the reasons we should all care about a case that may have far-reaching implications for the misguided reforms of Race to the Top (see full disclosure below). What I witnessed was a masterful takedown of the we-need-objectivity rhetoric that is plaguing education. So I should begin by saying that I am hopeful, because it seems someone with the power to make a difference gets it. Judge McDonough gets that it’s all about the bell curve, and the bell curve is biased and subjective.

In case you need a refresher on how test scoring works these days (and who doesn’t?), I suggest you start with the excellent fact sheets from FairTest, first on norm-referenced tests, or NRTs, and then on criterion-referenced tests, or CRTs, and tests used to measure performance against state standards. In particular, note the following important points:

“NRTs are designed to sort and rank students 'on the curve,' not to see if they met a standard or criterion. Therefore, NRTs should not be used to assess whether students have met standards. However, in some states or districts a NRT is used to measure student learning in relation to standards. Specific cut-off scores on the NRT are then chosen (usually by a committee) to separate levels of achievement on the standards. In some cases, a CRT is made using technical procedures developed for NRTs, causing the CRT to sort students in ways that are inappropriate for standards-based decisions.”

As you may notice, we’ve come a long way from getting a 91 out of 100 on a test and knowing that was an A-. Testing today is opaque and confusing by design. In New York State, we boil it down to a ranking from one to four. That’s right, there’s even jargon for “ones and twos,” which is particularly heinous when you learn that politicians have an interest in making more than 50% of students fall into those “failing” categories. Today the state released the test score results for students in grades 3-8, with so-called “proficiency” rates below 40%. By design, the public is meant to read this as miserable failure.
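The mechanics of cut scores make the point concrete. Here is a minimal sketch, with invented numbers (these are not New York's actual raw scores or cut scores), of how the placement of the cuts, not student performance, determines the "failure" rate:

```python
# Hypothetical illustration: the same raw scores yield very different
# "proficiency" rates depending on where a committee sets the cut scores.
# All numbers below are invented, not New York's actual scale.

raw_scores = [38, 42, 47, 51, 55, 58, 61, 64, 68, 73]  # ten students

def level(score, cuts):
    """Map a raw score to a performance level 1-4 given three cut scores."""
    return 1 + sum(score >= c for c in cuts)

lenient = (40, 50, 65)  # low cuts: most students reach level 3 or above
strict = (50, 62, 70)   # high cuts: most students land at levels 1-2

for name, cuts in (("lenient", lenient), ("strict", strict)):
    passing = sum(level(s, cuts) >= 3 for s in raw_scores)
    print(f"{name}: {passing}/{len(raw_scores)} proficient")
# lenient: 7/10 proficient
# strict: 3/10 proficient
```

Same students, same answers, opposite headlines, depending entirely on a committee's choice of cuts.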

The political narrative of public education failure extends next to the teachers, who must demonstrate student learning based on these faulty tests, even if they don’t teach the subjects tested, and even if they teach students who face hurdles and hardships that have a tremendous impact on their ability to do well on the tests. In Sheri’s case, her rating plunged from 14 out of 20 points to 1 out of 20 points on student growth measures. Yet her students perform exceedingly well on the exams; once you are a “four” you can’t go up to a “four plus” because you’ve hit the ceiling. In fact, one wrong answer could unreasonably mark you as a “three” and you would never know. Similarly, the teacher receives a student growth score that is also based on a comparison to other teachers. When it emerged in the hearing today that the model, also known as VAM, or value-added, pre-determined that 7% of the teachers would be rated “ineffective,” Judge McDonough caught on to the injustice that lies at the heart of the bell curve logic: where you rank in the ratings is SUBJECTIVE.
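The bell-curve logic at issue can be sketched in a few lines. This is a toy model, not New York's actual growth formula (which is far more complex), and the scores and the 7% share are illustrative: when ratings are assigned by relative rank rather than against an absolute standard, a fixed share of teachers comes out "ineffective" no matter how well their students did.

```python
# Toy rank-based rating: the bottom fraction is "ineffective" by definition.

def rate_by_rank(scores, bottom_share=0.07):
    """Label the bottom fraction of teachers 'ineffective' by rank alone."""
    k = max(1, round(len(scores) * bottom_share))
    cutoff = sorted(scores.values())[k - 1]
    labels, branded = {}, 0
    for teacher, score in scores.items():
        # Ties at the cutoff are broken arbitrarily, capped at k labels.
        if score <= cutoff and branded < k:
            labels[teacher] = "ineffective"
            branded += 1
        else:
            labels[teacher] = "effective"
    return labels

# Suppose every one of 100 teachers has students showing strong growth
# (hypothetical scores of 90-99). Seven are still branded "ineffective":
scores = {f"teacher_{i}": 90 + i % 10 for i in range(100)}
labels = rate_by_rank(scores)
print(sum(v == "ineffective" for v in labels.values()))  # → 7
```

No absolute level of student achievement can save the teachers at the bottom of the ranking; the curve guarantees losers.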

In his affidavits, Professor Aaron Pallas of Teachers College brilliantly explains the many flaws in this misuse of student test scores to evaluate and rank teachers’ effectiveness. Predetermining a set percentage of ineffective teachers regardless of their actual “effectiveness” and their students’ achievements was the first major flaw. The second is that the model is not grounded in scientific definitions of teacher quality or effectiveness, as there are many factors beyond a teacher’s control that contribute to student performance on standardized tests and other measures of their knowledge and skills. Third, the model is not transparent about what “needs to be done to achieve effective or highly effective ratings,” which is a requirement of the law. The model also violates the law’s definition of student growth as “change in student achievement for an individual student between two or more points in time.” Judge McDonough seemed to have picked up on this idea, and asked if a better model would test the student at the start and end of a given academic year. Pallas gives a far more nuanced explanation of the need for a different model of testing to measure growth over time, but suffice it to say, the model that produced Sheri’s absurd score is not measuring student growth as defined by the law. Pearson, the corporate entity behind the testing enterprise, even noted, “It is inappropriate to compare scale scores across grades as they neither measure the same content, nor are they on the same scale.” Yet that is what the growth model does.

The lame explanation from Colleen Galligan was that the model may not be perfect, but the state tries to compare each student to similar students. The goal, she offered, is to find outliers in the teaching pool who show a consistent pattern of ineffectiveness, in order either to give them additional training or to fire them. At this point Judge McDonough offered her a chance to explain the dramatic drop in Sheri’s score. “On its face it must mean students bombed the test (speaking as one who has bombed tests),” which produced laughter in the courtroom. For who hasn’t bombed at least one test in their life? Who has not experienced that dread and fear of being labeled a failure? Then Judge McDonough asked rhetorically, “Did they learn nothing?” The only answer she could come up with was that in this case Dr. Lederman’s students, although admittedly performing well compared to other students, showed less growth than 98% of students across the state. At this point it was pretty clear to everyone present that this made absolutely no sense whatsoever.
 
Sheri Lederman speaking to reporters outside the courtroom in Albany
Full disclosure:
Sheri Lederman is my high school classmate and she is a highly regarded elementary teacher in the Great Neck Public Schools, which we both attended in our childhoods. She got her doctorate at Hofstra University, where my mother is a professor emerita, and where I know many of the faculty as personal friends. They confirm the high regard I have for Sheri’s intelligence and insights into education. I think she is absolutely heroic to be pursuing a lawsuit, with the expert guidance of her lawyer husband, Bruce Lederman, against the New York State Department of Education, to expose the irrational and illegal practices of evaluating teacher performance using “arbitrary and capricious” student growth models based on flawed science. I have previously written in my blog about Sheri’s hope that her lawsuit would prove to be a “tipping point” in halting the use of these erroneous student growth models. A bit of background on the case from last October can be found here.

On June 1st, the New York State Supreme Court ruled that Sheri’s case could go forward despite the State Education Department’s claim that her lawsuit was baseless since Sheri’s overall evaluation was “effective” despite the “ineffective” label on the student growth portion, worth 20% of the total.

Today’s news was covered so far here, here and here. The local CBS station covered it here and WNYT here.




Monday, June 1, 2015

Almost like being there?

Video is a seductive technology. It’s used as clickbait on social media, to advertise on the sides of buildings in Times Square, and even to help pass the time in the back seat of a New York City taxi. In education, video has tremendous potential to instruct, to inspire, to raise awareness, and more. It is making its way into teacher education as a tool for analyzing teaching. Despite its potential, I am concerned about some trends I am noticing that I believe deserve careful scrutiny.

For example, at Relay, the website boasts that its instructors are not “sitting in ivory towers” but coaching and mentoring students who are learning to become teachers. “When they’re not right there in the classroom, they’re side by side with our students, watching and analyzing video of them…pausing, rewinding and replaying the video to give pinpoint feedback.” They even call these videos “game film,” as in, show that you’ve got game in the classroom. Video is also used to instruct, and Relay’s site explains that “our students can watch and rewatch course modules as they complete our program.” One of the students featured in a Relay video even claims, “Film doesn’t lie.”

One of my concerns lies in the false sense of objectivity ascribed to videos of classroom life. Like it or not, the camera is a presence. You can’t be unaware of it, and it comes with its own interpretive lens even sitting on a tripod in the corner of the room. It is not reality; it is a representation of reality. What’s more, classroom events are often incredibly complex and require deep contextual knowledge to understand and interpret fully. I know from my own research in classrooms that when revisiting classroom events with participants using video, there is a lot of unpacking to do about the teacher’s intentions and beliefs, the students’ understanding, and the shifts and gaps between what is captured on film and what participants remember afterwards.

Another concern is that video exacerbates the temptation, in observing teaching, to satisfy a checklist of items you are looking for. We have been there, done that, and the behavioral checklist doesn’t work. It’s a bit like getting on a sightseeing bus, driving around a city, and saying you saw this and that. You caught a glimpse, grabbed a bad photo or two, but what did you really see? Not much. Using video to evaluate teaching is also problematic because in all likelihood only a short clip will be analyzed, a tiny sliver of what classroom life is really like, and the evaluator will probably watch it only once. It’s as if, instead of going to the Metropolitan Museum of Art, lingering over favorite paintings, talking with a friend about what you notice, like, appreciate, and wonder about, and reading the contextual information provided by the curator, you watch a quick slideshow online where each image lasts for 15 seconds and you get the name of the artist and the title of the work at the bottom of the screen. It’s not likely that you will have a memorable and long-lasting experience.


What much of the video use in teacher education is intended to replace is the bothersome and expensive business of actually being in the classroom. It is an acknowledged problem that full-time faculty don’t generally supervise student teachers, and that the work is more often than not relegated to adjuncts and, in large universities, to doctoral students. Principals are also hard pressed to find enough time to evaluate all the teachers in their schools, and rely on help from assistant principals and instructional coaches. Now that companies like EdThena are developing software to make it easy and intuitive to provide feedback on teachers’ videos, we are likely to see more and more remote evaluation. No one will remember the value of being in the room, because teaching won’t be seen as relational work, but as a series of techniques to be micro-managed by data analysis and video software.

How is this creeping up on us? In preservice education, we are seeing how Pearson’s scoring of edTPA portfolios is micromanaged by very specific rubrics looking for particular instructional moves in video clips totaling approximately 15 minutes. This leads to some very problematic oversimplification, as in this PowerPoint slide widely used to explain the rubric progression of edTPA scoring from one to five:


Why, for example, is a preservice teacher rewarded for focusing on individuals or flexible groups rather than on the whole class? This is a false dichotomy. There are plenty of classroom moments that call for the teacher to focus on the whole class. The danger of delineating “best practices” in this sense is that certain approaches and teaching moves become de facto no-nos. The truth is there are times when it is appropriate to let students explore and do inquiry, and others when students require explicit step-by-step instructions from the teacher. In the new teacher education accreditation standards from CAEP we see that clinical supervision is using “technology-based applications” and “technology-enhanced learning opportunities,” which are likely stand-ins for video analysis of teaching. The external evaluations of teachers called for in Governor Cuomo’s budget will likely be done by video (see p. 18 here, which says observations may be live or recorded video) and will claim to have teacher and union support. For example, Public Agenda’s initiative Everyone at the Table (with funding from the Gates Foundation) seeks to involve teachers in evaluation reform. Teachers will be persuaded to buy in to the idea of external evaluation by video because there is some truth to the problem that principals and peers are biased and can have favorites, and video evaluation is seen as more objective. But precisely because it offers less context and comes with narrower parameters (that checklist rears its ugly head again), it is more problematic.


Although a recent piece by NPR on professions likely to be automated in the coming decades said college professors had only a 3.2% chance of that happening, there is an increasing possibility that a bleak future for unemployed former teacher educators will entail scraping together a measly income from scoring edTPA portfolios, doing supervision and teacher evaluation by video analysis, and putting together data analysis reports from software made by EdThena or similar companies.