Teaching Algebra II: Technology Integration

I observed an Algebra 2 class at Hacienda (pseudonym), a Northern California high school, on September 9, 2016. The high school has over 1900 students, mostly minority (Asian and Latino). About 20 percent of the students are eligible for free and reduced-price lunch, a measure of poverty used in U.S. public schools. Over 98 percent graduate, and a very high percentage of those graduates enter college. About one-third of students take Advanced Placement exams, with well over 80 percent qualifying for college credit. Fewer than 10 percent of students are English Language Learners, and just over that percentage have been identified with disabilities. This is a high school that prides itself on academic and sports achievements and is recognized in the region, state, and nation as first-rate.

Beverly Young (pseudonym) is a veteran teacher of 22 years at Hacienda. A slim woman of average height, wearing black slacks and a white blouse with a beige sweater, she has been department head and is very involved in coordinating the math curriculum at the school. Since 2008, she has embraced different technologies for the efficiency they bring to preparing quizzes and tests and for their help in connecting to students. She has been using an iPad with educational apps, particularly Doceri, for her math lessons since the tablet appeared.

The 50-minute lesson on Friday morning went swiftly by as the fast-paced, organized teacher taught about factoring quadratic equations. Announcements about an upcoming quiz ("9/14–9/15, Quiz 4.1 to 4.2") and an upcoming test ("9/21–9/22, Test on 4.1 to 4.4") are posted on the bulletin board next to the whiteboard. The numbers refer to textbook sections.

There are 26 students in the room sitting at five rows of three desks next to one another, all facing the whiteboard. Young, carrying her iPad with her as she walks around, uses a remote to post slides and videos on the whiteboard during the lesson.

For the first five minutes, Young shows a video about the Rio Paralympics. As students watch the brief video, Young, holding her iPad, walks around recording who is present and stamping homework that students have laid out on their desks. I look around the class; students are watching intently as athletes with disabilities perform extraordinary feats.

Two minutes later, school announcements appear as a video on the whiteboard. Hacienda students prepare the daily announcements. A student anchors them, showing clips prepared by other students for different daily and weekly school activities (e.g., an upcoming mini-bike racing event in the Quad). In most schools where I observe classes, announcements come over the public address system, and students generally ignore them as they drone on. Here I look around and see that all but a few of the students watch each announcement.

After announcements end, Young turns to the lesson for the day. The slide on the whiteboard states the objective for the day: "Factoring and Solving x²+bx+c=0." She asks if there are any questions on the homework. No hands go up. Young then passes out the handout for the day and directs students to go to Google Classroom on their devices (I see that students sitting near me have a mix of different laptops and tablets). She then asks students to go to Socrative, a software program, and gives instructions on how to log in. She walks up and down the aisles to see what is on students' screens. After all students have logged in, she clicks on a short video that explains factoring quadratic equations using an example involving jellyfish.

Young explains the key terms and the different variables described in the video and then applies them to factoring. She gives examples of binomials and asks questions as she goes along. She encourages students to talk to one another if they are stuck. She walks up and down the aisles, iPad in hand, as students answer. She then reviews binomials and moves to trinomials. "Now, look at polynomials." One student asks for clarification of terms. Young clarifies and asks: "You guys understand?" A few heads nod.

(For readers who wish to delve into the details of this lesson's content, the teacher has made a five-minute YouTube video for students that explains it.)

Young moves to the next set of slides about "x-intercepts" and examples of "distribution." She then asks: why do we do factoring? A few students answer. Young explains the key points and the differences between factoring and solving an equation. She asks students more questions, encouraging them to talk to one another to figure out answers.
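
For readers following the mathematics, the connection Young is drawing can be sketched with a standard textbook example (mine, not necessarily one she used in class): factoring x² + bx + c means finding two numbers that multiply to c and add to b, and solving then applies the zero-product property.

```latex
% Factoring: find two numbers that multiply to c (= 6) and add to b (= 5)
x^2 + 5x + 6 = (x + 2)(x + 3)
% Solving: set each factor equal to zero (zero-product property)
(x + 2)(x + 3) = 0 \;\Rightarrow\; x = -2 \ \text{or} \ x = -3
```

Factoring rewrites the expression; solving finds the x-intercepts, here −2 and −3, where the parabola crosses the x-axis. That is the distinction between factoring and solving that Young emphasizes.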

The teacher segues back to a Socrative slide and to a question that she wants students to answer.

Young encourages students to help one another—as she circulates in the room. “If you don’t remember, write it down. It’s OK.” She checks her tablet to see what each student is doing and says aloud—“I see two guys who got it right—I am waiting for 15 of you guys to finish—talk to one another.” A few minutes later, looking at her tablet, she says—“most of you got it. I will give you another minute—I am waiting on eight more here.”

She talks to individual students answering questions and complimenting students as she traverses the aisles.

“Looks like most of you have the idea,” she says.

I scan the class; all students have their eyes on their screens, clicking away or whispering to a neighbor what appears to be an answer to the teacher's question.

"Now you guys work on the second question." She chats easily with students ("do you have an answer here?"), all the while checking the iPad she carries around.

She then directs class to go to next question. “Do it and give me an answer for this—it’s a little tricky. You are more than welcome to ask one another.”

One student asks a question, and the teacher uses it to correct a misconception about solving a quadratic equation. Young answers the student and refers back to the jellyfish video.

Scanning the class, I see that all students look engaged. "If you guys have an answer like this," she says, pointing to what she wrote on the whiteboard, "then you got it wrong. Here's a little hint—[could not catch what teacher says]. I'll give you another 50 seconds—I just want to see what you guys remember."

Again checking her iPad, she can see each student's work and can help students in real time as she cruises through the classroom.

"Now let's go to fun stuff." After she posts a slide from her iPad on the whiteboard on how to factor trinomials, Young explains each problem.

Young sees that some students are confused, so she starts over. She continues to work on the numbered problems appearing on the slide, explaining what she is doing at each step. Then she asks students to factor particular parts of equations. She checks her iPad and says: "I hear guys having an answer already—that's great!"

"When is a 9 equal to zero or a plus nine equal to zero—now can you answer no. 8?" Students talk to one another as I scan the room. Young circulates and listens to different students, explaining further if they are stuck.

She asks: "Are we ready?" She walks students through how she solves a problem on the whiteboard using the iPad. She then asks whether students know the difference between factoring and solving. One student says yes. She then asks students to jot down their answers to the central question of the lesson and walks around, talking with students as they click away.

The teacher ends the lesson a few minutes before the bell rings and talks with different students, answering their questions. Other students begin packing up their things to await the end of the class. The bell rings.

Filed under how teachers teach, technology use

Cartoons on the Pluses and Minuses of Technology

Living in Silicon Valley as I do, I see everyday technologies all around me: people walking with eyes down, looking at their smartphones; electric autos; driverless cars. So much is taken for granted. The cornucopia of technology runneth over. Yet it is never too late to poke at the pomposity and "irrational exuberance" that accompany such love for the next new thing. So this month's cartoons do just that. Enjoy!

[A set of cartoons followed, including New Yorker cartoons on the meaning of life, workplace robots, kids texting, computer repair, and other technology concepts.]

Filed under technology use

Teaching Advanced Placement Composition: Technology Integration

Room 409 at Los Altos High School in the south San Francisco Bay area is one of the most spacious rooms for an academic subject, nearly the size of two regular classrooms, that I have seen in the many schools I have visited over the years. I marveled at its carpeting, the recliner chairs near the teacher's desk, and the horseshoe arrangement of three- and four-desk clusters facing a table in the center of the room where Michael Moul, a 12-year veteran teacher, presides over his AP class.

Well over six feet tall, the stocky and goateed Moul is wearing a blue shirt and dark slacks. He looks out on the 32 students in the room. He is also faculty adviser for the Talon, the school newspaper. Twelve desktop computers that Talon staff use sit on the ledge below the tall windows at the rear and side of the room.

[Photo: Moul's classroom]

Los Altos High School is a Bring-Your-Own-Device school.* The high school district adopted BYOD two years ago for its three schools after teacher- and administrator-initiated pilot projects established that well over half of the students had laptops or tablets they could use for their classes and that enough teachers were sufficiently skilled to integrate the hardware and software into their daily lessons. Students who lack a device, forget theirs, or have one die suddenly in school can easily get a device elsewhere in the school. Teachers decide how to weave technologies into their lessons; there is no district-prescribed one best way for teachers to follow.

The lesson I observed on September 6, 2016 is the final part of a four-and-a-half-week unit on the Narrative Essay that began with the first day of school on August 15th. In this unit, Moul spent the first two weeks of the semester building community in the class, setting norms for small-group work, and reading excerpts from Machiavelli, George Orwell, James Baldwin, and others. Students then analyzed the structures of the essays they read. Moul also uses Socratic Seminars during the unit to have students discuss various writers' essays and reflect on their own writing before beginning their assigned narrative essay (see here).

Moul’s 50-minute lesson (the class meets four times a week on a modified block schedule) begins a moment after the tardy chime sounds. There are 32 students in the class sitting at clusters of three and four desks facing the front white-board. Today’s lesson is divided into four parts.

1. Since there was a national holiday on Monday, Moul asks the students to close the lids of their devices and then begins with a question: what "good news" do they want to share with the class? For a few minutes he listens to what students call out about their long weekend: "it is a four-day week," one says, for example. Then he reviews the assignment of writing two drafts about a story they read and how this AP class differs from the Honors English class in the number of drafts they will do. More drafts and more revising, he says, are crucial to writing essays. On Friday, the class had looked at the first draft of a student-written "model" essay entitled "The Vulture" (see here).

2. He segues to the next part of the lesson, telling students to read the second draft of the student's essay, make comments, and then re-read the first draft and note what changes they see between the two.

Students open lids of their tablets and laptops and proceed to read and type in comments for the second draft. From my perch in the back of the class sitting at a student desk, I see that every student appears to be on task. Moul walks up and down aisles between clusters of desks pausing to see what students are jotting down on their screens and stopping to answer student questions.

After about 10 minutes, he asks students to re-read the first draft—"I'll give you 7-8 minutes"—and to put in their notes the differences they see between the two drafts.

3. Watching the wall clock, Moul asks students to stop and form their groups. Here is where the clusters of three and four desks set closely together become a venue for small-group discussion. Moul reminds students to turn their desks to face one another, since eye contact with group members matters more than keeping one's eyes glued to a screen.

In this small group activity, students discuss what they saw as differences between the two drafts of “The Vulture.” I scan the groups and note that all are engaged in talking to one another. I see no student off-task. Moul continues to walk around and listen in to different groups’ exchanges. “In a few moments,” he says, “we will start chit-chatting.”

After a one-minute warning, the teacher ends this activity and asks students to turn around their desks to face front where he is sitting.

4. The final activity is a whole-group discussion of the differences between the two drafts and what students saw as improvements in the second draft. About one-fourth of the students raise their hands to respond to the teacher's request for thoughts in this 12-minute activity. After he calls on a few students and they speak, Moul, sitting at a small desk in the center of the classroom horseshoe, says, "let me call on people on this side now." After students comment, the teacher offers his opinion of the second draft, saying, for example, "I didn't see much in the conclusion; there needs to be a balance between narrative and exposition." When one student comments on the use of dialogue within a narrative, Moul points out how dialogue helps the flow of the essay.

I scan the class and see that most students turn to listen to one another during the whole group discussion.

The chime sounds to end the period. Moul says "wait," and students sit as he reminds the class that their draft is to be turned in Thursday, two days hence (the school is on a modified block schedule). The teacher releases students and says: "have a great couple of days."

_______________________________________________

* The high school has over 1900 students (2015) and its demography is mostly minority (in percentages: Latino 28, Asian 21, African American 2, multiracial 2, and white 45). The percentage of students eligible for free and reduced-price lunches (the poverty indicator) is 22 percent. Fourteen percent of students have been identified with learning disabilities, and just over four percent are English language learners.

Academically, 99 percent of the students graduate high school and nearly all enter higher education. The school offers 20 AP courses; 37 percent of the student body take at least one AP course, and of those students taking AP tests, 83 percent have gotten a 3 or higher, the benchmark for getting college credit. LAHS has been rated repeatedly as one of the top high schools (52nd out of over 1330 in the state and 339th among the nation's 26,000 high schools). The gap in achievement between minority and white students remains large, however, and has not shrunk in recent years. The per-pupil expenditure at the high school is just under $15,000 (2014). See here, here, here, here, and here.

 

 

Filed under how teachers teach, technology use

Did That Edtech Tool Really Cause That Growth? (Mary Jo Madda)

The quality of research on technology use in schools and classrooms leaves much to be desired. Yet academics and vendors crank out studies monthly, and those studies are often cited to justify using particular programs. How practitioners can make sense of research studies is an abiding issue. This post offers readers some cautionary words about looking carefully at findings drawn from studies of software used in schools.

“Mary Jo Madda (@MJMadda) is Senior Editor at EdSurge, as well as a former STEM middle school teacher and administrator. In 2016, Mary Jo was named to the Forbes ’30 Under 30′ list in education.” This post appeared in EdSurge, August 10, 2016.

How do you know whether an edtech product is effective in delivering its intended outcomes? As the number of edtech products has ballooned in the past five years, educators—and parents—seek information to help them make the best decision. Companies, unsurprisingly, are happy to help “prove” their effectiveness by publishing their own studies, sometimes in partnership with third-party research groups, to validate the impact of a product or service.

But oftentimes, that research draws incorrect conclusions or is “complicated and messy,” as Alpha Public Schools’ Personalized Learning Manager Jin-Soo Huh describes it. With a new school year starting, and many kids about to try new tools for the first time, now is a timely moment for educators to look carefully at studies, scrutinizing marketing language and questioning the data for accuracy and causation vs. correlation. “[Educators] need to look beyond the flash of marketing language and bold claims, and dig into the methodology,” Huh says. But it’s also up to companies and startups to question their own commissioned research.

To help educators and companies alike become the best critics, here are a few pieces of advice from administrators and researchers to consider when reviewing efficacy studies—and deciding whether or not the products are worth your time or attention.

For Educators

#1: Look for the “caveat statements,” because they might discredit the study.

According to Erin Mote, co-founder of Brooklyn Lab Charter School in New York City, one thing she and her team look for in studies are “caveat statements,” where the study essentially admits that it cannot fully draw a link between the product and an outcome.

“[There are] company studies that can’t draw a definitive causal link between their product and gains. The headline is positive, but when you dig down, buried in three paragraphs are statements like this,” she tells EdSurge, pointing to a Digital Learning Now study about math program Teach to One (TtO):

The report concludes: "The TtO students generally started the 2012-13 academic year with mathematics skills that lagged behind national norms. Researchers found that the average growth of TtO students surpassed the growth achieved by students nationally. Although these findings cannot be attributed to the program without the use of an experimental design, the results appear encouraging. Achievement gains of TtO students, on average, were strong."

Mote also describes her frustration with companies that call out research studies as a marketing tactic, such as mentioning both studies and the product within a brief, 140-character Tweet or Facebook post—even though the study is not about the product itself, as in the Zearn Tweet below. "I think there is danger in linking studies to products which don't even talk about the efficacy of that product," Mote says, pointing out that companies that do this effectively co-opt research that is unrelated to their products.

"Research from @RANDCorporation shows summer learning is key. Use Zearn this summer to strengthen math skills."

#2: Be wary of studies that report “huge growth” without running a proper experiment or revealing complexities in the data.

According to Francisco of Digital Promise, something consumers should look for is "whether or not the study is rigorous," specifically by asking questions like the following four:

  • Is the sample size large enough?
  • Is the sample size spread across multiple contexts?
  • Are the control groups mismatched?
  • Is this study even actually relevant to my school, grade, or subject area?
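
The first of those questions, whether the sample is large enough, has a standard answer from power analysis. As a rough sketch (my illustration, using the common normal-approximation formula for comparing two group means; the post itself names no formula), here is the number of students needed per group to reliably detect an effect of a given size:

```python
from math import ceil
from statistics import NormalDist  # standard-library inverse normal CDF

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sample comparison
    of means (normal approximation, two-sided test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value, ~1.96 for alpha=0.05
    z_beta = z.inv_cdf(power)           # ~0.84 for 80 percent power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A "small" effect (d = 0.2) needs roughly 400 students per group,
# while a "medium" effect (d = 0.5) needs far fewer.
print(n_per_group(0.2))  # 393
print(n_per_group(0.5))  # 63
```

A study claiming large gains from a few dozen students per condition, then, deserves exactly the skepticism these administrators recommend.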

Additionally, what if a company claims massive growth as indicated by a study, but the data in the report doesn’t support those claims?

Back in the early 2000s, John Pane and his team at the RAND Corporation set out to demonstrate the effectiveness of Carnegie Cognitive Tutor Algebra. Justin Reich, an edtech researcher at Harvard University, wrote at length about the study, conceding that the team “did a lovely job with the study.”

However, Reich pointed out that users should be wary of claims made by Carnegie Learning marketers that the product “doubles math learning in one year” when, as Reich describes, “middle school students using Cognitive Tutor performed no better than students in a regular algebra class.” He continues:

“In a two-year study of high school students, one year Cognitive Tutor students performed the same as students in a regular algebra class, and in another year they scored better. In the year that students in the Cognitive Tutor class scored better, the gains were equivalent to moving an Algebra I student from the 50th to the 58th percentile.”
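
To put that gain in context, a back-of-the-envelope conversion (my arithmetic, not Reich's): if scores are roughly normally distributed, moving an average student from the 50th to the 58th percentile corresponds to an effect size of about 0.2 standard deviations, since

```latex
d \;\approx\; \Phi^{-1}(0.58) - \Phi^{-1}(0.50) \;=\; \Phi^{-1}(0.58) \;\approx\; 0.20
```

where \Phi is the standard normal cumulative distribution function. By Cohen's conventional benchmarks that is a small effect, which is why the "doubles math learning" framing warrants caution.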

Here's another example: in a third-party study released by the writing and grammar platform NoRedInk involving students at Shadow Ridge Middle School in Thornton, CO, the company claims that every student who used NoRedInk grew at least 3.9 language RIT (student growth) points on the widely used MAP exam (or, by equivalence, at least one grade level), as demonstrated in a graph (shown below) on the company's website. But upon further investigation, there are a few issues with the bar graph, says Alpha administrator Jin-Soo Huh.

[Chart: NoRedInk's Shadow Ridge RIT-growth graph]

While the graph shows that roughly 3.9 RIT points equate to one grade level of growth, there's more to the story, Huh says. That number is the growth expected for an average student at that grade level, but in reality it varies from student to student: "One student may need to grow by 10 RIT points to achieve one year of typical growth, while another student may just need one point," Huh says. The conclusion: these NoRedInk student users who grew 3.9 points "may or may not have hit their yearly growth expectation."

Additionally, one will find another “caveat” statement on Page 4 of the report, which reads: “Although answering more questions is generally positively correlated with MAP improvement, in this sample, there was not a statistically significant correlation with the total number of questions answered.”

According to Jean Fleming, NWEA's VP of Communications, "NWEA does not vet product efficacy studies and cannot offer insight into the methodologies used on studies run outside our organization when it comes to MAP testing." Hence, all the more reason for users to be aware of potential snags.

For Companies

#1: Consider getting your “study” or “research” reviewed.

No one is perfect, but according to Alpha administrator Jin-Soo Huh, “Edtech companies have a responsibility when putting out studies to understand data clearly and present it accurately.”

To help, Digital Promise launched on Aug. 9 an effort to help evaluate whether or not a research study meets its standard of quality. (Here are a few studies that the nonprofit says pass muster, listed on DP's "Research Map.") Digital Promise and researchers from Columbia Teachers College welcome research submissions between now and September from edtech companies in three categories:

  • Learning Sciences: How developers use scientific research to justify why a product might work
  • User Research: Rapid turnaround-type studies, where developers collect and use information (both quantitative and qualitative) about how people are interacting with their product
  • Evaluation Research or Efficacy Studies: How developers determine whether a product has a direct impact on learning outcomes

#2: Continue conducting or orchestrating research experiments.

Jennifer Carolan, a teacher-turned-venture capitalist, says both of her roles have required her to be skeptical about product efficacy studies. But Carolan is also the first to admit that efficacy measurement is hard, and needs to continue happening:

“As a former teacher and educational researcher, I can vouch for how difficult it can be to isolate variables in an educational setting. That doesn’t mean we should stop trying, but we need to bear in mind that learning is incredibly complex.”

When asked about the state of edtech research, Francisco responds that it’s progressing, but there’s work to be done. “We still have a long way to go in terms of being able to understand product impact in a lot of different settings,” she writes. However, she agrees with Carolan, and adds that the possibility of making research mishaps shouldn’t inhibit companies from conducting or commissioning research studies.

“There’s a lot of interest across the community in conducting better studies of products to see how they impact learning in different contexts,” Francisco says.

Disclosure: Reach Capital is an investor in EdSurge.

 

Filed under Uncategorized

Determining Success of Technology Integration in Classrooms, Schools, and Districts (Part 4)

 

I ended my last post by writing that attaining the top stage of popular models of technology integration was often equated with “success.” I stated that it was “unfortunate.”

Why?

The top stage in each model (and similar ones) implies that when the teacher has reached this apex of implementation, students are thoroughly engaged in learning tasks and the classroom has become a site of active student learning—the unspoken goal of process-driven cheerleaders of student-centered classrooms. In effect, those teachers who have reached the top rung of the ladder have fully implemented technology to produce the highest levels of student involvement in learning content and skills. Implicitly, that top rung becomes the gold standard of effective teaching in integrating technologies into classroom lessons. And that is unfortunate.

What many smart people ignore or forget is that describing exemplars of technology integration is not synonymous with student-centered teaching. And student-centered teaching is not the same as “success” in student learning. This bias toward one form of teaching leading to student “success”–however defined–is historic (see here).

After all, should K-12 teacher practices change when they reach the apex of the models for integrating technology into their lessons? Certainly, the technologies themselves do not require such a fundamental change from teacher-centered to student-centered instruction. Evidence of technology use in Europe, Asia, and the Americas (see JECR PDF) has pointed out how powerful devices often end up being used to support teacher-centered instruction.

What’s missing from the assumption that student-centered learning is the same as “successful” technology integration is that reaching the final stage in these models says little about whether students have actually learned anything from the content- and skill-driven classroom lessons they have experienced. Advocates of these technology integration models assume that engagement—the process of hooking children and youth into learning—will move teachers to become student-centered and that shift in practice will yield gains in academic achievement.  Maybe.

I say "maybe" because there is a prior crucial step that needs elaboration and documentation before anyone can determine what students have learned. Although existing models of technology integration assume that engagement and student-centered classroom practices will produce gains in academic achievement, the book I am now researching will not test this underlying assumption. In my research thus far, I focus on whether exemplary teachers, schools, and districts that integrate technologies into daily practices have altered what occurs daily in classrooms.

Why focus on changes in classroom teaching and not student outcomes? My answer goes back to the central issue of putting new technologies into daily practice. The all-important implementation question, too often overlooked, ignored, or forgotten by champions of new technologies, remains: have teachers altered their classroom practices as a consequence of using new technologies? Without such changes in teaching practices, student learning and outcomes can hardly be expected to improve. That statement is a fundamental belief in establishing and operating any formal school, past and present. Thus, without changes in daily classroom practice, any gains in student academic achievement could not be attributed to what happens in classrooms. Improved measures of student achievement might then be the result of changes in student demography, school leadership, shifts in organizational culture, or other factors, not what teachers were doing every day with students. In my research, then, I am concentrating on determining to what degree teachers have altered how they teach as a consequence of integrating new technologies into their lessons.

Far too little research has been done to answer this question about changes in teaching practices. So in researching and writing this book, I, too, focus on the process of classroom change and not yet on how much and to what degree students have learned from these lessons. Once changes in classroom practices are documented, then, and only then, can one begin to research how much and to what degree students have learned content and skills. As you have probably guessed by now, that would be another book, not the one I will be writing.

 

Filed under how teachers teach, school reform policies, technology use

Stages of Technology Integration in Classrooms (Part 3)

Technology integration is not a binary choice: you either do it or you don't. Anyone who has taught, observed classrooms, and thought about what it means to include electronic devices and software in daily lessons knows that technology integration, like raising a child, learning to drive, or cultivating a garden, is a process, not an either/or outcome. One goes through various stages in learning how to raise a child, drive a car, or grow a garden. In each instance, a "good" child, skilled driving, a fruitful garden is the desired but not guaranteed outcome.

A host of researchers and enthusiasts have written extensively about the different phases teachers, schools, and districts go through in integrating technology into their daily operations. The literature seldom mentions that such movement through increasingly complicated stages is really the process of putting a new idea or practice into action. The labels for the levels of classroom practice vary: novice to expert, traditional to innovative, entry-level to transformational.

Writers and professional associations have described how individuals and organizations stumble or glide from one phase to another before smoothly using electronic devices to reach larger ends. And it is the ends (e.g., content, skills, attitudes) that have to be kept in sight by those who want teachers to arrive at the top (or last) stage. Buried in that final implementation stage, often obscured but still there, is a view of "good" technology integration and, implicitly, "good" teaching. Figuring out those ends and the values concealed within them is difficult but revealing of the biases that model-builders and users hold.

As with arriving at a definition (see last post), I have examined many such conceptual frameworks that lay out a series of steps going from a beginner to an expert (across frameworks the names for each step vary). Most often mentioned are the Apple Classroom of Tomorrow (ACOT) and the SAMR models. Many implementation frameworks in use are variations of these two.

The ACOT model.

The earliest stage model came from the demonstration project Apple launched in the mid-1980s, when the company placed a desktop computer for each student and teacher in five elementary and secondary classrooms across the country, the earliest 1:1 classrooms. Moreover, each classroom had a printer, laser disc player, videotape player, modem, CD-ROM drives, and software packages. The project grew over the years to 32 teachers in ACOT schools in four states. [i]

ACOT was one of the longest-running initiatives ever undertaken in creating technology-rich classrooms, lasting nearly a decade. From observations of and interviews with teachers and students, researchers drew a host of findings, one of which traced the process teachers went through in integrating technology into daily lessons.


That five-stage process took ACOT teachers from Entry, where they coped with classroom discipline problems, managed software, technical breakdowns, and the physical re-arrangement of rooms, to Adoption, where these beginners’ issues were resolved. The next stage, Adaptation, occurred when teachers figured out ways to use the devices and software to their advantage in teaching—finding new ways of monitoring student work, grading tests, creating new materials, and tailoring content and skills to individual students. At this stage, teachers had fully integrated the technology into traditional classroom practice.

The Appropriation phase comes next, when teachers’ attitudes toward using technology in the classroom have shifted. At this point, the teacher uses the technology seamlessly in lessons; new classroom habits and new ways of thinking about devices and software emerge. The authors of Teaching with Technology say: “Appropriation is the turning point for teachers…. It leads to the next stage, invention, where new teaching approaches promote the basics yet open the possibility of a new set of student competencies.” [ii]

In the Invention stage, teachers try out new ways of teaching (e.g., project-based learning, team teaching, individualized lessons) and new ways of connecting to students and other teachers (e.g., students providing technical assistance to other students and teachers, collaboration among students). As the authors summed up: “Reaching the invention stage … was a slow and arduous process for most teachers.” In short, ACOT researchers believed that at this stage of implementation, teachers would replace their traditional teacher-centered practices. The majority of teachers, however, never made it this far. [iii]

The SAMR model.

Developed by Ruben Puentedura, SAMR stands for Substitution, Augmentation, Modification, and Redefinition. The four rungs of this implementation ladder begin at the lowest, Substitution, where an electronic device (e.g., interactive whiteboard) replaces an existing tool (e.g., overhead projector) with no change in pedagogy or lesson content. On the next rung, Augmentation, new technology enhances the lesson (e.g., students study the concept of the speed of light by using a computer simulation). On the third rung, Modification, the teacher modifies the lesson in a way that “allows for significant task redesign” (e.g., students show their understanding of content in class by recording audio and then saving it as a sound file). At the top of the ladder, Redefinition, the technology “allows for the creation of new tasks previously inconceivable.” Examples here would be students creating a movie or podcast and putting it on the Internet to get comments, or students writing posts for a class blog about the history of the Great Depression. At this final stage of technology integration, student engagement is highest; the model assumes that high engagement leads to gains in academic achievement and thus implicitly promises improved student achievement.


More popular with practitioners, and with consultants marketing professional development in the U.S. and abroad, than among researchers, this implementation model is context-free, hierarchical, and unanchored in the research literature on integrating technology. Although some researchers have criticized it extensively, it remains popular among teachers and technology coordinators. [iv]

Both ACOT and SAMR involve what teachers know of subject-matter content, insights into their own teaching, and what they know about using technology. This interplay between content, pedagogy, and technology has led to another popular model among technology coordinators, practitioners, and researchers in the field.

Not a stage model of implementation, TPACK—the clumsy acronym for Technological Pedagogical Content Knowledge—maps the domains of content knowledge, pedagogical knowledge, and technological knowledge as overlapping circles in a Venn diagram. TPACK slides easily into SAMR, adding to what teachers are expected to know and do in moving from one stage to another. Like the other models, TPACK has also come in for extensive criticism. [v]


These models—and there are others as well—seek to move teachers’ use of technology in daily lessons from the primitive to the sophisticated: from exchanging pencil and paper for word processing, through redesigning classroom activities with available software, to engaging students in learning. The top stages of these implementation models reject traditional modes of teaching and implicitly lean toward a preferred manner of instruction—student-centered. [vi]

Too often, however, the top rung of the ladder, where integrated technology creates active learning tasks for students, becomes a proxy for success. “Invention” in the ACOT model and “Redefinition” in SAMR become surrogates for judging teachers’ success not only in effectively integrating technology but also in improving student outcomes. And that is unfortunate.

The next and final post explains why I say “unfortunate.”

_________________________________

[i] Judith Sandholtz, Cathy Ringstaff, and David Dwyer, Teaching with Technology: Creating Student-Centered Classrooms (New York: Teachers College Press, 1997). See p. 187 for number of ACOT teachers, schools, and states.

[ii] Ibid., p. 43.

[iii] Ibid., p. 47.

[iv] For a description of SAMR, see Ruben Puentedura’s presentation at: http://www.hippasus.com/rrpweblog/archives/2014/06/29/LearningTechnologySAMRModel.pdf

For a short video on SAMR, see: https://www.youtube.com/watch?v=OBce25r8vto

Critics include Erica Hamilton et al., “The Substitution Augmentation Modification Redefinition (SAMR) Model: A Critical Review and Suggestions for its Use,” Tech Trends, 2016, 60(5), pp. 433-441; Jonas Linderoth, “Open Letter to Dr. Ruben Puentedura,” October 17, 2013, at

http://spelvetenskap.blogspot.com/2013/10/open-letter-to-dr-ruben-puentedura.html

In a Google search on September 4, 2016, “SAMR model” returned 245,000 hits; “ACOT model” returned just over 63,000.
[v] Punya Mishra and Matthew Koehler, “Technological Pedagogical Content Knowledge: A Framework for Teacher Knowledge,” Teachers College Record, 2006, 108(6), pp. 1017-1054. For criticism of TPACK, see Leanna Archambault and Joshua Barnett, “Revisiting Technological Pedagogical Content Knowledge: Exploring the TPACK Framework,” Computers & Education, 2010, 55, pp. 1656-1662; Scott Bulfin et al., “Stepping Back from TPACK,” Learning with New Media, March 19, 2013 at: http://newmediaresearch.educ.monash.edu.au/lnm/stepping-back-from-tpack/

A Google search for “TPACK model” on September 4, 2016 produced just under 90,000 hits.

[vi] The summary of ACOT research and practice is in: Judith Sandholtz, Cathy Ringstaff, and David Dwyer, Teaching with Technology: Creating Student-Centered Classrooms (New York: Teachers College Press, 1997). The sub-title captures the intent of the model. The SAMR model highlights increasing student engagement at each rung of the ladder. Among advocates of student-centered classrooms, engagement is a synonym for “active learning,” a principle undergirding student-centeredness in teaching. Although the model increases active student involvement at each stage beyond Substitution, Ruben Puentedura has not directly stated a preference for student-centered instruction as a goal; I have found no such statements. Those curriculum specialists, teachers, technology coordinators, and independent consultants who have picked up and run with SAMR, however, have indeed seen the model as a strategy for teachers to alter their classroom practices—with qualifications and amendments—and embrace student-centered instruction.

See, for example, Cathy Grochowski, “Interactive Technology: What’s SAMR Got To Do With It?” June 1, 2016 at: http://edblog.smarttech.com/2016/06/11471/

Jennifer Roberts, “Turning SAMR into TECH: What Models Are Good For,” November 30, 2013 at: http://www.litandtech.com/2013/11/turning-samr-into-tech-what-models-are.html

Kathy Schrock, “SAMR and Bloom’s,” (no date) at: http://www.schrockguide.net/samr.html


Defining Technology Integration (Part 2)

Current definitions of technology integration are a conceptual swamp. Some definitions focus on the technology itself and student access to the devices and software. Some concentrate on the technologies as tools that help teachers and students reach curricular and instructional goals. Some mix a definition with what constitutes success or effective use of devices and software. Some include the various stages of technology integration, from simple to complex. And some build into their definitions a one-best-way of integrating technology to advance an instructional method such as student-centered learning. Thus, the conceptual swamp sucks unknowing enthusiasts and fervent true believers into endless arguing over exactly what technology integration is. [i]

To avoid that swamp, and the semantic arguments that come with it, I relied upon informal definitions frequently used by practitioners in identifying teachers and schools where devices had been integrated into daily practice to a high degree.

From what practitioners identified as “best cases” of technology integration, I learned that varied indicators came into play when I asked for exemplars. These indicators helped create a grounded definition of technology integration for identifying districts, schools, and teachers:

* District had provided wide access to devices and established an infrastructure for use. System administrators and a cadre of teachers had fought insistently for student access to hardware (e.g., tablets, laptops, interactive whiteboards) and software (e.g., the latest programs in language arts, math, history, and science), whether through 1:1 programs for entire schools, mobile carts, or similar arrangements.

* District had established structures for how schools could improve learning and reach desired outcomes through technology. District administrators and groups of teachers had set up formal ways of monitoring students’ academic progress, created teacher-initiated professional development, launched on-site coaching of teachers and daily mentoring of students, and provided easily accessible assistance when glitches occurred in devices or the technological infrastructure. They sought to use technology to achieve content and skill goals.

* Particular schools and teacher leaders had repeatedly requested personal devices and classroom computers for their students. Small teacher-initiated projects–homegrown, so to speak–flowered and gained the support of district administrators. Evidence came from sign-up lists for computer carts, volunteers for pilot 1:1 computer projects in classrooms, and purchase orders from specific teachers and departments.

* Certain teachers and principals came regularly to professional development workshops on computer use in lessons. Voluntary attendance at one or more of these sessions indicated motivation and growing expertise.

* Students had used devices frequently in lessons. Evidence of use came from teacher self-reports, principal observations, student comments to teachers and administrators and word-of-mouth among teachers and administrators in schools.

Note that in all of these conversations, no district administrator, principal, or teacher ever asked me what I meant by “technology integration.” Some or all of the above indicators repeatedly came up in our discussions. In identifying candidates to study, I leaned heavily upon these signs of use and far less upon a formal definition.

I wanted a definition that would fit what I had gleaned from administrators and teachers about how they informally concluded what schools and which teachers were exemplars of technology integration. I wanted a definition that got past the issue of access to glittering new machines and Gee Whiz applications. I wanted a definition that focused on classroom and school use aimed toward achieving teacher and district curricular and instructional goals. I wanted a definition that put hardware and software in the background, not the foreground. I wanted a definition grounded in what I heard and saw in classrooms, schools, and districts.

Sorting through scores of formal definitions in the literature, I looked for one that would be clear and make sense to experts, professionals, parents, and taxpayers. Only a few met that standard. [ii]

I did fashion one that avoided the conceptual morass of defining technology integration and matched the “best cases” that superintendents, technology coordinators, and teachers had selected for me to observe.[iii]

“Technology integration is the routine and transparent use in learning, teaching, and assessment of computers, smartphones and tablets, digital cameras, social media platforms, networks, software applications and the Internet aimed at helping students reach the district’s and teacher’s curricular and instructional goals.”*

If this definition succeeds in putting technology in the background, not the foreground, then the next step in my research is to elaborate how such a process unfolds in classrooms, schools, and districts by examining the various stages teachers go through in integrating technology before moving to assessments of how successful (or not) the technology integration works.

____________________________________________

*Thanks to reader Seb Schmoller for adding to this definition

[i] Examples of the different definitions mentioned in the text can be found at:

http://www.definitions.net/definition/Technology%20integration

https://nces.ed.gov/pubs2003/tech_schools/chapter7.asp

https://en.wikipedia.org/wiki/Technology_integration

http://www.education4site.org/blog/2011/what-do-we-really-mean-by-technology-integration/

http://members.tripod.com/sjbrooks_young/techint.pdf

http://fcit.usf.edu/matrix/matrix.php

http://jan.ucc.nau.edu/~coesyl-p/principle3-article2.pdf

[ii] Rodney Earle, “The Integration of Instructional Technology into Public Education: Promises and Challenges,” Education Technology Magazine, 2002, 42(1), pp. 5-13. His definition of integration concentrates on the teaching, not hardware or software:

“Computer technology is merely one possibility in the selection of media and the delivery mode—part of the instructional design process —not the end but merely one of several means to the end.”

Khe Foon Hew and Thomas Brush, “Integrating Technology into K-12 Teaching and Learning,” Education Tech Research Development, 2007, 55, pp. 223-252. Their definition is:

“[T]echnology integration is thus viewed as the use of computing devices such as desktop computers, laptops, handheld computers, software, or Internet in K-12 schools for instructional purposes.”

[iii] I took a definition originally in Edutopia and revised it to make clear that the integration of technology in daily lessons is harnessed to achieving the curricular and instructional goals of the teacher, school, and district. The devices and software are not front-and-center but routinely used in lessons. I then stripped away language that connected usage of technologies to “success” or preferred ways of teaching. (No author), “What is Successful Technology Integration,” Edutopia, November 5, 2007 at: http://www.edutopia.org/technology-integration-guide-description
