Category Archives: technology use

Can You Be a ‘Good Teacher’ Inside a Failing School? (Jenny Abamu)

This article appeared in EdSurge, April 2, 2018

“Jenny Abamu is an education technology reporter at EdSurge where she covers technology’s role in both higher education and K-12 spaces. She previously worked at Columbia University EdLab’s Development and Research Group, producing and publishing content for their digital education publication, New Learning Times. Before that, she worked as a researcher, planner, and overnight assignment editor for NY1 News Channel in New York City. She holds a master’s degree in International and Comparative Education from Columbia University’s Teachers College.”


Here’s a popular movie plot: Great teacher goes into a troubled neighborhood and turns around a low-performing school. Educators love the messages from these films, and even children are inspired. Unfortunately, many school districts never find the Coach Carters or Erin Gruwells who bring such happy endings. In fact, in a broken district such as Detroit’s, schools in hard-bitten neighborhoods sometimes go from “turnarounds” to closure.

 

Fisher Magnet Upper Academy is a middle school located within one of the toughest neighborhoods in the city, stricken with poverty and crime. In 2013, local news reports named the area the third most violent zip code in America. In 2016, Fisher was named one of the 38 campuses at risk of closure after the Michigan legislature passed a bill saying any school ranked at the bottom 5 percent of state campuses for three years in a row would be subject to consequences.


Despite a relatively new building, constructed in 2003 as part of a former superintendent’s turnaround project, Fisher has suffered from consistent low performance—falling far below state standards on exams and adjusted growth targets designed specifically for the school. During the 2015-16 school year, only 0.7 percent (3 out of 451) of students met state standards in math, and only 4.5 percent met English Language Arts standards.

Carl Brownlee, a former United States Marine officer, is a middle school social studies teacher at Fisher Magnet Upper Academy. He has been teaching at Fisher for over 10 years. He believes changes in academic performance can happen in a struggling school like Fisher but says he has only seen it happen in the movies.

“The only person I have seen that had the ability to change this type of climate and culture was Joe Clark, or Morgan Freeman in that movie, ‘Lean On Me,’” says Brownlee. That doesn’t mean he thinks improvements are implausible, though. He says: “I think there were some good ideas in that movie that you could translate into schools.”

Brownlee believes that he is a good teacher, in spite of what test scores may reflect. And he feels as though his students have been slighted by ineffective teachers in the past. So he plans to stay at Fisher, where he hopes to bring advanced teaching skills to students that other educators may ignore.

“My children are being cheated because they are not given the same experiences as their counterparts in other schools, and that’s not fair,” says Brownlee. “That’s one of the reasons I stay where I am at.”

As students trickle into Brownlee’s classroom on a Friday morning, he stands by the door to greet each of them. He instructs them to grab their work folders and get into groups. The 6th, 7th and 8th graders entering Brownlee’s class in their uniforms are respectful, quiet and—though naturally distracted from time to time with whispers and giggles—appear to be on task. They ask questions and support one another as they move around in groups through learning stations Brownlee has set up in class.


Technology has a role to play in Brownlee’s effort as well. Using the free version of tools such as Edmodo, Kahoot, and Google Forms, Docs and sometimes Slides, Brownlee varies his lessons on topics such as Chinese history and the Missouri Compromise. He opens up classes with hip-hop education from Flocabulary, then goes into worksheets, videos, group assignments, and desktop assignments—incorporating cell phone apps and music in the activities. He constantly walks around the room, refocusing off-task students and offering feedback on their work. His classroom does not fit the “before” image in most romanticized school turnaround stories.


“The skill sets that I have, I like to use them with these young people instead of going to another school where you might get the test scores that people are asking for,” Brownlee continues, noting that he is not looking to teach the “ideal student” in an exemplary school. “My kids don’t come from that, so I try to hang in there and do my best. They need it. They deserve it.”

By “doing his best” Brownlee means constantly learning, often looking for resources outside of the district for support. He is also trying to incorporate more personalization into his classroom, noting that the state’s Department of Education has embraced the implementation of such instructional models.

Since Fisher does not have enough Chromebooks for every student, teachers share a cart of laptops that travel from class to class. Teachers also combine technical and non-technical ways of personalizing instruction. For Brownlee, part of personalization means gauging the social and emotional well-being of students each morning, so he knows how to approach them throughout the lesson.

“Are they ready for school work? You might have [a student] come in who just had a loved one die the night before. We have had that on many occasions. They still come to school,” explains Brownlee. “You can’t just go into teaching when they come into the classroom if you don’t know where they are at.”

To meet students where they are, Brownlee has a couple of go-to tools. He uses apps such as Quizlet to encourage students to learn independently. In addition, he uses Edpuzzle when students are having a hard time with particular topics in class. The app allows them to rewatch annotated video lessons.

“I have done it with several of our resource students,” Brownlee explains, noting how one struggling 7th-grader has been showing improvement since using apps like Edmodo and Edpuzzle. “He comes to class every day, and you can see that he is trying. He wants to understand what is going on. He likes to look at the videos, and his effort is starting to show in his work. You get a little joy when you see them getting it.”

Brownlee also has digital portfolios that he uses to track student mastery and growth, something all teachers in his school incorporate. Yet, he notes that this method has yielded mixed results, particularly since many teachers serve a large number of students–and struggle to keep records up to date.

Brownlee works with 198 students daily and admits it’s difficult for him and other teachers to add student work to the portfolios consistently. “It’s just very difficult when you have so many students to try to personalize for each one,” he says. “You can tailor for each student, but only to a certain extent.”

Despite the difficulties, Brownlee has not given up trying to tailor instruction for his students. He makes time to celebrate the small gains he sees students making, like the lessons he teaches that students remember long after they graduate. But he admits that there are days that he gets tired, particularly noting the difficulties keeping up with changes in the district.


His district has flipped between state and local control over the years and has had a number of different superintendents and principals; most of them bring new initiatives with them. This school year Brownlee has a new superintendent and a new school principal, but the real-world challenges facing students in and out of his school continue. He is cautiously hopeful that things can improve, but the familiarity of changes that don’t yield academic results is haunting—causing him to work overtime with a happy ending out of sight.

“No matter what you do you are still going to be accountable for the test score. It does not matter if the students just came from another school or district. It does not matter if they came to you four grades behind, if that child’s family is impoverished, or if the child has any type of learning disability that may be undiagnosed,” says Brownlee. “It is more like a professional football team. If the team does not win, it is the coach’s fault, and the coach is fired.”

 


8 Comments

Filed under how teachers teach, technology use

Whatever Happened to One-Laptop-Per-Child?


Where did the idea of One-Laptop-Per-Child (OLPC) originate?

In the early aughts of the 21st century, the cost of a laptop ran close to $1000. The idea of producing one that would sell for $100 and be sent to children in Latin America, Africa, and Asia who may or may not have had a schoolhouse or teacher caused giggles among tech engineers, entrepreneurs, and venture capitalists. A contemporary of Seymour Papert and head of a center at the Massachusetts Institute of Technology (MIT), Nicholas Negroponte believed that cheap, durable, and Internet-connected machines could revolutionize teaching and learning. He told conferees in 2010:

One of the things people told me about technology, particularly about laptops in the beginning, “Nicholas, you can’t give a kid a laptop that’s connected and walk away.” Well you know what, you can. You actually can. And we have found that kids in the remotest parts of the world, when given that connected [laptop], like some of the kids in these pictures, not only teach themselves how to read and write, but most importantly, and this we found in Peru first, they teach their parents how to read and write.

OLPC launched in 2006-2007 at a price of around $150 per laptop, which crept up to just over $200 over the next few years. Negroponte contracted with ministries of education in various countries to buy laptops.

What is OLPC?

The hardware and software of the initial laptops distributed in Nigeria, Uruguay, and later Peru, Rwanda, and other developing nations were simple enough:


The rugged, low-power computers use flash memory instead of a hard drive, run a Fedora-based operating system and use the SugarLabs Sugar user interface…. Mobile ad hoc networking based on the 802.11s wireless mesh network protocol allows students to collaborate on activities and to share Internet access from one connection. The wireless networking has much greater range than typical consumer laptops [of those years]. The XO-1 has also been designed to be lower cost and much longer-lived than typical laptops.


Depending upon the country, ministry officials distributed the devices to children directly in villages and towns and in rural and urban schools between 2007 and 2014. OLPC came to the U.S. in 2008.


What problems did OLPC seek to solve?

Massive poverty in economically developing countries was the chief problem. Education was the solution. But building schools and providing teachers was costly. Children and youth were motivated to learn but lacked access to schools. And where schools were available, tuition and inadequately trained teachers often made education a rote-filled sequence of lessons, resulting in high student attrition. An Internet-connected OLPC laptop seemed to be a solution to both limited access to schooling and traditional teaching, permitting students to use software to acquire knowledge and cognitive skills in a variety of subjects rather than simply memorizing lessons. The belief that increased access to schools, teachers, devices, and the like would break the shackles of poverty continues.

Or as the OLPC project said:

We aim to provide each child with a rugged, low-cost, low-power, connected laptop. To this end, we have designed hardware, content and software for collaborative, joyful, and self-empowered learning. With access to this type of tool, children are engaged in their own education, and learn, share, and create together. They become connected to each other, to the world and to a brighter future.

Negroponte believed that children could be “agents of change” to create their own learning with these laptops (see video of Negroponte talking about OLPC in Afghanistan here).

Did OLPC work?

Amid that optimism, the issue of putting the devices into the hands of teachers and students–implementation–was given short shrift. Without careful thought and action on getting Internet-connected devices and software into the hands of teachers and students (or children and youth not in school), any definition of “work” becomes suspect. There was a magical belief in OLPC, like a fairy godmother turning a pumpkin into Cinderella’s carriage to take her to the ball.

The question of how teachers were to use the laptop, or how students were to magically learn, led one skeptic to put it this way:

[Cartoon: olpc-plan.jpg]

If “work” means that students used the devices until they wore out or had to be trashed because once broken they could not be fixed, then OLPC “worked.” But if “work” means that students learned more, faster, and better (as measured by existing national tests or other metrics of academic achievement), available evidence is close to nil (see here, here, and here). And if “work” means what the founders sought, that is, “With access to this type of tool, children are engaged in their own education, and learn, share, and create together,” no evidence of such grand dreams for OLPC exists.

What happened to OLPC?

OLPC exists today. It is a small operation with projects in Africa, Asia, and Latin America (see here and here).

The details of the splitting apart of OLPC into two organizations since 2008, the departure of the founders including Negroponte, the constant searching for new contracts in Latin America, Africa, and Asia, and the shell of the once-expansive organization that continues to exist are described here, here, and here.

Did OLPC fail? Succeed? It depends on how “success” and “failure” are defined, who does the defining, and the criteria used to make the judgment.

8 Comments

Filed under Reforming schools, technology use

12 Things Everyone Should Understand About Tech (Anil Dash)

“Anil Dash is an entrepreneur, activist and writer recognized as one of the most prominent voices advocating for a more humane, inclusive and ethical technology industry. He is the CEO of Fog Creek Software, the renowned independent tech company behind Glitch, the friendly new community that helps anyone make the app of their dreams, as well as its past landmark products like Trello and Stack Overflow.

Dash was an advisor to the Obama White House’s Office of Digital Strategy, and today advises major startups and non-profits including Medium and DonorsChoose. He also serves as a board member for companies like Stack Overflow, the world’s largest community for computer programmers, and non-profits like the Data & Society Research Institute, whose research examines the impact of tech on society and culture; the NY Tech Alliance, America’s largest tech trade organization; and the Lower East Side Girls Club, which serves girls and families in need in New York City…. Dash is based in New York City, where he lives with his wife Alaina Browne and their son Malcolm. Dash has never played a round of golf, drank a cup of coffee, or graduated from college.”

This post appeared March 14, 2018 on Humane Tech

 

Tech is more important than ever, deeply affecting culture, politics and society. Given all the time we spend with our gadgets and apps, it’s essential to understand the principles that determine how tech affects our lives.

Understanding technology today

Technology isn’t an industry, it’s a method of transforming the culture and economics of existing systems and institutions. That can be a little bit hard to understand if we only judge tech as a set of consumer products that we purchase. But tech goes a lot deeper than the phones in our hands, and we must understand some fundamental shifts in society if we’re going to make good decisions about the way tech companies shape our lives—and especially if we want to influence the people who actually make technology.

Even those of us who have been deeply immersed in the tech world for a long time can miss the driving forces that shape its impact. So here, we’ll identify some key principles that can help us understand technology’s place in culture.

What you need to know:

1. Tech is not neutral.

One of the most important things everybody should know about the apps and services they use is that the values of technology creators are deeply ingrained in every button, every link, and every glowing icon that we see. Choices that software developers make about design, technical architecture or business model can have profound impacts on our privacy, security and even civil rights as users. When software encourages us to take photos that are square instead of rectangular, or to put an always-on microphone in our living rooms, or to be reachable by our bosses at any moment, it changes our behaviors, and it changes our lives.

All of the changes in our lives that happen when we use new technologies do so according to the priorities and preferences of those who create those technologies.

2. Tech is not inevitable.

Popular culture presents consumer technology as a never-ending upward progression that continuously makes things better for everybody. In reality, new tech products usually involve a set of tradeoffs where improvements in areas like usability or design come along with weaknesses in areas like privacy & security. Sometimes new tech is better for one community while making things worse for others. Most importantly, just because a particular technology is “better” in some way doesn’t guarantee it will be widely adopted, or that it will cause other, more popular technologies to improve.

In reality, technological advances are a lot like evolution in the biological world: there are all kinds of dead-ends or regressions or uneven tradeoffs along the way, even if we see broad progress over time.

3. Most people in tech sincerely want to do good.

We can be thoughtfully skeptical and critical of modern tech products and companies without having to believe that most people who create tech are “bad”. Having met tens of thousands of people around the world who create hardware and software, I can attest that the cliché that they want to change the world for the better is a sincere one. Tech creators are very earnest about wanting to have a positive impact. At the same time, it’s important for those who make tech to understand that good intentions don’t absolve them from being responsible for the negative consequences of their work, no matter how well-intentioned.

It’s useful to acknowledge the good intentions of most people in tech because it lets us follow through on those intentions and reduce the influence of those who don’t have good intentions, and to make sure the stereotype of the thoughtless tech bro doesn’t overshadow the impact that the majority of thoughtful, conscientious people can have. It’s also essential to believe that there is good intention underlying most tech efforts if we’re going to effectively hold everyone accountable for the tech they create.

4. Tech history is poorly documented and poorly understood.

People who learn to create tech can usually find out every intimate detail of how their favorite programming language or device was created, but it’s often near impossible to know why certain technologies flourished, or what happened to the ones that didn’t. While we’re still early enough in the computing revolution that many of its pioneers are still alive and working to create technology today, it’s common to find that tech history as recent as a few years ago has already been erased. Why did your favorite app succeed when others didn’t? What failed attempts were made to create such apps before? What problems did those apps encounter — or what problems did they cause? Which creators or innovators got erased from the stories when we created the myths around today’s biggest tech titans?

All of those questions get glossed over, silenced, or sometimes deliberately answered incorrectly, in favor of building a story of sleek, seamless, inevitable progress in the tech world. Now, that’s hardly unique to technology — nearly every industry can point to similar issues. But that ahistorical view of the tech world can have serious consequences when today’s tech creators are unable to learn from those who came before them, even if they want to.

5. Most tech education doesn’t include ethical training.

In mature disciplines like law or medicine, we often see centuries of learning incorporated into the professional curriculum, with explicit requirements for ethical education. Now, that hardly stops ethical transgressions from happening—we can see deeply unethical people in positions of power today who went to top business schools that proudly tout their vaunted ethics programs. But that basic level of familiarity with ethical concerns gives those fields a broad fluency in the concepts of ethics so they can have informed conversations. And more importantly, it ensures that those who want to do the right thing and do their jobs in an ethical way have a firm foundation to build on.

But until the very recent backlash against some of the worst excesses of the tech world, there had been little progress in increasing the expectation of ethical education being incorporated into technical training. There are still very few programs aimed at upgrading the ethical knowledge of those who are already in the workforce; continuing education is largely focused on acquiring new technical skills rather than social ones. There’s no silver-bullet solution to this issue; it’s overly simplistic to think that simply bringing computer scientists into closer collaboration with liberal arts majors will significantly address these ethics concerns. But it is clear that technologists will have to rapidly become fluent in ethical concerns if they want to continue to have the widespread public support that they currently enjoy.

6. Tech is often built with surprising ignorance about its users.

Over the last few decades, society has greatly increased in its respect for the tech industry, but this has often resulted in treating the people who create tech as infallible. Tech creators now regularly get treated as authorities in a wide range of fields like media, labor, transportation, infrastructure and political policy — even if they have no background in those areas. But knowing how to make an iPhone app doesn’t mean you understand an industry you’ve never worked in!

The best, most thoughtful tech creators engage deeply and sincerely with the communities that they want to help, to ensure they address actual needs rather than indiscriminately “disrupting” the way established systems work. But sometimes, new technologies run roughshod over these communities, and the people making those technologies have enough financial and social resources that the shortcomings of their approaches don’t keep them from disrupting the balance of an ecosystem. Often times, tech creators have enough money funding them that they don’t even notice the negative effects of the flaws in their designs, especially if they’re isolated from the people affected by those flaws. Making all of this worse are the problems with inclusion in the tech industry, which mean that many of the most vulnerable communities will have little or no representation amongst the teams that create new tech, preventing those teams from being aware of concerns that might be of particular importance to those on the margins.

7. There is never just one single genius creator of technology.

One of the most popular representations of technology innovation in popular culture is the genius in a dorm room or garage, coming up with a breakthrough innovation as a “Eureka!” moment. It feeds the common myth-making around people like Steve Jobs, where one individual gets credit for “inventing the iPhone” when it was the work of thousands of people. In reality, tech is always informed by the insights and values of the community where its creators are based, and nearly every breakthrough moment is preceded by years or decades of others trying to create similar products.

The “lone creator” myth is particularly destructive because it exacerbates the exclusion problems which plague the tech industry overall; those lone geniuses that are portrayed in media are seldom from backgrounds as diverse as people in real communities. While media outlets may benefit from being able to give awards or recognition to individuals, or educational institutions may be motivated to build up the mythology of individuals in order to bask in their reflected glory, the real creation stories are complicated and involve many people. We should be powerfully skeptical of any narratives that indicate otherwise.

8. Most tech isn’t from startups or by startups.

Only about 15% of programmers work at startups, and in many big tech companies, most of the staff aren’t even programmers anyway. So the focus on defining tech by the habits or culture of programmers that work at big-name startups deeply distorts the way that tech is seen in society. Instead, we should consider that the majority of people who create technology work in organizations or institutions that we don’t think of as “tech” at all.

What’s more, there are lots of independent tech companies — little indie shops or mom-and-pop businesses that make websites, apps, or custom software, and a lot of the most talented programmers prefer the culture or challenges of those organizations over the more famous tech titans. We shouldn’t erase the fact that startups are only a tiny part of tech, and we shouldn’t let the extreme culture of many startups distort the way we think about technology overall.

9. Most big tech companies make money in just one of three ways.

It’s important to understand how tech companies make money if you want to understand why tech works the way that it does.

  • Advertising: Google and Facebook make nearly all of their money from selling information about you to advertisers. Almost every product they create is designed to extract as much information from you as possible, so that it can be used to create a more detailed profile of your behaviors and preferences, and the search results and social feeds made by advertising companies are strongly incentivized to push you toward sites or apps that show you more ads from these platforms. It’s a business model built around surveillance, which is particularly striking since it’s the one that most consumer internet businesses rely upon.
  • Big Business: Some of the larger (generally more boring) tech companies like Microsoft and Oracle and Salesforce exist to get money from other big companies that need business software but will pay a premium if it’s easy to manage and easy to lock down the ways that employees use it. Very little of this technology is a delight to use, especially because the customers for it are obsessed with controlling and monitoring their workers, but these are some of the most profitable companies in tech.
  • Individuals: Companies like Apple and Amazon want you to pay them directly for their products, or for the products that others sell in their store. (Although Amazon’s Web Services exist to serve that Big Business market, above.) This is one of the most straightforward business models—you know exactly what you’re getting when you buy an iPhone or a Kindle, or when you subscribe to Spotify, and because it doesn’t rely on advertising or cede purchasing control to your employer, companies with this model tend to be the ones where individual people have the most power.

That’s it. Pretty much every company in tech is trying to do one of those three things, and you can understand why they make their choices by seeing how it connects to these three business models.

10. The economic model of big companies skews all of tech.

Today’s biggest tech companies follow a simple formula:

  1. Make an interesting or useful product that transforms a big market
  2. Get lots of money from venture capital investors
  3. Try to quickly grow a huge audience of users even if that means losing a lot of money for a while
  4. Figure out how to turn that huge audience into a business worth enough to give investors an enormous return
  5. Start ferociously fighting (or buying off) other competitive companies in the market

This model looks very different than how we think of traditional growth companies, which start off as small businesses and primarily grow through attracting customers who directly pay for goods or services. Companies that follow this new model can grow much larger, much more quickly, than older companies that had to rely on revenue growth from paying customers. But these new companies also have much lower accountability to the markets they’re entering because they’re serving their investors’ short-term interests ahead of their users’ or community’s long-term interests.

The pervasiveness of this kind of business plan can make competition almost impossible for companies without venture capital investment. Regular companies that grow based on earning money from customers can’t afford to lose that much money for that long a time. It’s not a level playing field, which often means that companies are stuck being either little indie efforts or giant monstrous behemoths, with very little in between. The end result looks a lot like the movie industry, where there are tiny indie arthouse films and big superhero blockbusters, and not very much else.

And the biggest cost for these big new tech companies? Hiring coders. They pump the vast majority of their investment money into hiring and retaining the programmers who’ll build their new tech platforms. Precious little of these enormous piles of money are put into things that will serve a community or build equity for anyone other than the founders or investors in the company. There is no aspiration that making a hugely valuable company should also imply creating lots of jobs for lots of different kinds of people.

11. Tech is as much about fashion as function.

To outsiders, creating apps or devices is presented as a hyper-rational process where engineers choose technologies based on which are the most advanced and appropriate to the task. In reality, the choice of things like programming languages or toolkits can be subject to the whims of particular coders or managers, or to whatever’s simply in fashion. Just as often, the process or methodology by which tech is created can follow fads or trends that are in fashion, affecting everything from how meetings are run to how products are developed.

Sometimes the people creating technology seek novelty, sometimes they want to go back to the staples of their technological wardrobe, but these choices are swayed by social factors in addition to an objective assessment of technical merit. And a more complex technology doesn’t always equal a more valuable end product, so while many companies like to tout how ambitious or cutting-edge their new technologies are, that’s no guarantee that they provide more value for regular users, especially when new technologies inevitably come with new bugs and unexpected side-effects.

12. No institution has the power to rein in tech’s abuses.

In most industries, if companies start doing something wrong or exploiting consumers, they’ll be reined in by journalists who will investigate and criticize their actions. Then, if the abuses continue and become serious enough, the companies can be sanctioned by lawmakers at the local, state, governmental or international level.

Today, though, much of the tech trade press focuses on covering the launch of new products or new versions of existing products, and the tech reporters who do cover the important social impacts of tech are often relegated to being published alongside reviews of new phones, instead of being prominently featured in business or culture coverage. Though this has started to change as tech companies have become absurdly wealthy and powerful, coverage is also still constrained by the culture within media companies. Traditional business reporters often have seniority in major media outlets, but are commonly illiterate in basic tech concepts in a way that would be unthinkable for journalists who cover finance or law. Meanwhile, dedicated tech reporters who may have a better understanding of tech’s impact on culture are often assigned to (or inclined to) cover product announcements instead of broader civic or social concerns.

The problem is far more serious when we consider regulators and elected officials, who often brag about their illiteracy about tech. Having political leaders who can’t even install an app on their smartphones makes it impossible to understand technology well enough to regulate it appropriately, or to assign legal accountability when tech’s creators violate the law. Even as technology opens up new challenges for society, lawmakers lag tremendously behind the state of the art when creating appropriate laws.

Without the corrective force of journalistic and legislative accountability, tech companies often run as if they’re completely unregulated, and the consequences of that reality usually fall on those outside of tech. Worse, traditional activists who rely on conventional methods such as boycotts or protests often find themselves ineffective due to the indirect business model of giant tech companies, which can rely on advertising or surveillance (“gathering user data”) or venture capital investment to continue operations even if activists are effective in identifying problems.

This lack of systems of accountability is one of the biggest challenges facing tech today.

If we understand these things, we can change tech for the better.

If everything is so complicated, and so many important points about tech aren’t obvious, should we just give up hope? No.

Once we know the forces that shape technology, we can start to drive change. If we know that the biggest cost for the tech giants is attracting and hiring programmers, we can encourage programmers to collectively advocate for ethical and social advances from their employers. If we know that the investors who power big companies respond to potential risks in the market, we can emphasize that their investment risk increases if they bet on companies that act in ways that are bad for society.

If we understand that most in tech mean well, but lack the historic or cultural context to ensure that their impact is as good as their intentions, we can ensure that they get the knowledge they need to prevent harm before it happens.

So many of us who create technology, or who love the ways it empowers us and improves our lives, are struggling with the many negative effects that some of these same technologies are having on society. But perhaps if we start from a set of common principles that help us understand how tech truly works, we can start to tackle technology’s biggest problems.

10 Comments

Filed under technology, technology use

Corporate Responsibility for Children’s Addictions?


 

Like most contentious issues in the U.S. where health and safety are concerned, historically two broad approaches have been used to deal with the effects of products that may be harmful to adults and children.

The dominant approach is to educate the public to the possible dangers (e.g., tainted food, harmful drugs, contaminated water, drunk drivers). In effect, put it on the individual consumer to read and hear about the dangers and then avoid illness and death. When there is a huge outcry over the damage done by, say, alcohol, tobacco, drugs, and reckless driving, schools have been dragged into teaching safe and sane use of potentially dangerous products. Recall that drug, sex, and driver education were (and are) staples in district curricula across the country in the 20th century. Educate individual adults and children at home and in school (also with public service ads) and they will be alert to what can hurt them.


 

The second approach, used far less often than the more popular strategy of changing individual Americans’ behavior, is to use public persuasion, legislation, and fines to convince the corporations and investors who make money from a product to create safer products (e.g., tobacco companies, car makers, major oil firms). Focusing on economic and political structures–big business and big government–draws attention to altering organizational behavior rather than individual actions, thereby increasing the chances of making significant changes. From Upton Sinclair’s The Jungle, a novel about Chicago’s meat-packing industry in the early 20th century that led to the federal Pure Food and Drug Act (1906), to Ralph Nader’s Unsafe at Any Speed and carmakers’ adoption of seat belts and better engineering of highways, public outcries produced political coalitions that led to changes in corporate behavior and governmental legislation. While such campaigns take decades to yield safer and less harmful products, that has not been the case with guns.

The rash of in-school shootings in the past few years has yet to persuade Congress to ban the purchase of assault weapons or adopt other ways of restricting who buys guns. Gun-makers and the National Rifle Association (NRA) have made massive political contributions to presidential and congressional campaigns to block legislation banning certain weapons time and again. In the wake of the Parkland High School (FLA) killings of students and teachers, political groups have formed to get the President and members of Congress to do something about Americans’ addiction to buying and using handguns and assault weapons.

 


These examples of mobilizing political coalitions to make changes in improving safety and health concentrate on private and public organizations that influence our daily lives rather than focusing on altering the behavior of each and every individual affected. Of course, both strategies come into play; it is neither one nor the other. But historical examples show repeatedly that the dominant approach in a society where individualism reigns and choice is sacrosanct is to persuade individual Americans to change their behavior. Not large corporations or state and federal laws.

When it comes to addictions to new technologies and social media, the dominant approach remains–change individual behavior with campaigns to have tech-free weekends, urging parents to restrict children’s use of devices to an hour a day, and similar solutions (see here and here).

But in the past few months, a different strategy has emerged: getting the corporations that produce these devices and software to take responsibility for their actions and change what they do rather than focusing on the individual. Consider the action of two major investors who own over two billion dollars of Apple shares (Jana Partners and the California State Teachers’ Retirement System) in calling upon the Apple Board of Directors to help parents and children avoid addictive behavior in overusing the iPhone, iPad, and laptops.

we have reviewed the evidence and we believe there is a clear need for Apple to offer parents more choices and tools to help them ensure that young consumers are using your products in an optimal manner. By doing so, we believe Apple would once again be playing a pioneering role, this time by setting an example about the obligations of technology companies to their youngest customers.

The investors go on in the letter to the Board of Directors to say the strategy of depending upon individual parents to do the heavy lifting of constraining use of devices is insufficient. Apple has responsibilities to both parents and children to reduce addictive behavior:

Some may argue that the research is not definitive, that other factors are also at work, and that in any case parents must take ultimate responsibility for their children.  These statements are undoubtedly true, but they also miss the point.  The average American teenager who uses a smart phone receives her first phone at age 10 and spends over 4.5 hours a day on it (excluding texting and talking). 78% of teens check their phones at least hourly and 50% report feeling “addicted” to their phones. It would defy common sense to argue that this level of usage, by children whose brains are still developing, is not having at least some impact, or that the maker of such a powerful product has no role to play in helping parents to ensure it is being used optimally.  It is also no secret that social media sites and applications for which the iPhone and iPad are a primary gateway are usually designed to be as addictive and time-consuming as possible, as many of their original creators have publicly acknowledged.  According to the APA survey cited above, 94% of parents have taken some action to manage their child’s technology use, but it is both unrealistic and a poor long-term business strategy to ask parents to fight this battle alone.  Imagine the goodwill Apple can generate with parents by partnering with them in this effort and with the next generation of customers by offering their parents more options to protect their health and well-being.

The letter ends with what the two investors believe Apple can do:

This is a complex issue and we hope that this is the start of a constructive and well-informed dialogue, but we think there are clear initial steps that Apple can follow, including:

  • Expert Committee: Convening a committee of experts including child development specialists (we would recommend Dr. Rich and Professor Twenge be included) to help study this issue and monitor ongoing developments in technology, including how such developments are integrated into the lives of children and teenagers.
  • Research: Partnering with these and other experts and offering your vast information resources to assist additional research efforts.
  • New Tools and Options: Based on the best available research, enhancing mobile device software so that parents (if they wish) can implement changes so that their child or teenager is not being handed the same phone as a 40-year old, just as most products are made safer for younger users.  For example, the initial setup menu could be expanded so that, just as users choose a language and time zone, parents can enter the age of the user and be given age-appropriate setup options based on the best available research including limiting screen time, restricting use to certain hours, reducing the available number of social media sites, setting up parental monitoring, and many other options.
  • Education: Explaining to parents why Apple is offering additional choices and the research that went into them, to help parents make more informed decisions.
  • Reporting: Hiring or assigning a high-level executive to monitor this issue and issuing annual progress reports, just as Apple does for environmental and supply chain issues.
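To make the letter’s “New Tools and Options” suggestion concrete, here is one way age-based setup defaults could work in principle, sketched in Python. It is a hypothetical illustration only: the age thresholds, field names, and limits are assumptions made for the sake of the example, not an actual Apple feature or anything the investors specified.

# Hypothetical sketch of the age-based setup idea in the letter: a parent enters
# the child's age during setup and gets age-appropriate starting options
# (screen-time cap, allowed hours, social media access, parental monitoring).
# All thresholds and field names below are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class SetupDefaults:
    daily_screen_minutes: int    # soft cap on daily use (0 = no cap)
    allowed_hours: tuple         # (start_hour, end_hour) when the device is usable
    social_media_allowed: bool   # whether social media apps can be installed
    parental_monitoring: bool    # whether usage reports are sent to a parent

def defaults_for_age(age: int) -> SetupDefaults:
    """Return illustrative setup defaults for a user of the given age."""
    if age < 10:
        return SetupDefaults(60, (8, 19), False, True)
    if age < 13:
        return SetupDefaults(120, (7, 21), False, True)
    if age < 16:
        return SetupDefaults(180, (7, 22), True, True)
    return SetupDefaults(0, (0, 24), True, False)   # adult: no restrictions

# A parent entering an age of 10 during setup would see these starting options,
# which they could then accept or adjust.
print(defaults_for_age(10))

The point of the sketch is simply that not handing a child “the same phone as a 40-year old” is, at bottom, a small amount of default-setting logic that the device maker, rather than each parent, could supply.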

For investors to write such a letter asking one of the wealthiest corporations in the world to take responsibility for its product in influencing children’s behavior is unusual (and in my opinion, about time). But as New York Times reporter Natasha Singer says:

Yes, it would be terrific if Apple introduced new control options for parents. But if shareholders want to fault companies for manipulating or addicting users, they should also be taking a hard look at Facebook, YouTube, Instagram, Snapchat, Netflix, and many more.

Amen.

Turning the spotlight on organizational behavior and the behind-the-scenes structures within which all of us live is a welcome turnabout in a society where the dominant strategy is to get individuals to alter their behavior (see here, here, and here). Yet, as some argue, the research driving the case for technology addiction in children and youth uses the word closer to its colloquial sense than to a medical diagnosis (see here). Thus, public persuasion that pressures corporations to do something about their products, aligned with political action (as in the campaign that made cars safer rather than the anti-smoking campaign aimed at individuals), may be more effective in achieving corporate accountability.

6 Comments

Filed under technology use

Spilling the Beans on “Personalized Learning”

Years ago, I met Larry Berger at a conference. I had been impressed with the digital tools his company, Wireless Generation, had developed to assess student learning and increase teacher efficiency. We talked briefly at the time. My hunch is that he remembers neither the conversation nor my name.

Since that time, his career has soared and he is now CEO of Amplify, a technology company once owned by Rupert Murdoch’s News Corporation but since sold to Amplify executives who now run it. The company creates and develops curricular and assessment software for schools.

Rick Hess, educational policy maven at the American Enterprise Institute, had invited Berger to a conference on the meaning of “personalized learning.” Berger could not attend, so he asked a colleague who did attend to read a “confession” he had to make about his abiding interest in “personalized learning.” Hess included Berger’s letter to the conferees, and it appears below.

Until a few years ago, I was a great believer in what might be called the “engineering” model of personalized learning, which is still what most people mean by personalized learning. The model works as follows:

You start with a map of all the things that kids need to learn.

Then you measure the kids so that you can place each kid on the map in just the spot where they know everything behind them, and in front of them is what they should learn next.

Then you assemble a vast library of learning objects and ask an algorithm to sort through it to find the optimal learning object for each kid at that particular moment.

Then you make each kid use the learning object.

Then you measure the kids again. If they have learned what you wanted them to learn, you move them to the next place on the map. If they didn’t learn it, you try something simpler.

If the map, the assessments, and the library were used by millions of kids, then the algorithms would get smarter and smarter, and make better, more personalized choices about which things to put in front of which kids.

I spent a decade believing in this model—the map, the measure, and the library, all powered by big data algorithms.

Here’s the problem: The map doesn’t exist, the measurement is impossible, and we have, collectively, built only 5% of the library.

To be more precise: The map exists for early reading and the quantitative parts of K-8 mathematics, and much promising work on personalized learning has been done in these areas; but the map doesn’t exist for reading comprehension, or writing, or for the more complex areas of mathematical reasoning, or for any area of science or social studies. We aren’t sure whether you should learn about proteins then genes then traits—or traits, then genes, then proteins.

We also don’t have the assessments to place kids with any precision on the map. The existing measures are not high enough resolution to detect the thing that a kid should learn tomorrow. Our current precision would be like Google Maps trying to steer you home tonight using a GPS system that knows only that your location correlates highly with either Maryland or Virginia.

We also don’t have the library of learning objects for the kinds of difficulties that kids often encounter. Most of the available learning objects are in books that only work if you have read the previous page. And they aren’t indexed in ways that algorithms understand.

Finally, as if it were not enough of a problem that this is a system whose parts don’t exist, there’s a more fundamental breakdown: Just because the algorithms want a kid to learn the next thing doesn’t mean that a real kid actually wants to learn that thing.

So we need to move beyond this engineering model. Once we do, we find many more compelling and more realistic frontiers of personalized learning opening up.
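Read as an algorithm, the “engineering” model Berger describes is a simple measure, select, teach, and reassess loop. The Python sketch below is a toy illustration of that loop; the map, library, and assessment in it are placeholders invented for the example, not Amplify’s or anyone’s actual system.

# A toy version of the "engineering" model of personalized learning: place each
# student on a map of skills, pick a learning object, reassess, and either move
# the student forward or try something simpler. All names are hypothetical.

from dataclasses import dataclass, field
from typing import Optional

# The "map": an ordered list of skills, each assumed to build on the previous one.
SKILL_MAP = ["counting", "addition", "subtraction", "multiplication"]

# The "library": learning objects indexed by skill and difficulty (1 = simplest).
LIBRARY = {skill: {d: f"{skill} lesson (level {d})" for d in (1, 2, 3)}
           for skill in SKILL_MAP}

@dataclass
class Student:
    name: str
    mastered: set = field(default_factory=set)   # skills already learned

def place_on_map(student: Student) -> Optional[str]:
    """Find the first skill the student has not yet mastered."""
    for skill in SKILL_MAP:
        if skill not in student.mastered:
            return skill
    return None   # the student has finished the map

def assess(student: Student, skill: str) -> bool:
    """Stand-in for a real measurement; a real system would test the student here."""
    return True

def tutor(student: Student, max_rounds: int = 10) -> None:
    """Run the measure -> select -> teach -> re-measure loop."""
    difficulty = 2                              # start with a mid-level learning object
    for _ in range(max_rounds):
        skill = place_on_map(student)
        if skill is None:
            break                               # nothing left on the map
        lesson = LIBRARY[skill][difficulty]     # the "optimal" object for this moment
        # ...the student would work through `lesson` here...
        if assess(student, skill):
            student.mastered.add(skill)         # move to the next spot on the map
            difficulty = 2
        else:
            difficulty = max(1, difficulty - 1) # didn't learn it: try something simpler

tutor(Student("demo"))

Even this toy version makes Berger’s point visible: the loop only works if the map, the measurement, and the library all exist at fine enough resolution, which is exactly what he says they do not.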

Berger’s confession about believing in “engineering” solutions such as “personalized learning” to school and classroom problems joins, of course, a long history of policy elites in the 20th and 21st centuries watching technical solutions to school governance, organization, curriculum, and instruction flop. After the post-Sputnik education reforms introduced curricular reforms in math and the natural and social sciences, cheerleaders for that reform confessed that what they had hoped would occur didn’t materialize (see here). After No Child Left Behind became law in 2002, for example, one-time advocates for the law confessed that there was too much testing and too little flexibility in the law for districts and schools (see here).

“Buyer’s remorse” is an abiding tradition.

I have a few observations about contrition and public confessions over errors in thinking about “personalized learning.”

First, those confessing their errors about solving school problems seldom looked at previous generations of reformers seeking major changes in schools. They were ahistorical. They thought that they knew better than other very smart people who had earlier sought to solve problems in schooling.

Second, those confessing seldom go beyond blaming their own flawed thinking (or others who failed to carry out their instructions) and coming to realize the obvious: schooling is a far more complex human institution than they had ever considered.

Finally, few of these confessions take a step back to consider not only the complexity of schooling and its many moving parts but also the political, social, and economic structures that keep it in place (see Audrey Watters here). As I and many others have said often, schools are political institutions deeply entangled in American society, culture, and democracy. Keeping the macro- and micro-perspectives in sight is a must for those seeking major changes in how teachers teach or how schools educate. Were that to occur, the incidence of “morning after” regret might decrease.

 

12 Comments

Filed under how teachers teach, technology use

Thoughts about Technology Then and Now

Nearly two decades ago–in 1998-1999–I did research on schools in Silicon Valley that was later published as Oversold and Underused: Computers in Classrooms. Next month, The Flight of a Butterfly or the Path of a Bullet, another book, about 41 exemplary Silicon Valley teachers who integrated technology into their daily lessons, will become available.

What similarities and differences do I see between the two periods of intense activity in getting hardware and software into schools and classrooms?

The similarities are easy to list.

*At both times, policy elites including donors and computer companies urged districts and schools to get desktops into classroom teachers’ and students’ hands.

The hype then and now promised that students would learn more, faster, and better; that classroom teaching would be more student-friendly and individualized–the word today is “personalized”; and that graduates would enter the high-tech workplace fully prepared from day one.

*Teacher and student access to the new technologies expanded.

For example, in the mid-to-late 1990s, Silicon Valley companies and philanthropists gave desktops to schools, and districts purchased loads of personal computers. The influx of machines was often distributed within schools to computer labs and media centers (formerly libraries), with most teachers having at least one in their classroom and a couple for students in academic classes. Some software, mostly adaptations of business applications, was given to schools or purchased. Students had far more access to desktops in labs and classrooms, a few times a week depending upon availability and the lesson content, than ever before.

Nearly twenty years later, that expansion of student access to digital devices and software has made them nearly ubiquitous. Most labs have been retired; carts holding 25-30 devices are available in classrooms. Many districts now have a device available for each student. As access has increased, so has teacher and student use in lessons.

What about differences?

* Goals for using digital tools have changed.

The initial purposes over thirty years ago for buying and distributing desktops to schools were to solve the nation’s economic problems: U.S. students performing at levels lower than students in other countries. Teachers teaching an outmoded curriculum in traditional ways that failed to exploit the wealth of information available to them and their students electronically. Unpreparedness of students entering the job market in an economy that shifted from industrial- to information-based (see the 1983 report, A Nation at Risk). These were problems that higher standards, better teaching, and new technologies could solve, reformers thought. To end those problems, solutions of stiffer graduation requirements (e.g., four years of each academic subject), uniform and tougher curriculum standards (e.g. Common Core), and, yes, lots of electronic devices and software (e.g., computer labs, 1:1 laptops and tablets) were adopted to accelerate the improvement of U.S. schools and to thereby strengthen the economy.

The preschools and high schools that I visited and observed in action in 1998-1999 (including schools across the country) pursued these goals. However, the evidence that increased access to and use of these technological tools had, indeed, achieved those goals was missing. Student academic achievement had not risen because of teachers and students using technologies in their lessons. The dream that teaching would become more efficient and constructivist (an earlier generation would have said “student-centered” and “progressive”) had not materialized. And high school graduates displaying technological skills learned in school did not necessarily step into better-paying jobs.

But in the past decade, the initial 1990s goals that generated the expansion of access to digital tools have shifted. Seeking higher academic achievement through digital tools is no longer the stated goal. Instead, new devices and software are now prized for their potential to engage students (on the assumption that engagement leads directly to higher academic achievement) through “personalized learning.” Moreover, the technology is essential since students now take state tests online. And the continuing dream of graduates marching into high-tech jobs, well, that goal has persisted.

*Combined similarities and differences across time.


The Flight of a Butterfly or the Path of a Bullet describes and analyzes the observations I made and the interviews I conducted in 2016 with 41 elementary and secondary teachers in Silicon Valley who had a reputation for integrating technology into their daily lessons. I found both similarities and differences with the earlier study I did and prior historical research on how teachers taught in the 20th century.

These Silicon Valley teachers that I observed in 2016 were hard working and used digital tools as familiarly as paper and pencil. Devices and software were now in the background, not the foreground, unlike the previous generation of teachers using devices in computer labs and media centers.

The lessons these 41 teachers taught were expertly arranged with a variety of student activities. These teachers had, indeed, made changes in how they managed administrative details, quietly and effortlessly taking attendance and communicating with students, colleagues, and parents. They saved time and were more efficient using these digital tools than the earlier generation of teachers. For their lessons, they used these tools to create playlists for students, pursue problem-based units, and assess student learning both during the actual lesson and afterwards. All of this work was seamlessly integrated into the flow of the lesson. I could see that the students were intimately familiar with the devices and how the teacher wove the content of the lesson effortlessly into the different activities. They surely differed from their counterparts whom I had observed two decades earlier.

But I also noted no fundamental or startling changes in the usual flow of their lessons, such as setting goals, designing varied activities and groupings, eliciting student participation, and assessing student understanding. The format of lessons appeared similar to that of the earlier generation I observed 20 years ago and of peers a half-century and a full century ago whose classrooms I had studied through archival research. These contemporary lessons were teacher-directed, and post-observation interviews revealed continuity in how teachers have taught for decades. Sure, the content of lessons had changed–students working with DNA in a biology lesson differed from biology classes I had observed earlier. But the sequence of activities and what students did over the course of a lesson resembled what I had seen many times earlier. Again, stability and change in teaching emerged clearly for me, as did the pervasive use of digital tools.

 

3 Comments

Filed under how teachers teach, technology use

The Flight of a Butterfly Or the Path of a Bullet

My next book arrives in early March and I want readers and followers of this blog to be aware of it.

Readers will remember parts of this book showing up as first drafts in posts I published last year about exemplary teachers at various schools in Silicon Valley who integrated technologies into their daily lessons. Many readers commented on the descriptions of lessons of elementary and secondary school teachers across all academic subjects that I posted on this blog. In many cases, those comments were helpful in revising the first draft and choosing which of the descriptions should be in the book. I thank those readers who took the time to comment.

Here is the book cover. The full title takes up a lot of space on the cover. So be it.  Yes, that is a laptop upon which the butterfly is either alighting or fleeing.

[Book cover image: cuban-butterfly-border-web.jpg]

Here is the publisher’s description of the book:

In this book, Larry Cuban looks at the uses and effects of digital technologies in K–12 classrooms, exploring if and how technology has transformed teaching and learning. In particular, he examines forty-one classrooms across six districts in Silicon Valley that have devoted special attention and resources to integrating digital technologies into their education practices.

Cuban observed all of the classrooms and interviewed each of the teachers in an effort to answer several straightforward, if also elusive, questions: Has technology integration been fully implemented and put into practice in these classrooms, and has this integration and implementation resulted in altered teaching practices? Ultimately, Cuban asks if the use of digital technologies has resulted in transformed teaching and learning in these classrooms.

The answers to these questions reflect Cuban’s assessment not only of digital technologies and their uses, but of the complex interrelations of policy and practice, and of the many—often unintended—consequences of reforms and initiatives in the education world. Similarly, his answers reflect his … understanding of change and continuity in education practice, and of the varying ways in which different actors in the education world—policy makers, school leaders, teachers, and others—understand, and sometimes misinterpret, those changes….

If any readers do get the book (through a library or purchase), read it, and have some thoughts (critical or positive or a mix of both) about the argument, logic, and evidence I use, please let me know.

 

 

6 Comments

Filed under how teachers teach, technology use