The Future of Coding in Schools

Edutopia

Mitch Resnick, one of the creators of Scratch, on why he thinks coding should be taught in all schools—and it’s not for the reason you’d expect.

For more than three decades, Mitch Resnick has immersed himself in educational technology and innovative learning models. Now a professor at the MIT Media Lab, and a co-creator of the popular Scratch programming language, Resnick remains a tireless advocate for student-centered education, collaborative learning environments, and the idea that coding is a form of literacy.

His new book, Lifelong Kindergarten: Cultivating Creativity Through Projects, Passion, Peers, and Play, is a look at our current educational moment. “Roughly two-thirds of grade school students will end up doing work that hasn’t been invented yet,” Resnick contends, hinting at the emerging worlds of artificial intelligence, self-driving cars, and “smart” houses. How do we prepare today’s students to meet that challenge?

Get the best of Edutopia in your inbox each week.

We talked with Resnick about the importance of coding in our school system, his thoughts on the changing roles of teachers, and new ways to engage students—and assess their work.

EDUTOPIA: You moved from journalism—writing about computers and business—to the field of educational technology and learning in the 1980s. What inspired that move?

MITCH RESNICK: The most important shift for me in thinking about computers and learning was actually in the spring of 1982, at the West Coast Computer Faire—which was like an early form of Maker Faire—where Seymour Papert was giving a keynote address. When I heard Seymour talk, it gave me a new vision of what role computers might play in people’s lives: They weren’t just machines to get a job done—they could enable people to express themselves in new ways, and change the way people thought about themselves and thought about the world. That was very exciting to me.

EDUTOPIA: Are we still struggling with Papert’s early insight—almost astonishing at the time—that the computer isn’t just a processor of information but a platform for constructing human knowledge?

RESNICK: Yes I think so, and it mirrors a struggle in the education system that has nothing to do with technology. Many people think of learning and education as a process of delivering information or delivering instruction. Other people see learning and education as student-centered—learning is about exploring, experimenting, creating. Those are very different visions that predate the computer, but of course the computer can fit into either of those two models. It’s a wonderful device for delivering information, but it can also be a wonderful device for creating, exploring, and experimenting.

EDUTOPIA: There are influential people, like Apple CEO Tim Cook, saying, “What we need to do is get coding into every single public school. It needs to be a requirement in public schools across the board.” Is that right?

RESNICK: If it were up to me, I would introduce it. But I want to be careful because I don’t want to embrace it for the same reason that some people might. The first question I would ask is: “Why should we learn coding at all?” Many people embrace coding in schools as a pathway to jobs as computer programmers and computer scientists, and of course they’re right that those opportunities are expanding rapidly. But that’s not a great reason for everyone to learn how to code.

Very few people grow up to be professional writers, but we teach everyone to write because it’s a way of communicating with others—of organizing your thoughts and expressing your ideas. I think the reasons for learning to code are the same as the reasons for learning to write. When we learn to write, we are learning how to organize, express, and share ideas. And when we learn to code, we are learning how to organize, express, and share ideas in new ways, in a new medium.

EDUTOPIA: What does that look like in the school system? Does coding sit alongside math and reading? Is it integrated in some way?

RESNICK: These days I talk about our approach in terms of these four words that begin with the letter p: projects, passion, peers, and play. So that’s the approach I would take with coding, but also with any other learning: getting students to work on projects, based on their passion, in collaboration with peers, in a playful spirit. And each of those p’s is important. I think work on projects gives you an understanding of the creative process, how to start with just the inkling of an idea and then to build a prototype, share it with people, experiment with it, and continue to modify and improve it.

We know that kids are going to work longer and make deeper connections to the content when they are passionate about the ideas—when they care—and when they’re learning with and being inspired by peers. And I’d want to have kids experience coding in the same way.

EDUTOPIA: You’re describing a high-choice learning environment rooted in student passion and project work. Where’s the teacher in that mix?

RESNICK: The teacher still plays an incredibly important role, but in this approach it’s not so much about delivering instruction. One role the teacher is playing is the role of connector—connecting peers with one another to work together on solving problems. Teachers also act as catalysts by asking provocative questions: “What do you think will happen if…?” or “That surprised me, why do you think that happened?”

They’re consultants, too, and it’s not just about consulting on technical skills, but also about things like how you continue to work on something even when you are frustrated, or suggesting strategies for working with diverse groups of people. Finally, the teacher can be a collaborator, working together with kids on projects—because kids should see teachers as learners too.

EDUTOPIA: It sounds like a more democratic, open system, which seems to imply breaking down a lot of barriers?

RESNICK: I think breaking down barriers is a good way to think about it. When I think about the type of things that I might change in schools—and I know none of it is easy—a lot of it is about breaking down barriers. Break down the barriers between class periods, because 50-minute chunks are too constraining if you want to work on projects. Break down the barriers between disciplines, because meaningful projects almost always cut across disciplines. Break down the barriers between ages and have older kids work with younger kids—both groups benefit. And break down the barriers between inside of school and outside of school—have kids work on projects that are meaningful to their communities and bring people from the communities into the schools to support the teachers.

That’s one way of dealing with the challenge of a single teacher committed to 30 or more kids. It doesn’t have to be that way. Older kids can be helping younger kids, people from the community can be helping.

EDUTOPIA: A fair question—and a common criticism—is: How do you figure out whether kids are learning anything? How do you assess it?

RESNICK: I would take a portfolio-like approach, looking at what kids create. That’s what we do in our Scratch online community. You can see that a kid has created several dozen digital projects, and you can look through their projects and see their progression. For example, you might see the gradual adoption of new strategies—new types of artwork, but also new and improved programming structures.

I acknowledge that it’s difficult to arrive at quantitative measures, but I also think we don’t necessarily need to. I sometimes make the analogy to the way I’ve been evaluated here at MIT. There are actually no quantitative measures in the process. Basically, they look at my portfolio: They see what I’ve created, they look at the trajectory and the progress over time, and they ask other people’s opinions about it. You’ll sometimes hear, “Well that’s not serious, we need quantitative measures to be serious.” Are they making the claim that MIT is not serious? I understand the criticism that it’s inefficient, but I think those are things we are going to need to deal with.

Again, it’s a big change and I’m not saying it’s easy, but I do think we need to move in that direction.


The surprising thing Google learned about its employees — and what it means for today’s students

 December 20, 2017


The conventional wisdom about 21st century skills holds that students need to master the STEM subjects — science, technology, engineering and math — and learn to code as well because that’s where the jobs are. It turns out that is a gross simplification of what students need to know and be able to do, and some proof for that comes from a surprising source: Google.

This post explains what Google learned about its employees, and what that means for students across the country.  It was written by Cathy N. Davidson, founding director of the Futures Initiative and a professor in the doctoral program in English at the Graduate Center, CUNY, and author of the new book, “The New Education: How to Revolutionize the University to Prepare Students for a World in Flux.” She also serves on the Mozilla Foundation board of directors,  and was appointed by President Barack Obama to the National Council on the Humanities.

By Cathy N. Davidson

All across America, students are anxiously finishing their “What I Want To Be …” college application essays, advised to focus on STEM (Science, Technology, Engineering, and Mathematics) by pundits and parents who insist that’s the only way to become workforce ready.  But two recent studies of workplace success contradict the conventional wisdom about “hard skills.” Surprisingly, this research comes from the company most identified with the STEM-only approach: Google.

Sergey Brin and Larry Page, both brilliant computer scientists, founded their company on the conviction that only technologists can understand technology. Google originally set its hiring algorithms to sort for computer science students with top grades from elite science universities.

In 2013, Google decided to test its hiring hypothesis by crunching every bit and byte of hiring, firing, and promotion data accumulated since the company’s incorporation in 1998. Project Oxygen shocked everyone by concluding that, among the eight most important qualities of Google’s top employees, STEM expertise comes in dead last. The seven top characteristics of success at Google are all soft skills: being a good coach; communicating and listening well; possessing insights into others (including others’ different values and points of view); having empathy toward and being supportive of one’s colleagues; being a good critical thinker and problem solver; and being able to make connections across complex ideas.

Those traits sound more like what one gains as an English or theater major than as a programmer. Could it be that top Google employees were succeeding despite their technical training, not because of it?  After bringing in anthropologists and ethnographers to dive even deeper into the data, the company enlarged its previous hiring practices to include humanities majors, artists, and even the MBAs that, initially, Brin and Page viewed with disdain.

Project Aristotle, a study released by Google this past spring, further supports the importance of soft skills even in high-tech environments. Project Aristotle analyzes data on inventive and productive teams. Google takes pride in its A-teams, assembled with top scientists, each with the most specialized knowledge and able to throw down one cutting-edge idea after another. Its data analysis revealed, however, that the company’s most important and productive new ideas come from B-teams composed of employees who don’t always have to be the smartest people in the room.

Project Aristotle shows that the best teams at Google exhibit a range of soft skills: equality, generosity, curiosity toward the ideas of your teammates, empathy, and emotional intelligence. And topping the list: emotional safety. No bullying. To succeed, each and every team member must feel confident speaking up and making mistakes. They must know they are being heard.

Google’s studies concur with others trying to understand the secret of a great future employee. A recent survey of 260 employers by the nonprofit National Association of Colleges and Employers, which includes both small firms and behemoths like Chevron and IBM, also ranks communication skills among the three qualities most sought after by job recruiters. They prize both an ability to communicate with one’s workers and an aptitude for conveying the company’s product and mission outside the organization. Or take billionaire venture capitalist and “Shark Tank” TV personality Mark Cuban: He looks for philosophy majors when he’s investing in sharks most likely to succeed.

STEM skills are vital to the world we live in today, but technology alone, as Steve Jobs famously insisted, is not enough. We desperately need the expertise of those who are educated to the human, cultural, and social as well as the computational.

No student should be prevented from majoring in an area they love based on a false idea of what they need to succeed. Broad learning skills are the key to long-term, satisfying, productive careers. What helps you thrive in a changing world isn’t rocket science. It may just well be social science, and, yes, even the humanities and the arts that contribute to making you not just workforce ready but world ready.

Will Robots Take Our Children’s Jobs?


Like a lot of children, my sons, Toby, 7, and Anton, 4, are obsessed with robots. In the children’s books they devour at bedtime, happy, helpful robots pop up more often than even dragons or dinosaurs. The other day I asked Toby why children like robots so much.

“Because they work for you,” he said.

What I didn’t have the heart to tell him is, someday he might work for them — or, I fear, might not work at all, because of them.

It is not just Elon Musk, Bill Gates and Stephen Hawking who are freaking out about the rise of invincible machines. Yes, robots have the potential to outsmart us and destroy the human race. But first, artificial intelligence could make countless professions obsolete by the time my sons reach their 20s.

You do not exactly need to be Marty McFly to see the obvious threats to our children’s future careers.

Say you dream of sending your daughter off to Yale School of Medicine to become a radiologist. And why not? Radiologists in New York typically earn about $470,000, according to Salary.com.

But that job is suddenly looking iffy as A.I. gets better at reading scans. A start-up called Arterys, to cite just one example, already has a program that can perform a magnetic-resonance imaging analysis of blood flow through a heart in just 15 seconds, compared with the 45 minutes required by humans.

Maybe she wants to be a surgeon, but that job may not be safe, either. Robots already assist surgeons in removing damaged organs and cancerous tissue, according to Scientific American. Last year, a prototype robotic surgeon called STAR (Smart Tissue Autonomous Robot) outperformed human surgeons in a test in which both had to repair the severed intestine of a live pig.

Photo: Robots put together vehicle frames on the assembly line at the Peugeot Citroën Moteurs factory. (Sebastien Bozon/Agence France-Presse via Getty Images)

So perhaps your daughter detours to law school to become a rainmaking corporate lawyer. Skies are cloudy in that profession, too. Any legal job that involves lots of mundane document review (and that’s a lot of what lawyers do) is vulnerable.

Software programs are already being used by companies including JPMorgan Chase & Company to scan legal papers and predict what documents are relevant, saving lots of billable hours. Kira Systems, for example, has reportedly cut the time that some lawyers need to review contracts by 20 to 60 percent.

As a matter of professional survival, I would like to assure my children that journalism is immune, but that is clearly a delusion. The Associated Press already has used a software program from a company called Automated Insights to churn out passable copy covering Wall Street earnings and some college sports, and last year awarded the bots the minor league baseball beat.

What about other glamour jobs, like airline pilot? Well, last spring, a robotic co-pilot developed by the Defense Advanced Research Projects Agency, known as Darpa, flew and landed a simulated 737. I hardly count that as surprising, given that pilots of commercial Boeing 777s, according to one 2015 survey, only spend seven minutes during an average flight actually flying the thing. As we move into the era of driverless cars, can pilotless planes be far behind?

Then there is Wall Street, where robots are already doing their best to shove Gordon Gekko out of his corner office. Big banks are using software programs that can suggest bets, construct hedges and act as robo-economists, using natural language processing to parse central bank commentary to predict monetary policy, according to Bloomberg. BlackRock, the biggest fund company in the world, made waves earlier this year when it announced it was replacing some highly paid human stock pickers with computer algorithms.

So am I paranoid? Or not paranoid enough? A much-quoted 2013 study by the University of Oxford Department of Engineering Science — surely the most sober of institutions — estimated that 47 percent of current jobs, including insurance underwriter, sports referee and loan officer, are at risk of falling victim to automation, perhaps within a decade or two.

Just this week, the McKinsey Global Institute released a report that found that a third of American workers may have to switch jobs in the next dozen or so years because of A.I.

I know I am not the only parent wondering if I can robot-proof my children’s careers. I figured I would start by asking my own what they want to do when they grow up.

Photo: Elon Musk, the C.E.O. of Tesla Motors. (Marcio Jose Sanchez/Associated Press)

Toby, a people pleaser and born entertainer, is obsessed with cars and movies. He told me he wanted to be either an Uber driver or an actor. (He is too young to understand that those jobs are usually one and the same).

As for Uber drivers, it is no secret that they are headed to that great parking garage in the sky; the company recently announced plans to buy 24,000 Volvo sport utility vehicles to roll out as a driverless fleet between 2019 and 2021.

And actors? It may seem unthinkable that some future computer-generated thespian could achieve the nuance of expression and emotional depth of, say, Dwayne Johnson. But Hollywood is already Silicon Valley South. Consider how filmmakers used computer graphics to reanimate Carrie Fisher’s Princess Leia and Peter Cushing’s Grand Moff Tarkin as they appeared in the 1970s (never mind that Mr. Cushing died in 1994) for “Rogue One: A Star Wars Story.”

My younger son Anton, a sweetheart, but tough as Kevlar, said he wanted to be a football player. Robot football may sound crazy, but come to think of it, a Monday night battle between the Dallas Cowdroids and Seattle Seabots may be the only solution to the sport’s endless concussion problems.

He also said he wanted to be a soldier. If he means foot soldier, however, he might want to hold off on enlistment. Russia recently unveiled Fedor, a humanoid robot soldier that looks like RoboCop after a Whole30 crash diet; this space-combat-ready android can fire handguns, drive vehicles, administer first aid and, one hopes, salute. Indeed, the world’s armies are in such an arms race developing grunt-bots that one British intelligence expert predicted that American forces will have more robot soldiers than humans by 2025.

And again, all of this stuff is happening now, not 25 years from now. Who knows what the jobs marketplace might look like by then. We might not even be the smartest beings on the planet.

Ever heard of the “singularity”? That is the term that futurists use to describe a potentially cataclysmic point at which machine intelligence catches up to human intelligence, and likely blows right past it. They may rule us. They may kill us. No wonder Mr. Musk says that A.I. “is potentially more dangerous than nukes.”

But is it really that dire? Fears of technology are as old as the Luddites, those machine-smashing British textile workers of the early 19th century. Usually, the fears turn out to be overblown.

The rise of the automobile, to cite the obvious example, did indeed put most manure shovelers out of work. But it created millions of jobs to replace them, not just for Detroit assembly line workers, but for suburban homebuilders, Big Mac flippers and actors performing “Greased Lightnin’” in touring revivals of “Grease.” That is the process of creative destruction in a nutshell.

But artificial intelligence is different, said Martin Ford, the author of “Rise of the Robots: Technology and the Threat of a Jobless Future.” Machine learning does not just give us new machines to replace old machines, pushing human workers from one industry to another. Rather, it gives us new machines to replace us, machines that can follow us to virtually any new industry we flee to.

Since Mr. Ford’s book sent me down this rabbit hole in the first place, I reached out to him to see if he was concerned about all this for his own children: Tristan, 22, Colin, 17, and Elaine, 10.

He said the most vulnerable jobs in the robot economy are those involving predictable, repetitive tasks, however much training they require. “A lot of knowledge-based jobs are really routine — sitting in front of a computer and cranking out the same application over and over, whether it is a report or some kind of quantitative analysis,” he said.

Professions that rely on creative thinking enjoy some protection (Mr. Ford’s older son is a graduate student studying biomedical engineering). So do jobs emphasizing empathy and interpersonal communication (his younger son wants to be a psychologist).

Even so, the ability to think creatively may not provide ultimate salvation. Mr. Ford said he was alarmed in May when Google’s AlphaGo software defeated a 19-year-old Chinese master at Go, considered the world’s most complicated board game.

“If you talk to the best Go players, even they can’t explain what they’re doing,” Mr. Ford said. “They’ll describe it as a ‘feeling.’ It’s moving into the realm of intuition. And yet a computer was able to prove that it can beat anyone in the world.”

Looking for a silver lining, I spent an afternoon Googling TED Talks with catchy titles like “Are Droids Taking Our Jobs?”

Photo: “Rise of the Robots,” by Martin Ford.

In one, Albert Wenger, an influential tech investor, promoted the Basic Income Guarantee concept. Also known as Universal Basic Income, this sunny concept holds that a robot-driven economy may someday produce an unlimited bounty of cool stuff while simultaneously releasing us from the drudgery of old-fashioned labor, leaving our government-funded children to enjoy bountiful lives of leisure as interpretive dancers or practitioners of bee-sting therapy, as touted by Gwyneth Paltrow.

The idea is all the rage among Silicon Valley elites, who not only understand technology’s power, but who also love to believe that it will be used for good. In their vision of a post-A.I. world without traditional jobs, everyone will receive a minimum weekly or monthly stipend (welfare for all, basically).

Another talk by David Autor, an economist, argued that reports of the death of work are greatly exaggerated. Almost 50 years after the introduction of the A.T.M., for instance, more humans actually work as bank tellers than ever. The computers simply freed the humans from mind-numbing work like counting out 20-dollar bills to focus on more cognitively demanding tasks like “forging relationships with customers, solving problems and introducing them to new products like credit cards, loans and investments,” he said.

Computers, after all, are really good at some things and, for the moment, terrible at others. Even Anton intuits this. The other day I asked him if he thought robots were smarter or dumber than humans. “Sdumber,” he said after a long pause. Confused, I pushed him. “Smarter and dumber,” he explained with a cheeky smile.

He was joking. But he also happened to be right, according to Andrew McAfee, a management theorist at the Massachusetts Institute of Technology whom I interviewed a short while later.

Discussing another of Anton’s career aspirations — songwriter — Dr. McAfee said that computers were already smart enough to come up with a better melody than a lot of humans. “The things our ears find pleasant, we know the rules for that stuff,” he said. “However, I’m going to be really surprised when there is a digital lyricist out there, somebody who can put words to that music that will actually resonate with people and make them think something about the human condition.”

Not everyone, of course, is cut out to be a cyborg-Springsteen. I asked Dr. McAfee what other jobs may exist a decade from now.

“I think health coaches are going to be a big industry of the future,” he said. “Restaurants that have a very good hospitality staff are not about to go away, even though we have more options to order via tablet.

“People who are interested in working with their hands, they’re going to be fine,” he said. “The robot plumber is a long, long way away.”

CAN ROBOTS HELP GET MORE GIRLS INTO SCIENCE AND TECH?

Wired

By Matt Simon

HERE’S A DEPRESSING number for you: 12. Just 12 percent of engineers in the United States are women. In computing it’s a bit better: women make up 26 percent of the workforce—but that number has actually fallen from 35 percent in 1990.

The United States has a serious problem with getting women into STEM jobs and keeping them there. Silicon Valley and other employers bear the most responsibility for that: Discrimination, both overt and subtle, works to keep women out of the workforce. But this society of ours also perpetuates gender stereotypes, which parents pass on to their kids. Like the one that says boys enjoy building things more than girls.

There’s no single solution to such a daunting problem, but here’s an unlikely one: robots. Not robots enforcing diversity in the workplace, not robots doing all the work and obviating the concept of gender entirely, but robots getting more girls interested in STEM. Specifically, robot kits for kids—simple yet powerful toys for teaching youngsters how to engineer and code.


Plenty of toys are targeted at getting kids interested in science and engineering, and many these days are gender specific. Roominate, for instance, is a building kit tailored for girls, while the Boolean Box teaches girls to code. “Sometimes there’s this idea that girls need special Legos, or it needs to be pink and purple for girls to get into it, and sometimes that rubs me the wrong way,” says Amanda Sullivan, who works in human development at Tufts University. “If the pink and purple colored tools is what’s going to engage that girl, then that’s great. But I think in general it would be great if there were more tools and books and things that were out there for all children.”

So Sullivan decided to test the effects of a specifically non-gendered robotics kit called Kibo. Kids program the rolling robot by stringing together blocks that denote specific commands. It isn’t marketed specifically to boys or girls using stereotypical markings of maleness or femaleness. It’s a blank slate.
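The "stringing together blocks" model that Kibo uses can be pictured in a few lines of code. This is a toy sketch, not Kibo's actual command set or API: each block is just a named command, and a program is simply a list of them, executed in order.

```python
def run_program(blocks):
    """Interpret a sequence of command blocks, tracking a robot's
    position on a grid and the direction it faces.
    (Hypothetical block names; Kibo's real blocks differ.)"""
    x, y = 0, 0
    dx, dy = 1, 0  # start facing "east"
    for block in blocks:
        if block == "FORWARD":
            x, y = x + dx, y + dy
        elif block == "TURN_LEFT":
            dx, dy = -dy, dx   # rotate heading 90 degrees counterclockwise
        elif block == "TURN_RIGHT":
            dx, dy = dy, -dx   # rotate heading 90 degrees clockwise
        else:
            raise ValueError(f"unknown block: {block}")
    return x, y

# "Stringing together blocks" is just building a sequence:
program = ["FORWARD", "FORWARD", "TURN_LEFT", "FORWARD"]
print(run_program(program))  # → (2, 1)
```

The point of the physical blocks is that a four-year-old can build exactly this kind of list with her hands, long before she can type it.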

Before playing with Kibo, boys were significantly more likely to say they’d enjoy being an engineer than the girls did. But after, boys had about the same opinion, while girls were now equally as likely to express an engineering interest as the boys. (In a control group that did not play with Kibo, girls’ opinions did not significantly change.) “I think that robots in general are novel to young children, both boys and girls,” Sullivan says. “So aside from engaging girls specifically, I think robotics kits like Kibo bring an air of excitement and something new to the classroom that gets kids psyched and excited about learning.”

There’s a problem, though. While Sullivan’s research shows that a gender-neutral robotics kit can get girls interested in engineering, that doesn’t mean it will sell. “If you look at sales data, it clearly shows that they’re not being used by girls,” says Sharmi Albrechtsen, CEO and co-founder of SmartGurlz, which makes a programmable doll on a self-balancing scooter. “Even the ones that are considered gender-neutral, if you look at the sales data it clearly shows a bias, and it’s towards boys. That’s the reality of the situation.” Gender sells—at least when it’s the parents doing the buying.

Regardless, companies are designing a new generation of toys in deliberate ways. Take Wonder Workshop and its non-gendered robots Dash and Cue. As they were prototyping, they’d test their designs with boys and girls. “One of the things we heard a lot from girls was this isn’t quite their toy,” says Vikas Gupta, co-founder and CEO of Wonder Workshop. “This is probably what their brother would play with.”

Why? Because they thought it looked like a car or truck. So the team covered up the wheels. “And all of a sudden girls wanted to play with it,” Gupta says. “Our takeaway from that in a big way was that every child brings their preconceived notions to play. So when they see something they map it back to something they’ve already seen.” Though not always. “What we do find actually, funnily enough,” says Albrechtsen of the SmartGurlz scooter doll, “is that a lot of boys actually end up edging in and wanting to play. So we have a lot of brothers who are also playing with the product.”

Whatever gets a child interested, it’s on parents and educators to make sure the spark stays alive. And maybe it’s the increasingly sophisticated, increasingly awesome, and increasingly inexpensive robots that can begin to transform the way America gets girls into science and tech. Short of becoming self-aware and taking over the world, the machines certainly couldn’t hurt.

Learning to Think Like a Computer

Photo: A kindergartner organizes blocks into a sequence of commands at the Eliot-Pearson Children’s School at Tufts University. (Charlie Mahoney for The New York Times)

In “The Beauty and Joy of Computing,” the course he helped conceive for nonmajors at the University of California, Berkeley, Daniel Garcia explains an all-important concept in computer science — abstraction — in terms of milkshakes.

“There is a reason when you go to the ‘Joy of Cooking’ and you want to make a strawberry milkshake, you don’t look under ‘strawberry milkshake,’ ” he said. Rather, there is a recipe for milkshakes that instructs you to add ice cream, milk and fruit of your choice. While earlier cookbooks may have had separate recipes for strawberry milkshakes, raspberry milkshakes and boysenberry milkshakes, eventually, he imagines, someone said, “Why don’t we collapse that into one milkshake recipe?”

“The idea of abstraction,” he said, “is to hide the details.” It requires recognizing patterns and distilling complexity into a precise, clear summary. It’s like the countdown to a space launch that runs through a checklist — life support, fuel, payload — in which each check represents perhaps 100 checks that have been performed.

Concealing layers of information makes it possible to get at the intersections of things, improving aspects of a complicated system without understanding and grappling with each part. Abstraction allows advances without redesigning from scratch.
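Garcia's milkshake analogy translates directly into code. In this minimal sketch (the function name and ingredient list are illustrative, not from the article), the shared steps are hidden inside one general recipe, and the only detail exposed is the one that varies: the fruit.

```python
def milkshake(fruit):
    """One general recipe; the fruit is a parameter, not a new recipe.
    The shared ingredients are hidden inside the abstraction."""
    return ["ice cream", "milk", fruit]

# Each specific shake is now a call, not a hand-written variant:
strawberry = milkshake("strawberry")
boysenberry = milkshake("boysenberry")
print(strawberry)   # → ['ice cream', 'milk', 'strawberry']
```

Collapsing three separate recipes into one function is exactly the move Garcia describes: recognize the pattern, hide the repeated details, and expose only what differs.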

It is a cool and useful idea that, along with other cool and useful computer science ideas, has people itching to know more. It’s obvious that computers have become indispensable problem-solving partners, not to mention personal companions. But it’s suddenly not enough to be a fluent user of software interfaces. Understanding what lies behind the computer’s seeming magic now seems crucial. In particular, “computational thinking” is captivating educators, from kindergarten teachers to college professors, offering a new language and orientation to tackle problems in other areas of life.

This promise — as well as a job market hungry for coding — has fed enrollments in classes like the one at Berkeley, taken by 500 students a year. Since 2011, the number of computer science majors has more than doubled, according to the Computing Research Association. At Stanford, Princeton and Tufts, computer science is now the most popular major. More striking, though, is the appeal among nonmajors. Between 2005 and 2015, enrollment of nonmajors in introductory, mid- and upper-level computer science courses grew by 177 percent, 251 percent and 143 percent, respectively.

In the fall, the College Board introduced a new Advanced Placement course, Computer Science Principles, focused not on learning to code but on using code to solve problems. And WGBH, the PBS station in Boston, is using National Science Foundation money to help develop a program for 3- to 5-year-olds in which four cartoon monkeys get into scrapes and then “get out of the messes by applying computational thinking,” said Marisa Wolsky, executive producer of children’s media. “We see it as a groundbreaking curriculum that is not being done yet.”

Computational thinking is not new. Seymour Papert, a pioneer in artificial intelligence and an M.I.T. professor, used the term in 1980 to envision how children could use computers to learn. But Jeannette M. Wing, in charge of basic research at Microsoft and a former professor at Carnegie Mellon, gets credit for making it fashionable. In 2006, on the heels of the dot-com bust and plunging computer science enrollments, Dr. Wing wrote a trade journal piece, “Computational Thinking.” It was intended as a salve for a struggling field.

“Things were so bad that some universities were thinking of closing down computer science departments,” she recalled. Some now consider her article a manifesto for embracing a computing mind-set.

Like any big idea, there is disagreement about computational thinking — its broad usefulness as well as what fits in the circle. Skills typically include recognizing patterns and sequences, creating algorithms, devising tests for finding and fixing errors, reducing the general to the precise and expanding the precise to the general.
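
One skill in that list, devising tests for finding and fixing errors, is concrete enough to sketch in a few lines of Python. The function and its checks below are invented for illustration:

```python
# A small algorithm plus the checks that would catch a mistake in it.
def count_evens(numbers):
    total = 0
    for n in numbers:
        if n % 2 == 0:   # a buggy draft might write n % 2 == 1 here
            total += 1
    return total

# Tests run the code on cases with known answers, including edge cases.
assert count_evens([]) == 0            # empty input
assert count_evens([1, 2, 3, 4]) == 2  # mixed input
print("all checks passed")
```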

It requires reframing research, said Shriram Krishnamurthi, a computer science professor at Brown, so that “instead of formulating a question to a human being, I formulate a question to a data set.” For example, instead of asking if the media is biased toward liberals, pose the question as: Are liberals identified as liberal in major newspapers more often or less often than conservatives are identified as conservative?

Dr. Krishnamurthi helped create “Introduction to Computation for the Humanities and Social Sciences” more than a decade ago because he wanted students “early in their undergrad careers to learn a new mode of thinking that they could take back to their discipline.” Capped at 20 students, the course now has a waitlist of more than 100.

Just as Charles Darwin’s theory of evolution is drafted to explain politics and business, Dr. Wing argued for broad use of computer ideas. And not just for work. Applying computational thinking, “we can improve the efficiencies of our daily lives,” she said in an interview, “and make ourselves a little less stressed out.”

Photo: In his computer science course for nonmajors at the University of California, Berkeley, Dan Garcia wants students to understand why computers are “not magical.” In this exercise, students sort a deck of shuffled cards into ordered suits while being timed. They sort solo, then in pairs, then fours, then eights. But more people don’t always make it go faster. Amdahl’s law offers an equation to show that even with many computers tackling a problem, the time to complete the task does not decrease linearly. There are bottlenecks. Credit: Jim Wilson/The New York Times
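
The caption’s reference to Amdahl’s law can be made concrete. A Python sketch of the standard formula, where `parallel_fraction` is the share of a task that can be split among workers (the card-sorting numbers here are illustrative):

```python
def amdahl_speedup(parallel_fraction, workers):
    """Overall speedup when only part of a task can be parallelized."""
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / workers)

# Even with 8 card-sorters, a task that is only 80% parallelizable
# speeds up far short of 8x: the serial part is the bottleneck.
print(round(amdahl_speedup(0.8, 8), 2))  # 3.33
```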

Computing practices like reformulating tough problems into ones we know how to solve, seeing trade-offs between time and space, and pipelining (allowing the next action in line to begin before the first completes the sequence) have many applications, she said.
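
Pipelining, one of the practices Dr. Wing names, can be sketched with Python generators: each stage hands items downstream as soon as they are ready, instead of finishing the whole batch first. The wash/dry stages are an invented illustration:

```python
# Two pipeline stages; `yield` lets the second begin before the first
# has processed every item.
def wash(dishes):
    for d in dishes:
        yield f"washed {d}"

def dry(dishes):
    for d in dishes:
        yield f"dried {d.split(' ', 1)[1]}"

for item in dry(wash(["plate", "cup"])):
    print(item)
# dried plate
# dried cup
```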

Consider the buffet line. “When you go to a lunch buffet, you see the forks and knives are the first station,” she said. “I find that very annoying. They should be last. You shouldn’t have to balance your plate while you have your fork and knife.” Dr. Wing, who equates a child filling her backpack to caching (how computers retrieve and store information needed later), sees the buffet’s inefficiency as a failure to apply logical thinking and sequencing.
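
Dr. Wing’s backpack analogy for caching has a standard one-line counterpart in Python: `functools.lru_cache` stores results that will be needed again so they are not recomputed.

```python
from functools import lru_cache

# Caching: keep results you will need later, like a packed backpack.
@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040 -- fast, because repeated subproblems are cached
```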

Computational thinking, she said, can aid a basic task like planning a trip — breaking it into booking flights, hotels, car rental — or be used “for something as complicated as health care or policy decision-making.” Identifying subproblems and describing their relationship to the larger problem allows for targeted work. “Once you have well-defined interfaces,” she said, “you can ignore the complexity of the rest of the problem.”

Can computational thinking make us better at work and life? Dr. Krishnamurthi is sometimes seduced. “Before I go grocery shopping, I sort my list by aisles in the store,” he said. Sharing the list on the app Trello, his family can “bucket sort” items by aisle (pasta and oils, canned goods, then baking and spices), optimizing their path through Whole Foods. It limits backtracking and reduces spontaneous, “i.e., junk,” purchases, he said.
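
Dr. Krishnamurthi’s aisle-ordered grocery list is a textbook bucket sort. A Python sketch, with an invented aisle map standing in for a real store layout:

```python
from collections import defaultdict

# Hypothetical item-to-aisle map; real store layouts differ.
AISLE = {"penne": "pasta and oils", "olive oil": "pasta and oils",
         "beans": "canned goods", "flour": "baking and spices"}
AISLE_ORDER = ["pasta and oils", "canned goods", "baking and spices"]

def sort_by_aisle(items):
    buckets = defaultdict(list)
    for item in items:
        buckets[AISLE[item]].append(item)   # drop each item in its bucket
    # Walk the store front to back, emptying each bucket in turn.
    return [i for aisle in AISLE_ORDER for i in buckets[aisle]]

print(sort_by_aisle(["flour", "penne", "beans", "olive oil"]))
# ['penne', 'olive oil', 'beans', 'flour']
```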

Despite his chosen field, Dr. Krishnamurthi worries about the current cultural tendency to view computer science knowledge as supreme, better than that gained in other fields. Right now, he said, “we are just overly intoxicated with computer science.”

It is certainly worth wondering if some applications of computational thinking are trivial, unnecessary or a Stepford Wife-like abdication of devilishly random judgment.

Alexander Torres, a senior majoring in English at Stanford, has noted how the campus’s proximity to Google has lured all but the rare student to computer science courses. He’s a holdout. But “I don’t see myself as having skills missing,” he said. In earning his degree he has practiced critical thinking, problem solving, analysis and making logical arguments. “When you are analyzing a Dickinson or Whitman or Melville, you have to unpack that language and synthesize it back.”

There is no reliable research showing that computing makes one more creative or more able to problem-solve. It won’t make you better at something unless that something is explicitly taught, said Mark Guzdial, a professor in the School of Interactive Computing at Georgia Tech who studies computing in education. “You can’t prove a negative,” he said, but in decades of research no one has found that skills automatically transfer.

Still, he added, for the same reasons people should understand biology, chemistry or physics, “it makes a lot of sense to understand computing in our lives.” Increasing numbers of people must program in their jobs, even if it’s just Microsoft Excel. “Solving problems with computers happens to all of us every day,” he said. How to make the skills available broadly is “an interesting challenge.”

“It’s like being a diplomat and learning Spanish; I feel like it’s essential,” said Greer Brigham, a Brown freshman who plans to major in political science. He’s taking the course designed by Dr. Krishnamurthi, which this term is being taught by a graduate student in robotics named Stephen Brawner.

On a March morning at the Brown computer science center, Mr. Brawner projected a student’s homework assignment on the screen. Did anyone notice a problem? Nary a humanities hand was raised. Finally, a young woman suggested “centimeters” and “kilograms” could be abbreviated. Fine, but not enough.

Mr. Brawner broke the silence and pointed out long lines of code reaching the far side of the screen. With a practiced flurry, he inserted backslashes and hit “return” repeatedly, which drew the symbols into a neat block. It may all be directions to a machine, but computer scientists care a great deal about visual elegance. As Mr. Brawner cut out repeated instructions, he shared that “whenever we define a constant, we want that at the top of our code.” He then explained the new assignment: write a program to play “rock, paper, scissors” against a computer.
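
The article doesn’t show a solution to that assignment, but a minimal rock-paper-scissors program against a computer opponent might look like this in Python (names and structure are one possible sketch, not the course’s):

```python
import random

# Constants at the top of the code, as Mr. Brawner advises.
# What each move defeats; e.g., rock beats scissors.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def judge(player, computer):
    """Return who wins a single round."""
    if player == computer:
        return "tie"
    return "player" if BEATS[player] == computer else "computer"

computer_move = random.choice(list(BEATS))
print("computer played", computer_move, "->", judge("rock", computer_move))
```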

Mili Mitra, a junior majoring in public policy and economics who sat with a MacBook on her lap, would not have considered this class a year ago. But seeing group research projects always being handed off to someone with computing knowledge, she decided that she “didn’t want to keep passing them along.” She has learned to write basic code and fetch data sets through the internet to analyze things she’s interested in, such as how geographic proximity shapes voting patterns in the United Nations General Assembly.

Despite finding interactions with a computer much like “explaining things to a toddler,” Ms. Mitra credits the class for instilling the habit of “going step by step and building a solution.” She admits to being an impatient learner: “I jump ahead. In C.S. you don’t have a choice. If you miss a step, you mess up everything.”

Just as children are drilled on the scientific method — turn observations into a hypothesis, design a control group, do an experiment to test your theory — the basics of working with computers are being cast as a teachable blueprint. One thing making this possible is that communicating with computers has become easier.

“Block” programming languages like Scratch, released by the M.I.T. Media Lab a decade ago, hide text strings that look like computer keys run amok. That makes coding look less scary. Instead of keyboard letters and symbols, you might select from a menu and drag a color-coded block that says “say ( ) for ( ) secs” or “play note ( ) for ( ) beats.” The colors and shapes correspond to categories like “sound” or “motion”; the blocks can be fit together like stacked puzzle pieces to order instructions. Students use this to, say, design a game.

One need not be a digital Dr. Dolittle, fluent in hard-core programming languages like Java or Python, to code. Block languages cut out the need to memorize commands, which vary depending on the computer language, because the block “is read just the way you think about it,” Dr. Garcia said. Students in his Berkeley course use the block language Snap! for assignments — he doesn’t teach Python until the last two weeks, and then just so they can take higher-level courses. “We tell them, ‘You already know how to program,’ ” he said, because the steps are the same.

Computer Science A, which teaches Java, is the fastest-growing Advanced Placement course. (The number of students taking the exam in 2016 rose 18 percent over 2015 and nearly tripled in a decade.) But professors complained that “Java was not the right way” to attract a diverse group of students, said Trevor Packer, head of the A.P. program, so a new course was developed.

The course, Computer Science Principles, is modeled on college versions for nonmajors. It lets teachers pick any coding language and has a gentler vibe. There is an exam, but students also submit projects “more similar to a studio art portfolio,” Mr. Packer said. The course covers working with data and understanding the internet and cybersecurity, and it teaches “transferable skills,” he said, like formulating precise questions. That’s a departure from what the College Board found in many high schools: “They were learning how to keyboard, how to use Microsoft applications.” The goal is that the new course will be offered in every high school in the country.

President Obama’s “Computer Science for All” initiative, officially launched last year, resulted in educators, lawmakers and computer science advocates spreading the gospel of coding. It also nudged more states to count computer science toward high school graduation requirements. Thirty-two states and the District of Columbia now do, up from 12 in 2013, according to Code.org. It’s what Dr. Wing had hoped for when she advocated in her 2006 article that, along with reading, writing and arithmetic, “we should add computational thinking to every child’s analytical ability.”

Photo: After arranging blocks into a sequence of commands, kindergartners at the Eliot-Pearson Children’s School scanned the bar codes on the blocks into their yellow robot. It obeyed their commands. Credit: Charlie Mahoney for The New York Times

In an airy kindergarten classroom at Eliot-Pearson Children’s School, in the Tufts University Department of Child Study and Human Development, children program with actual blocks. Marina Umaschi Bers, a child development and computer science professor, created wooden blocks that bear bar codes with instructions such as “forward,” “spin” and “shake” that are used to program robots — small, wheeled carts with built-in scanners — by sequencing the blocks, then scanning them. Each “program” starts with a green “begin” block and finishes with a red “end.”

Coding for the youngest students has become the trendy pedagogy, with plentiful toys and apps like Dr. Bers’s blocks. Dr. Bers, who with M.I.T. collaborators developed the block language ScratchJr, is evangelical about coding. Learning the language of machines, she said, is as basic as writing is to proficiency in a foreign language. “You are able to write a love poem, you are able to write a birthday card, you are able to use language in many expressive ways,” she said. “You are not just reading; you are producing.”

Peer-reviewed studies by Dr. Bers show that after programming the robots, youngsters are better at sequencing picture stories. Anecdotally, she said, when they ask children to list steps for brushing teeth, they get just a few, “but after being exposed to this work, they’ll have 15 or 20 steps.”

Dr. Bers embeds computing in activities familiar to young children like inventing stories, doing dances and making art. At the Tufts school on a recent morning, children puzzled over a question: How does a robot celebrate spring?

“He’s going to dance, and then he will pretend that he is wet,” offered Hallel Cohen-Goldberg, a kindergartner with a mane of curls.

Solina Gonzalez, coloring a brown, blue and red circle with markers, peered soberly through pink-framed glasses: “He just does a lollipop dance.” Solina’s partner, Oisin Stephens, fretted about the root beer lollipop drawing she had taped to a block. “The robot won’t be able to read this,” he said. (It’s an invalid input.)

As they lurched around the carpet on their knees, the children executed computer science concepts like breaking instructions into sequenced commands, testing and debugging. One team used “repeat” and “stop repeat” blocks, forming a programming “loop,” a sequence of instructions that is continually repeated until a certain condition is reached.
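
The children’s “repeat”/“stop repeat” blocks map onto an ordinary while loop. A Python sketch, with an invented stopping condition of four repeats:

```python
# The "repeat ... stop repeat" blocks behave like this loop: the dance
# steps run again and again until the stopping condition is reached.
moves = []
steps_taken = 0
while steps_taken < 4:          # condition checked before each repeat
    moves.append("forward, spin, shake")
    steps_taken += 1
print(len(moves))  # 4
```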

To Write Better Code, Read Virginia Woolf

Mountain View, Calif. — THE humanities are kaput. Sorry, liberal arts cap-and-gowners. You blew it. In a software-run world, what’s wanted are more engineers.

At least, so goes the argument in a rising number of states, which have embraced a funding model for higher education that uses tuition “bonuses” to favor hard-skilled degrees like computer science over the humanities. The trend is backed by countless think pieces. “Macbeth does not make my priority list,” wrote Vinod Khosla, a co-founder of Sun Microsystems and the author of a widely shared blog post titled “Is Majoring in Liberal Arts a Mistake for Students?” (Subtitle: “Critical Thinking and the Scientific Process First — Humanities Later”).

The technologist’s argument begins with a suspicion that the liberal arts are of dubious academic rigor, suited mostly to dreamers. From there it proceeds to a reminder: Software powers the world, ergo, the only rational education is one built on STEM. Finally, lest he be accused of making a pyre of the canon, the technologist grants that yes, after students have finished their engineering degrees and found jobs, they should pick up a book — history, poetry, whatever.

As a liberal-arts major who went on to a career in software, I can only scratch my head.

Fresh out of college in 1993, I signed on with a large technology consultancy. The firm’s idea was that by hiring a certain lunatic fringe of humanities majors, it might cut down on engineering groupthink. After a six-week programming boot camp, we were pitched headfirst into the deep end of software development.

My first project could hardly have been worse. We (mostly engineers, with a spritzing of humanities majors) were attached to an enormous cellular carrier. Our assignment was to rewrite its rating and billing system — a thing that rivaled maritime law in its complexity.

I was assigned to a team charged with one of the hairier programs in the system, which concerned the movement of individual mobile subscribers from one “parent” account plan to another. Each one of these moves caused an avalanche of plan activations and terminations, carry-overs or forfeitures of accumulated talk minutes, and umpteen other causal conditionals that would affect the subscriber’s bill.

This program, thousands of lines of code long and growing by the hour, was passed around our team like an exquisite corpse. The subscribers and their parent accounts were rendered on our screens as a series of S’s and A’s. After we stared at these figures for weeks, they began to infect our dreams. (One I still remember. I was a baby in a vast crib. Just overhead, turning slowly and radiating malice, was an enormous iron mobile whose arms strained under the weight of certain capital letters.)

Our first big break came from a music major. A pianist, I think, who joined our team several months into the project. Within a matter of weeks, she had hit upon a method to make the S’s hold on to the correct attributes even when their parent A was changed.

We had been paralyzed. The minute we tweaked one bit of logic, we realized we’d fouled up another. But our music major moved freely. Instead of freezing up over the logical permutations behind each A and S, she found that these symbols put her in the mind of musical notes. As notes, they could be made to work in concert. They could be orchestrated.

On a subsequent project, our problem was pointers. In programming language, a pointer is an object that refers to some master value stored elsewhere. This might sound straightforward, but pointers are like ghosts in the system. A single misdirected one can crash a program. Our pointer wizard was a philosophy major who had no trouble at all with the idea of a named “thing” being a transient stand-in for some other unseen Thing. For a Plato man, this was mother’s milk.
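
The column’s description of pointers can be illustrated even in Python, which has no raw pointers but whose names are exactly such stand-ins for values stored elsewhere. The account dictionary below is invented for illustration:

```python
# Every Python name is a reference to an object stored elsewhere --
# a named stand-in for the "unseen Thing."
master = {"balance": 100}
alias = master            # 'alias' refers to the same object, not a copy

alias["balance"] -= 25    # mutate through the stand-in...
print(master["balance"])  # ...and the master value changes too: 75

# A "misdirected pointer": rebind the name and the link is gone.
alias = {"balance": 0}
print(master["balance"])  # still 75; 'alias' no longer refers to it
```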

I’ve worked in software for years and, time and again, I’ve seen someone apply the arts to solve a problem of systems. The reason for this is simple. As a practice, software development is far more creative than algorithmic.

The developer stands before her source code editor in the same way the author confronts the blank page. There’s an idea for what is to be created, and the (daunting) knowledge that there are a billion possible ways to go about it. To proceed, each relies on one part training to three parts creative intuition. They may also share a healthy impatience for the ways things “have always been done” and a generative desire to break conventions. When the module is finished or the pages complete, their quality is judged against many of the same standards: elegance, concision, cohesion; the discovery of symmetries where none were seen to exist. Yes, even beauty.

To be sure, each craft also requires a command of the language and its rules of syntax. But these are only starting points. To say that more good developers will be produced by swapping the arts for engineering is like saying that to produce great writers, we should double down on sentence diagramming.

Here the technologists may cry foul, say I’m misrepresenting the argument, that they’re not calling to avoid the humanities altogether, but only to replace them in undergraduate study. “Let college be for science and engineering, with the humanities later.” In tech speak, this is an argument for the humanities as plug-in.

But if anything can be treated as a plug-in, it’s learning how to code. It took me 18 months to become proficient as a developer. This isn’t to pretend software development is easy — those were long months, and I never touched the heights of my truly gifted peers. But in my experience, programming lends itself to concentrated self-study in a way that, say, “To the Lighthouse” or “Notes Toward a Supreme Fiction” do not. To learn how to write code, you need a few good books. To enter the mind of an artist, you need a human guide.

For folks like Mr. Khosla, such an approach is dangerous: “If subjects like history and literature are focused on too early, it is easy for someone not to learn to think for themselves and not to question assumptions, conclusions, and expert philosophies.” (Where some of these kill-the-humanities pieces are concerned, the strongest case for the liberal arts is made just in trying to read them.)

How much better is the view of another Silicon Valley figure, who argued that “technology alone is not enough — it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing.”

His name? Steve Jobs.

Why Kids Should Make the Video Games They Love to Play

Mindshift

Screen grab of a coin collecting game created by a middle school student using Gamestar Mechanic.

When educator Lynn Koresh hears from kids that they want a career doing something with computers, she asks, “To do what with computers?”

Adults often encourage kids to pursue science, technology, engineering and math (STEM) skills, and computing classes are usually a first stop. But Koresh knows it’s the real-world applications of computational thinking and coding language skills that bring such knowledge to life.

She reasoned that most middle school students are already playing video games and might respond well to a unit on how to design, create, test and promote video games. Along the way, she’s also teaching them about digital citizenship and entrepreneurship.

“I wanted to give kids exposure to what it means to have a career using computers,” said Koresh, technology coordinator at Edgewood Campus School in Madison, Wisconsin.

She gave students the task of designing a game using Gamestar Mechanic, a Web tool that helps kids create games. Before any programming begins, students talk about their games, set objectives and start storyboarding on paper. They think about the game’s avatars and how the game mechanics will work. Koresh shared her experience teaching this class at the Games Learning Society conference in Madison.

As students develop their games, they test them on one another throughout the semester. Koresh has found kids often give short and positive feedback, making it challenging to learn enough to improve the game. She says the kids respond this way mostly because they’re concerned for their friends and worry that they’ll get a bad grade, even though that’s not the case.

“You have to get specific enough so they don’t say, ‘It’s good, I liked it.’ You have to force them to take a stand.”

To help improve the process, she has reframed the questions around student game critiques in a consumer-oriented way, such as, “Would you pay 99 cents for this app? Would you give it three stars or four stars?”

To help them become more critical thinkers, the students read product reviews on blogs and business sites to learn about features that might improve the user experience. In the process, Koresh hopes the kids learn to be selective digital consumers and do research before making purchases or trusting a source.

It’s also an opportunity to talk about a person’s digital footprint and the types of comments, images and videos that can come back to haunt someone.

“If you put it online, it should be worthy of other students, grandma, everyone seeing it,” said Koresh.

Once the games are completed, the middle school students have three seconds to pitch their game to fourth-grade players in the form of a slide on a computer screen. Since time to persuade the audience is limited, much like in real life, game designers have to “sell” their game with one compelling slide. Students have to be selective about which elements of the game to highlight. Creating the slide is also an opportunity to talk about marketing.

“It’s great you’ve made something, but how do you get other people to use it?” Koresh asks her students. They get a good idea about how well their ad has worked based on the number of plays their games receive.

As for whether parents object to kids spending more time on video games, she says they have been supportive of STEM activities and pre-coding skills learned in game design. Koresh has found the time students spent on the games, both inside and outside class, has helped them think about coding as an extracurricular activity. Girls who have created games in her class have gone on to enter STEM design competitions.

Here are some of the ads Koresh’s students created that link to their games:

Dive, Dive, Dive (a coin-collecting game)

Those Mondays

Plague Dusters