In the era of artificial intelligence and robotics, higher education is arguably more important than ever. Academic researchers are producing the ideas behind one breakthrough technology after another. At the same time, higher education faces a challenge: how to produce graduates whose careers won’t be derailed by those same advances. Now that robots can pick stocks, this isn’t just about factory jobs, but about the positions that college graduates have long assumed were theirs.
Northeastern University is involved in both sides of that equation. Its academic programs in engineering, computer science and other fields are producing these breakthroughs. And its students — at an institution known for close ties to employers — of course want good careers. Joseph E. Aoun, Northeastern’s president, explores these issues in Robot-Proof: Higher Education in the Age of Artificial Intelligence (MIT Press). Aoun is a scholar of linguistics when he’s not focused on university administration. His book argues that changes in the college curriculum are needed to prepare students for this new era, but that doesn’t mean ignoring the humanities or general education.
Q: How worried should college graduates be about being replaced by technology? Is it likely that many jobs today held by those with college degrees will be replaced by robots or some form of technology?
A: Smart machines are getting smarter, and many of the jobs performed by people today are going to disappear. Some studies predict that half of all U.S. jobs are at risk within the next 20 years. And it’s not just blue-collar jobs; today intelligent machines are picking stocks, doing legal research and even writing news articles. Simply put, if a job can be automated in the future, it will be.
For higher education to meet this challenge — for us to make people robot-proof — we need to change. In my book, I offer a blueprint for how we can accomplish this. We will need to re-envision the curriculum, invest in experiential education and put lifelong learning at the heart of what we do. It will not be easy, but we have a responsibility — to the students of today and tomorrow — to change the way we do business.
Q: In an era of adaptive learning and online learning, should faculty members be worried about their jobs in the future?
A: We’re seeing educational content become commoditized. Therefore, the job of faculty members has to go beyond simply transmitting knowledge. More than ever, the priority for faculty is to create new knowledge and act as the catalysts to make their students robot-proof. The personal connection between student and teacher cannot be replaced by a machine.
But, like students, faculty members must act to meet the challenge of today’s world and should embrace the transformation of higher education that I describe in my book.
Q: What is “humanics,” and what are the three kinds of literacy that you want colleges to teach?
A: Humanics is the curriculum for a robot-proof education. It is based on the purposeful integration of technical literacies, such as coding and data analytics, with uniquely human literacies, such as creativity, entrepreneurship, ethics, cultural agility and the ability to work with others.
The key is integration. We need to break down the academic silos that separate historians from engineers.
When I talk to employers, they tell me that they would give their right arm for more systems thinkers — quarterbacks who can see across disciplines and analyze them in an integrated way. And every student should be culturally agile, able to communicate across boundaries, and to think ethically. By integrating technology, data and humanities, we can help students become robot-proof.
Q: In your vision for the future of higher education, is this about embedding these skills into existing programs or starting from scratch?
A: Higher education has the elements for a robot-proof model, but we need to be much more intentional about how we integrate them. As I’ve mentioned, our curriculum needs to change so that technical and human literacies are unified.
We need to deliver this curriculum in an experiential way. This means recognizing that learning happens beyond the classroom through co-ops and meaningful internships. I truly believe that experiential education is the most powerful way to learn.
Still, no one is going to be set for life. We need to commit to lifelong learning in a way that we haven’t done in the past. Universities have been engaged in lifelong learning for many years, but it is usually treated as a second-class operation. We need to bring lifelong learning to the core of our mission.
This will require us to rethink the way we deliver education, particularly to working professionals who don’t have time to be on campus every day. Online and hybrid delivery modes will be essential. We have to meet learners wherever they are — in their careers and around the world.
Credentials will need to be unbundled so that learners don’t have to commit to long-term degree programs. Stackable certificates, badges and boot camps will become the norm.
These changes won’t happen by themselves. Institutions should establish authentic partnerships with employers, redesign courses to fill the skill gaps employers actually face and connect employers with students through co-ops and internships.
Q: How is Northeastern getting ready for these changes?
A: Northeastern has designed its academic plan to meet the challenges — and opportunities — presented by smart machines. Beyond the curricular changes required by humanics, and our leadership in experiential learning, we are building a multicampus network spanning different cities, regions and countries. Learners will be able to gain access to this network wherever they are and whenever it’s convenient for them.
Throughout its history, higher education has adapted to changes in the world. Knowing what we know about the revolution of smart machines, we have a responsibility to remain relevant and an opportunity to make our learners robot-proof.
In “The Beauty and Joy of Computing,” the course he helped conceive for nonmajors at the University of California, Berkeley, Daniel Garcia explains an all-important concept in computer science — abstraction — in terms of milkshakes.
“There is a reason when you go to the ‘Joy of Cooking’ and you want to make a strawberry milkshake, you don’t look under ‘strawberry milkshake,’ ” he said. Rather, there is a recipe for milkshakes that instructs you to add ice cream, milk and fruit of your choice. While earlier cookbooks may have had separate recipes for strawberry milkshakes, raspberry milkshakes and boysenberry milkshakes, eventually, he imagines, someone said, “Why don’t we collapse that into one milkshake recipe?”
“The idea of abstraction,” he said, “is to hide the details.” It requires recognizing patterns and distilling complexity into a precise, clear summary. It’s like the countdown to a space launch that runs through a checklist — life support, fuel, payload — in which each check represents perhaps 100 checks that have been performed.
Concealing layers of information makes it possible to get at the intersections of things, improving aspects of a complicated system without understanding and grappling with each part. Abstraction allows advances without redesigning from scratch.
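Dr. Garcia’s cookbook story translates directly into code. A minimal sketch (the function name and recipe text are invented for illustration): instead of one recipe per flavor, a single function takes the flavor as a parameter and hides the blending details.

```python
def milkshake(fruit, scoops=2):
    """One general recipe instead of one per flavor.

    The fruit is a parameter; the blending steps are the hidden details.
    """
    return f"Blend {scoops} scoops of ice cream, milk, and {fruit}."

# The caller never sees "how"; it only supplies "what".
print(milkshake("strawberry"))
print(milkshake("boysenberry", scoops=3))
```

Anyone using `milkshake` can ignore its internals entirely — which is exactly the point of abstraction.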
It is a cool and useful idea that, along with other cool and useful computer science ideas, has people itching to know more. It’s obvious that computers have become indispensable problem-solving partners, not to mention personal companions. But it’s suddenly not enough to be a fluent user of software interfaces. Understanding what lies behind the computer’s seeming magic now seems crucial. In particular, “computational thinking” is captivating educators, from kindergarten teachers to college professors, offering a new language and orientation to tackle problems in other areas of life.
This promise — as well as a job market hungry for coding — has fed enrollments in classes like the one at Berkeley, taken by 500 students a year. Since 2011, the number of computer science majors has more than doubled, according to the Computing Research Association. At Stanford, Princeton and Tufts, computer science is now the most popular major. More striking, though, is the appeal among nonmajors. Between 2005 and 2015, enrollment of nonmajors in introductory, mid- and upper-level computer science courses grew by 177 percent, 251 percent and 143 percent, respectively.
In the fall, the College Board introduced a new Advanced Placement course, Computer Science Principles, focused not on learning to code but on using code to solve problems. And WGBH, the PBS station in Boston, is using National Science Foundation money to help develop a program for 3- to 5-year-olds in which four cartoon monkeys get into scrapes and then “get out of the messes by applying computational thinking,” said Marisa Wolsky, executive producer of children’s media. “We see it as a groundbreaking curriculum that is not being done yet.”
Computational thinking is not new. Seymour Papert, a pioneer in artificial intelligence and an M.I.T. professor, used the term in 1980 to envision how children could use computers to learn. But Jeannette M. Wing, in charge of basic research at Microsoft and a former professor at Carnegie Mellon, gets credit for making it fashionable. In 2006, on the heels of the dot-com bust and plunging computer science enrollments, Dr. Wing wrote a trade journal piece, “Computational Thinking.” It was intended as a salve for a struggling field.
“Things were so bad that some universities were thinking of closing down computer science departments,” she recalled. Some now consider her article a manifesto for embracing a computing mind-set.
Like any big idea, there is disagreement about computational thinking — its broad usefulness as well as what fits in the circle. Skills typically include recognizing patterns and sequences, creating algorithms, devising tests for finding and fixing errors, reducing the general to the precise and expanding the precise to the general.
It requires reframing research, said Shriram Krishnamurthi, a computer science professor at Brown, so that “instead of formulating a question to a human being, I formulate a question to a data set.” For example, instead of asking if the media is biased toward liberals, pose the question as: Are liberals identified as liberal in major newspapers more often or less often than conservatives are identified as conservative?
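Dr. Krishnamurthi’s reframing can be made concrete with a small sketch. The records below are made up for illustration: each one pairs a politician’s leaning with whether an article explicitly labeled it. The vague question “is the media biased?” becomes a computable rate.

```python
# Hypothetical data: (politician's leaning, was the leaning labeled in print?)
mentions = [
    ("liberal", True), ("liberal", False), ("liberal", True),
    ("conservative", False), ("conservative", False), ("conservative", True),
]

def labeling_rate(leaning):
    """Fraction of mentions of this leaning that were explicitly labeled."""
    subset = [labeled for who, labeled in mentions if who == leaning]
    return sum(subset) / len(subset)

print(labeling_rate("liberal"))       # share of liberals labeled "liberal"
print(labeling_rate("conservative"))  # share of conservatives labeled
```

Comparing the two rates answers the reformulated question directly, with no appeal to anyone’s impression of bias.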
Dr. Krishnamurthi helped create “Introduction to Computation for the Humanities and Social Sciences” more than a decade ago because he wanted students “early in their undergrad careers to learn a new mode of thinking that they could take back to their discipline.” Capped at 20 students, the course now has a waitlist of more than 100.
Just as Charles Darwin’s theory of evolution is enlisted to explain politics and business, Dr. Wing argued for broad use of computer ideas. And not just for work. Applying computational thinking, “we can improve the efficiencies of our daily lives,” she said in an interview, “and make ourselves a little less stressed out.”
Computing practices like reformulating tough problems into ones we know how to solve, seeing trade-offs between time and space, and pipelining (allowing the next action in line to begin before the first completes the sequence) have many applications, she said.
Consider the buffet line. “When you go to a lunch buffet, you see the forks and knives are the first station,” she said. “I find that very annoying. They should be last. You shouldn’t have to balance your plate while you have your fork and knife.” Dr. Wing, who equates a child filling her backpack to caching (how computers retrieve and store information needed later), sees the buffet’s inefficiency as a failure to apply logical thinking and sequencing.
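The caching idea Dr. Wing describes — store what you will need again rather than recompute it — is a staple of real programs. A minimal sketch using Python’s standard-library memoization decorator, applied to the classic Fibonacci example:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Without the cache this recursion recomputes the same subproblems
    # exponentially often; the cache stores each answer for reuse,
    # like a child packing tomorrow's books in her backpack tonight.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # fast, because every fib(k) is computed exactly once
```

The decorator changes nothing about what `fib` computes — only how often the work is done.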
Computational thinking, she said, can aid a basic task like planning a trip — breaking it into booking flights, hotels, car rental — or be used “for something as complicated as health care or policy decision-making.” Identifying subproblems and describing their relationship to the larger problem allows for targeted work. “Once you have well-defined interfaces,” she said, “you can ignore the complexity of the rest of the problem.”
Can computational thinking make us better at work and life? Dr. Krishnamurthi is sometimes seduced. “Before I go grocery shopping, I sort my list by aisles in the store,” he said. Sharing the list on the app Trello, his family can “bucket sort” items by aisle (pasta and oils, canned goods, then baking and spices), optimizing their path through Whole Foods. It limits backtracking and reduces spontaneous, “i.e., junk,” purchases, he said.
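Dr. Krishnamurthi’s “bucket sort” of the grocery list is easy to sketch. The aisle assignments below are invented; a real store’s layout would supply them. Items are dropped into per-aisle buckets, then read out in aisle order, so the shopper walks the store once.

```python
# Hypothetical aisle map; any real store's layout would differ.
AISLE = {"penne": 1, "olive oil": 1, "chickpeas": 2, "flour": 3, "cumin": 3}

def sort_by_aisle(items):
    buckets = {}
    for item in items:
        # Unknown items go to a catch-all bucket at the end of the trip.
        buckets.setdefault(AISLE.get(item, 99), []).append(item)
    # Walk the aisles in order, emptying each bucket as we pass.
    return [item for aisle in sorted(buckets) for item in buckets[aisle]]

print(sort_by_aisle(["cumin", "penne", "chickpeas", "olive oil"]))
```

The payoff is the same one he describes: no backtracking, and fewer chances to wander past the junk food.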
Despite his chosen field, Dr. Krishnamurthi worries about the current cultural tendency to view computer science knowledge as supreme, better than that gained in other fields. Right now, he said, “we are just overly intoxicated with computer science.”
It is certainly worth wondering if some applications of computational thinking are trivial, unnecessary or a Stepford Wife-like abdication of devilishly random judgment.
Alexander Torres, a senior majoring in English at Stanford, has noted how the campus’s proximity to Google has lured all but the rare student to computer science courses. He’s a holdout. But “I don’t see myself as having skills missing,” he said. In earning his degree he has practiced critical thinking, problem solving, analysis and making logical arguments. “When you are analyzing a Dickinson or Whitman or Melville, you have to unpack that language and synthesize it back.”
There is no reliable research showing that computing makes one more creative or more able to problem-solve. It won’t make you better at something unless that something is explicitly taught, said Mark Guzdial, a professor in the School of Interactive Computing at Georgia Tech who studies computing in education. “You can’t prove a negative,” he said, but in decades of research no one has found that skills automatically transfer.
Still, he added, for the same reasons people should understand biology, chemistry or physics, “it makes a lot of sense to understand computing in our lives.” Increasing numbers of people must program in their jobs, even if it’s just Microsoft Excel. “Solving problems with computers happens to all of us every day,” he said. How to make the skills available broadly is “an interesting challenge.”
“It’s like being a diplomat and learning Spanish; I feel like it’s essential,” said Greer Brigham, a Brown freshman who plans to major in political science. He’s taking the course designed by Dr. Krishnamurthi, which this term is being taught by a graduate student in robotics named Stephen Brawner.
On a March morning at the Brown computer science center, Mr. Brawner projected a student’s homework assignment on the screen. Did anyone notice a problem? Nary a humanities hand was raised. Finally, a young woman suggested “centimeters” and “kilograms” could be abbreviated. Fine, but not enough.
Mr. Brawner broke the silence and pointed out long lines of code reaching the far side of the screen. With a practiced flurry, he inserted backslashes and hit “return” repeatedly, which drew the symbols into a neat block. It may all be directions to a machine, but computer scientists care a great deal about visual elegance. As Mr. Brawner cut out repeated instructions, he shared that “whenever we define a constant, we want that at the top of our code.” He then explained the new assignment: write a program to play “rock, paper, scissors” against a computer.
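A solution to Mr. Brawner’s assignment might look like the sketch below (the function and variable names are my own, not the course’s). It also illustrates his advice about constants: the table of which move beats which sits at the top of the code.

```python
import random

# Constant defined at the top, as Mr. Brawner recommends:
# each move and the move it defeats.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def play(player_move):
    """Play one round against the computer; return its move and the result."""
    computer_move = random.choice(list(BEATS))
    if player_move == computer_move:
        return computer_move, "tie"
    if BEATS[player_move] == computer_move:
        return computer_move, "you win"
    return computer_move, "computer wins"

print(play("rock"))
```

Looking up the winner in `BEATS` replaces the tangle of repeated if-statements a first draft usually has — the same de-duplication Mr. Brawner performed on screen.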
Mili Mitra, a junior majoring in public policy and economics who sat with a MacBook on her lap, would not have considered this class a year ago. But seeing group research projects always being handed off to someone with computing knowledge, she decided that she “didn’t want to keep passing them along.” She has learned to write basic code and fetch data sets through the internet to analyze things she’s interested in, such as how geographic proximity shapes voting patterns in the United Nations General Assembly.
Despite finding interactions with a computer much like “explaining things to a toddler,” Ms. Mitra credits the class for instilling the habit of “going step by step and building a solution.” She admits to being an impatient learner: “I jump ahead. In C.S. you don’t have a choice. If you miss a step, you mess up everything.”
Just as children are drilled on the scientific method — turn observations into a hypothesis, design a control group, do an experiment to test your theory — the basics of working with computers are being cast as a teachable blueprint. One thing making this possible is that communicating with computers has become easier.
“Block” programming languages like Scratch, released by the M.I.T. Media Lab a decade ago, hide text strings that look like computer keys run amok. That makes coding look less scary. Instead of keyboard letters and symbols, you might select from a menu and drag a color-coded block that says “say ( ) for ( ) secs” or “play note ( ) for ( ) beats.” The colors and shapes correspond to categories like “sound” or “motion”; the blocks can be fit together like stacked puzzle pieces to order instructions. Students use this to, say, design a game.
One need not be a digital Dr. Dolittle, fluent in hard-core programming languages like Java or Python, to code. Block languages cut out the need to memorize commands, which vary depending on the computer language, because the block “is read just the way you think about it,” Dr. Garcia said. Students in his Berkeley course use the block language Snap! for assignments — he doesn’t teach Python until the last two weeks, and then just so they can take higher-level courses. “We tell them, ‘You already know how to program,’ ” he said, because the steps are the same.
Computer Science A, which teaches Java, is the fastest-growing Advanced Placement course. (The number of students taking the exam in 2016 rose 18 percent over 2015 and nearly tripled in a decade.) But professors complained that “Java was not the right way” to attract a diverse group of students, said Trevor Packer, head of the A.P. program, so a new course was developed.
The course, Computer Science Principles, is modeled on college versions for nonmajors. It lets teachers pick any coding language and has a gentler vibe. There is an exam, but students also submit projects “more similar to a studio art portfolio,” Mr. Packer said. The course covers working with data and understanding the internet and cyber security, and it teaches “transferable skills,” he said, like formulating precise questions. That’s a departure from what the College Board found in many high schools: “They were learning how to keyboard, how to use Microsoft applications.” The goal is that the new course will be offered in every high school in the country.
President Obama’s “Computer Science for All” initiative, officially launched last year, resulted in educators, lawmakers and computer science advocates spreading the gospel of coding. It also nudged more states to count computer science toward high school graduation requirements. Thirty-two states and the District of Columbia now do, up from 12 in 2013, according to Code.org. It’s what Dr. Wing had hoped for when she advocated in her 2006 article that, along with reading, writing and arithmetic “we should add computational thinking to every child’s analytical ability.”
In an airy kindergarten classroom at Eliot-Pearson Children’s School, in the Tufts University Department of Child Study and Human Development, children program with actual blocks. Marina Umaschi Bers, a child development and computer science professor, created wooden blocks that bear bar codes with instructions such as “forward,” “spin” and “shake” that are used to program robots — small, wheeled carts with built-in scanners — by sequencing the blocks, then scanning them. Each “program” starts with a green “begin” block and finishes with a red “end.”
Coding for the youngest students has become the trendy pedagogy, with plentiful toys and apps like Dr. Bers’s blocks. Dr. Bers, who with M.I.T. collaborators developed the block language ScratchJr, is evangelical about coding. Learning the language of machines, she said, is like achieving full proficiency in a foreign language: the point is to write it, not just read it. “You are able to write a love poem, you are able to write a birthday card, you are able to use language in many expressive ways,” she said. “You are not just reading; you are producing.”
Peer-reviewed studies by Dr. Bers show that after programming the robots, youngsters are better at sequencing picture stories. Anecdotally, she said, when they ask children to list steps for brushing teeth, they get just a few, “but after being exposed to this work, they’ll have 15 or 20 steps.”
Dr. Bers embeds computing in activities familiar to young children like inventing stories, doing dances and making art. At the Tufts school on a recent morning, children puzzled over a question: How does a robot celebrate spring?
“He’s going to dance, and then he will pretend that he is wet,” offered Hallel Cohen-Goldberg, a kindergartner with a mane of curls.
Solina Gonzalez, coloring a brown, blue and red circle with markers, peered soberly through pink-framed glasses: “He just does a lollipop dance.” Solina’s partner, Oisin Stephens, fretted about the root beer lollipop drawing she had taped to a block. “The robot won’t be able to read this,” he said. (It’s an invalid input.)
As they lurched around the carpet on their knees, the children executed computer science concepts like breaking instructions into sequenced commands, testing and debugging. One team used “repeat” and “stop repeat” blocks, forming a programming “loop,” a sequence of instructions that is continually repeated until a certain condition is reached.
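The kindergartners’ “repeat … stop repeat” blocks are the same loop a textual language expresses in a few lines. A toy sketch (the robot, its moves and the stopping condition are invented for illustration):

```python
# "repeat ... stop repeat" as a while-loop: keep executing the block of
# commands until the stopping condition is reached.
celebration = ["spin", "shake"]  # the commands inside the repeat block
steps = []
while len(steps) < 6:            # the "stop repeat" condition
    steps.extend(celebration)    # the repeated block runs again

print(steps)
```

Only the notation differs: the children snap the condition and the body together out of wooden blocks; a programmer types them.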
Today’s job candidates must be able to collaborate, communicate and solve problems – skills developed mainly through social and emotional learning (SEL). Combined with traditional skills, this social and emotional proficiency will equip students to succeed in the evolving digital economy.
What skills will be needed most?
An analysis of 213 studies showed that students who received SEL instruction had achievement scores averaging 11 percentile points higher than those who did not. And SEL potentially leads to long-term benefits such as higher rates of employment and educational attainment.
Leadership and curiosity will also be important for students to develop for their future jobs.
One report asked chief human resources and strategy officers from leading global employers what the current shifts mean, specifically for employment, skills and recruitment across industries and geographies.
Policy-makers, educators, parents, businesses, researchers, technology developers, investors and NGOs can together ensure that development of social and emotional skills becomes a shared goal and competency of education systems everywhere.
The maker space at the Rutgers Livingston campus offers a clubhouse ambience and high-tech tools. Credit Richard Perry/The New York Times
You remember wood shop. You made that swan-shaped planter your parents pretended to like. And then you moved on.
These days, tinkering is a bit more high tech. The blending of technology and craft in tools like 3-D printers and laser cutters has made it possible for ordinary people to make extraordinary things. And many ordinary people, living as they do, more and more in their heads and online, are yearning to do something with their hands.
So the “maker space” movement — D.I.Y. communities to get people creating, be it for fun, for art or for entrepreneurship — is booming. Maker Faires are held around the world. Commercial operations like TechShop have popped up across the country. And tinkering is being promoted on college campuses from M.I.T. to Santa Clara University, as well as in high schools and elementary schools.
There’s even a massive open online course, offered by the MOOC provider Coursera and taught by three scientists from the Exploratorium in San Francisco, called “Tinkering Fundamentals: A Constructionist Approach to STEM Learning.”
A computer-controlled drumstick was made with a 3-D plastic printer and laser cutter, right (that’s a student artwork on the cutter). Credit Richard Perry/The New York Times
Yes, tinkering is now a pedagogy.
Taking things apart and putting them together — skills children used to absorb in Dad’s or Mom’s workshop — has an important role to play in learning, according to Karen Cator, the chief executive of Digital Promise, a nonprofit organization created by Congress that focuses on the use of technology to improve education. “You’re exploring creativity, you’re exploring design thinking, you’re developing a sense of persistence,” she said. Building something new requires planning, trying and, yes, failing, and then trying again.
“These are incredibly important mind-sets for today’s world,” she said.
Ms. Cator, who served in the Department of Education during the first Obama term, talked excitedly about students who have designed child prostheses. “That’s what they’re going to remember their entire life,” she said. “They aren’t going to remember sitting in an electronics lecture.”
At Rutgers, a bustling maker space can be found in a moldering wood-frame structure on the Livingston campus in Piscataway, N.J. The building once served as the command headquarters for Camp Kilmer, a transportation hub for soldiers mobilizing for World War II; today, the building, still called Headquarters, houses computer repair offices and the division of continuing studies. And upstairs, there are wonders.
On any given day, as many as 20 students could be working on the array of equipment that the center offers training on and time to use, said Stephen M. Carter, who directs the university’s Center for Innovation Education and co-founded the New Jersey Makerspace Association in 2012. Students might be working on a class project, doing “something entrepreneurial” or making Halloween costumes, he said. “We support all of it.”
A 3-D plastic printer, top left, used to make various objects, including a robot with a motion-sensor heart. Credit Richard Perry/The New York Times
There are 3-D printers, which can be programmed to create wildly inventive shapes out of plastic or resin (like a decent copy of the Iron Throne from “Game of Thrones” or a bust of Groot from “Guardians of the Galaxy”). There is a laser cutter to etch materials like fabric, marble or wood and cut through plastic. Next door is an electronics shop, with racks upon racks of parts. Close by are drill presses, a router and a key cutter, which Mr. Carter refers to as “our gateway drug,” a piece of equipment neophytes can use to produce something they really need. A common space with couches and a television gives students a place to talk, show off their projects or just hang out.
Mr. Carter cobbled it all together “by hook and crook and grants and saving.”
Students love it. Alexandra Garey, who graduated from Rutgers in May, credits tinkering with changing the course of her studies, and life: “I went from somebody who was majoring in Italian and European studies to someone who was designing and prototyping products and realizing any product that came into my head.”
October of senior year, she wandered into the maker space because she’d heard “you can make cool products” and was interested in exploring entrepreneurship and learning some business skills. “I had no idea what I was doing,” she admitted. But the students who used the place, mostly in science and engineering disciplines, were accommodating and patient, and soon she was on her way.
A month in, she got a call from a friend who wanted help coming up with a tool for children on the autism spectrum — a grip for a pencil or crayon that could be fitted with an extension so the teacher could guide the hand of students who dislike being touched. By January, Ms. Garey had designed and fabricated a piece through 3-D printing and it was being tested in New Jersey classrooms; she later modified the design for stroke victims and people with brain injuries. Now she is working on making French presses and coffee mugs out of Illy cans.
Then there’s Jason Baerg, an M.F.A. student from Canada, who paints in acrylics on paper or wood, and uses the laser cutter to etch the paintings and cut out shapes that he arranges into assemblages. “It allows me to bounce between abstract and figurative spaces in production and presentation,” he said. “I’m liberated.”
He appreciates that this is not a sterile engineering environment. The setting’s funkiness makes it “probably the perfect place to do this work,” he said, “like an exploratory safe space for you to go and try out your ideas.”
That kind of enthusiasm tells Mr. Carter he is on the right track. “U.S. schools are very good at finding the brain-smart people,” he said. “They are also very good at finding the best athletes.” But they are not so good at finding and nurturing people who, he said, describing himself, think with their fingers. The next Steve Jobs and Steve Wozniak, he said, are more likely to emerge from a maker space than a garage. Besides, he said, “it keeps kids off the street.”
My classroom was alive with activity and a palpable sense of purpose as I maneuvered around the scattered knots of students, 10 in all, to get a closer look at their design journal entries. Group discussions, the random clatter of keyboard taps and mouse clicks, and the mechanical beeping and whirring of the 3D printer created a surprising harmony as my seventh-graders put the finishing touches on their latest creation — a prototype of a portable electric lamp.
“Oh no, stop the print!” a student cried out suddenly. “We need to re-measure the handle to make sure Anielka can hold it comfortably. Her hands are kind of small.”
This was the third week of STEAM, a new semester-long elective course offered at St. Gabriel’s Catholic School, a PreK–8 school of 450 students, nestled in the hill country surrounding Austin, Texas. STEAM class — which integrates art with the traditional science, technology, engineering, and math emphasis — takes an inquiry-based learning approach with a focus on design thinking, engineering, and computer science.
In this case, we were in the middle of a collaborative project with eight elementary-age students from NicaPhoto, a nonprofit dedicated to empowering children who live in one of the poorest barrios in Nicaragua.
In this semester-long Global Inventors/3D printing course, students from St. Gabriel’s and NicaPhoto joined forces to co-create a workable solution to a real-world problem: how to safely provide electric light for the Nicaraguan students, including Anielka, who lived in an area without a reliable power grid.
When I heard my student call out to her classmates, I was reminded that this process of persisting, problem solving, and creating something meaningful is the very essence of constructionism, a decades-old philosophy visible in its latest incarnation — the maker movement.
Creating a Nation of Makers
In 2007, the National Academy of Sciences released a report calling for sweeping improvements in K–12 STEM education. But even earlier, an eclectic assortment of tinkerers and hobbyists was already quietly changing the world. This motley group drew upon a variety of influences, including the DIY counterculture aesthetics of the Whole Earth Catalog and the whimsical mix of artistry and computing exhibited by MIT’s Tech Model Railroad Club. The group also capitalized on the open exchange of ideas during the meetings of the Homebrew Computer Club in the late 1970s. Club members included the eventual founders of Apple and Microsoft.
Then, 10 years ago, Maker Media, Inc., increased both the credibility and visibility of this growing subculture by hosting Maker Faires around the world. At this point, the movement captured the attention of mainstream educational institutions.
The efficacy of the maker phenomenon rests in the fact that it is not really a new idea at all. Rather, it’s the expression of an educational philosophy that goes back many decades to Seymour Papert, the father of the maker movement. Not only did he predict the benefits of one-to-one student computing in the early 1970s, but he also developed the notion of constructionism. In his works, including the seminal Mindstorms, Papert suggests that deep learning occurs through the process of creating an artifact, be it a computer program, a sonnet, or a robot.
When offered the opportunity to create something personally relevant and meaningful, students willingly and enthusiastically embark on the iterative design process, overcome challenges, and collaborate with others as they seek to learn more skills to help with future endeavors. The proof for this is visible not only at Maker Faires but increasingly in school makerspaces around the world. However, implementing a program that harnesses the compelling nature of making in support of a formal curriculum can still be daunting.
Establishing a Global Partnership
In the summer of 2014, I was brought aboard at St. Gabriel’s to develop a program that would both integrate STEM learning across the curriculum and make use of the new makerspace, which was still under construction. While researching various approaches, I decided that the STEAM elective class, Global Inventors/3D printing, would field-test concepts before the makerspace opened. A chief goal was to provide students with authentic learning experiences that would incorporate the tools, skills, and dispositions necessary to succeed in an increasingly complex and connected world.
The school had access to a 3D printer that, at the time, was not widely used, and it caught the interest of my class. Through the local Austin Maker Ed community, I learned about Level Up Village (LUV), an educational services company that connects U.S. schools to a global network of partner organizations for real-world, collaborative projects. After speaking with a LUV representative, I selected the Inventors Course because its curriculum provided students with skill-building opportunities in engineering, 3D design, and fabrication. It was also open-ended enough for me to tailor it to the STEAM class.
My students were motivated by a desire that went beyond using a 3D printer. They wanted to help their eight Nicaraguan partners, who because they lacked reliable electricity were forced to study, read, and play by potentially dangerous oil-fueled lamps. During the course of a semester, St. Gabriel’s and NicaPhoto students were equal partners in the mission to design a working solution to this problem. Over time, the students from both schools also developed a deeper understanding of the lives and circumstances of their distant partners.
Students assemble the final iteration of a lantern. Credit: Patrick Benfield
The final lantern model. Credit: Patrick Benfield
Through regular video exchanges, Skype sessions, and cloud-based file sharing, they worked their way through the iterative design process together, overcame failures, refined ideas, and eventually engineered working solar-powered lamps in a 3D-printed enclosure. Although the basic components of each lamp were identical (a single rechargeable battery, a solar cell, and an LED), the designers’ creativity and the end users’ needs were apparent in each unit. During this semester-long course, my students were delighted to discover that what they were learning in their math, science, social studies, and Spanish classes had real-world, practical applications.
My students Skyping with NicaPhoto students. Credit: Patrick Benfield
NicaPhoto students during a Skype session. Credit: Patrick Benfield
Moving from STEM to STEAM
Meanwhile, my experiences at St. Gabriel’s during this period, including the STEAM course, were shaping the trajectory of the maker program I was developing. Seeing my seventh graders initially struggle with “non-Googleable” questions and then grow over time into confident, purpose-driven thinkers confirmed, for me, the efficacy of making in school.
However, I did not want these gains to be confined solely to an elective class. Other local approaches for STEM education tended to limit student access to tools and concepts, including programming, fabrication, and robotics, by offering them only through electives or in after-school clubs. Similarly, younger students typically made infrequent visits to a computer lab with little practical application within their own classrooms, much less their personal lives.
With this in mind, my goal was to combine the analytical nature of STEM topics with the more personal, creative aesthetics inherent in making. By explicitly integrating the arts into learning and adopting a STEAM approach, the program could help our youngest students develop important cognitive competencies.
For instance, when creating a work in any artistic discipline, an innate part of the process involves considering how parts of a system work together to form a whole, working within time or material constraints, and determining the best path for a problem with multiple solutions. In addition, to honor the mission-driven nature of the school, our maker program would incorporate a design-thinking model, rooted in empathy, to support our burgeoning student inventors’ desire to serve those outside the community.
Implementing STEAM by Design
The creation and eventual full implementation of this STEAM by Design program is just one piece of a profound systemic transformation that St. Gabriel’s is undergoing. The school is doing great things this year: integrating social-emotional learning, redesigning the schedule to support teacher collaboration, and hiring additional personnel to assist with technology integration. As of this writing, we have also completed a major expansion of the campus, including the d.lab for Making. The lab is quickly becoming a favorite space for students.
A busy day in the new d.lab for Making. Credit: Patrick Benfield
I’m working closely with teachers from grades K–8 to introduce them to the key tenets and best practices of making, and to design curriculum opportunities that promote STEAM experiences, not just in the makerspace but in every classroom as well.
Making a Difference
As educational paradigms keep shifting, the quest to find the precise combination of cognitive and soft skills leading to successful student outcomes will continue as well. While it does, makers around the world, in workshops and in schools, will keep innovating and honing skills to create something meaningful. I’m certain that for my seventh-grade students, this first experience with making was about more than just a grade; they were designing with their new NicaPhoto friends in mind. This wasn’t just project-based learning. It was people-based, and that made all the difference.
Patrick Benfield is the STEAM director for St. Gabriel’s Catholic School (Texas) and the creator of its maker education program, STEAM by Design. When he’s not working with students, he spends his time as a professional musician, tinkering with a vintage Hammond B3 organ, and learning how to program interactive fiction computer games.
…innovation as a way of thinking that creates something new and better. Innovation can come from either “invention” (something totally new) or “iteration” (a change of something that already exists), but if it does not meet the idea of “new and better,” it is not innovative.
As I was working with a group of administrators, something stood out to me. When I shared a Google Doc so that we could easily collaborate, they had never seen one before and were somewhat in awe; yet to me, this was normal, my “best practice”. In terms of teaching and learning, “innovation” can be a very personal practice. One person’s “best practice” could be another’s “innovation”.
While discussing “The Innovator’s Mindset” in a Voxer group with educators, in what is becoming a global book club, Leigh Cassell compared this concept to literacy, which is in a constant state of flux. If literacy is ever-changing, do educators change alongside it? Others in the group drew a striking comparison to the “decline of newspapers”: some students are still tested on their ability to write a “news report” in the same old format. Does this “testing” include the ability to link articles, embed media, and source from different mediums (among other things), or is it still your typical “newspaper” report? If we are not trying to understand our current realities, let alone anticipate the future, the continuum runs from “innovation” to “best practice” to “dead practice”.
My belief is that innovation in teaching and learning starts with empathy: truly trying to understand those you serve. Yet this is not only a starting point but a continuous part of the process. Once the needs of the learner are defined, innovative practices can be developed, and if they truly are “better”, as per the definition, they will eventually become “best practice”. For them to remain “best practice”, they will need to be constantly revisited and reflected upon, with tweaking and recreating as part of the process, and with the possibility of eventually discarding the practice altogether. Some things could long be considered “best practice” (applicable to individuals, not necessarily as standardized solutions) yet still eventually become obsolete. This is why reflection is crucial to the process of teaching and learning.
This is not about change for the sake of change; it is about constantly understanding and questioning why we do what we do, not just taking it for granted. Some practices in education from before I was born could still be utilized if they work for learners, but we can’t simply rely on TTWWHADI (“that’s the way we have always done it”) as an effective answer when it comes to learners. We must deeply understand why we do what we do to effectively serve the needs of learners.
(I want to try different mediums, so here is a short reflection I shared on Facebook.)
Progress is often driven not by the accumulation of small steps but by dramatic leaps. The television wasn’t an iteration of a previous device; it was a new technology altogether. Einstein’s General Theory of Relativity didn’t tinker with Newton’s Law of Universal Gravitation; it replaced it in almost every detail. Likewise, Dyson’s dual-cyclone vacuum cleaner was not a marginal improvement on the conventional Hoover of the time; it represented a shift that altered the way insiders think about the very problem of removing dust and hair from household floors.
James Dyson is an evangelist for the creative process of change, not least because he believes it is fundamentally misconceived in the world today. As we talk in his office, he darts around picking up papers, patents, textbooks, and his own designs to illustrate his argument.
Dyson’s journey into the nature of creativity started while vacuuming his own home, a small farmhouse in the west of England, on a Saturday morning in his mid-twenties. Like everyone else he was struck by just how quickly his cleaner lost suction.
Dyson strode into his garden and opened up the device. Inside he could see the basic engineering proposition of the conventional vacuum cleaner: a motor, a bag (which also doubled as a filter), and a tube. The logic was simple: dust and air are sucked into the bag, the air escapes through the small holes in the lining of the bag and into the motor, and the dust (denser than the air) stays in the bag. But as the bag fills, dust clogs those small holes, choking the airflow; the bag itself was the reason the cleaner kept losing suction.
This realization triggered a new thought: what if there were no bag?
This idea percolated in Dyson’s mind for the next three years. A graduate of the Royal College of Art, he was already a qualified engineer and was helping to run a local company in Bath. He enjoyed pulling things apart and seeing how they worked. He was curious, inquisitive, and willing to engage with a difficulty rather than just accepting it. But now he had a live problem, one that intrigued him.
It wasn’t until he visited a lumberyard, and saw the industrial cyclone that spun sawdust out of the air without any bag or filter, that the solution powered into his mind like a thunderbolt.
Dyson rushed home. This was his moment of insight. “I vaguely knew about cyclones, but not really the detail. But I was fascinated to see if it would work in miniature form. I got an old cardboard box and made a replica of what I had seen with gaffer tape and cardboard. I then connected it via a bit of hose to an upright vacuum cleaner. And I had my cardboard cyclone.”
His heart was beating fast as he pushed it around the house. Would it work? “It seemed absolutely fine,” he says. “It seemed to be picking up dust, but the dust didn’t seem to be coming out of the chimney. I went to my boss and said: ‘I think I have an interesting idea.’ ”
This simple idea, this moment of insight, would ultimately make Dyson a personal fortune in excess of £3 billion.
A number of things jump out about the Dyson story. The first is that the solution seems rather obvious in hindsight. This is often the case with innovation, and it’s something we will come back to.
But now consider a couple of other aspects of the story. The first is that the creative process started with a problem, what you might even call a failure, in the existing technology. The vacuum cleaner kept blocking. It let out a screaming noise. Dyson had to keep bending down to pick up bits of trash by hand.
Had everything been going smoothly, Dyson would have had no motivation to change things. Moreover, he would have had no intellectual challenge to sink his teeth into. It was the very nature of the engineering problem that sparked a possible solution (a bagless vacuum cleaner).
And this turns out to be an almost perfect metaphor for the creative process, whether it involves vacuum cleaners, a quest for a new brand name, or a new scientific theory. Creativity is, in many respects, a response.
Relativity was a response to the failure of Newtonian mechanics to make accurate predictions when objects were moving at very high speeds.
Masking tape was a response to the failure of existing adhesive tape, which would rip the paint off when it was removed from cars and walls.
Dropbox, as we have seen, was a response to the problem of forgetting your flash drive and thus not having access to important files.
This aspect of the creative process, the fact that it emerges in response to a particular difficulty, has spawned its own terminology. It is called the “problem phase” of innovation. “The damn thing had been bugging me for years,” Dyson says of the conventional vacuum cleaner. “I couldn’t bear the inefficiency of the technology. It wasn’t so much a ‘problem phase’ as a ‘hatred phase.’ ”
Creativity is, in many respects, a response.
We often leave this aspect of the creative process out of the picture. We focus on the moment of epiphany, the detonation of insight that happened when Newton was hit by the apple or Archimedes was taking a bath. That is perhaps why creativity seems so ethereal. The idea is that such insights could happen anytime, anywhere. It is just a matter of sitting back and letting them flow.
But this leaves out an indispensable feature of creativity. Without a problem, without a failure, without a flaw, without a frustration, innovation has nothing to latch on to. It loses its pivot. As Dyson puts it: “Creativity should be thought of as a dialogue. You have to have a problem before you can have the game-changing riposte.”
Perhaps the most graphic way to glimpse the responsive nature of creativity is to consider an experiment by Charlan Nemeth, a psychologist at the University of California, Berkeley, and her colleagues. She took 265 female undergraduates and randomly divided them into five-person teams. Each team was given the same task: to come up with ideas about how to reduce traffic congestion in the San Francisco Bay Area. These five-person teams were then assigned to one of three ways of working.
The first group were given the instruction to brainstorm. This is one of the most influential creativity techniques in history, and it is based on the mystical conception of how creativity happens: through contemplation and the free flow of ideas. In brainstorming the entire approach is to remove obstacles. It is to minimize challenges. People are warned not to criticize each other, or point out the difficulties in each other’s suggestions. Blockages are bad. Negative feedback is a sin.
The second group were given no guidelines at all: they were allowed to come up with ideas in any way they thought best.
But the third group were actively encouraged to point out the flaws in each other’s ideas. Their instructions read: “Most research and advice suggests that the best way to come up with good solutions is to come up with many solutions. Free-wheeling is welcome; don’t be afraid to say anything that comes to mind. However, in addition, most studies suggest that you should debate and even criticize each other’s ideas [my italics].”
The results were remarkable. The groups with the dissent and criticize guidelines generated 25 percent more ideas than those who were brainstorming (or who had no instructions). Just as striking, when individuals were later asked to come up with more solutions for the traffic problem, those with the dissent guidelines generated twice as many new ideas as the brainstormers.
Further studies have shown that those who dissent rather than brainstorm produce not just more ideas, but more productive and imaginative ideas. As Nemeth put it: “The basic finding is that the encouragement of debate—and even criticism if warranted—appears to stimulate more creative ideas. And cultures that permit and even encourage such expression of differing viewpoints may stimulate the most innovation.”
The reason is not difficult to identify. The problem with brainstorming is not its insistence on free-wheeling or quick association. Rather, it is that when these ideas are not checked by the feedback of criticism, they have nothing to respond to. Criticism surfaces problems. It brings difficulties to light. This forces us to think afresh. When our assumptions are violated we are nudged into a new relationship with reality. Removing failure from innovation is like removing oxygen from a fire.
Think back to Dyson and his Hoover. It was the flaw in the existing technology that forced Dyson to think about cleaning in a new way. The blockage in the filter wasn’t something to hide away from or pretend wasn’t there. Rather, the blockage, the failure, was a gilt-edged invitation to reimagine vacuum-cleaning.
Imagination is not fragile. It feeds off flaws, difficulties, and problems. To insulate ourselves from failure is to rob one of our most valuable mental faculties of fuel.
“It always starts with a problem,” Dyson says. “I hated vacuum cleaners for twenty years, but I hated hand dryers for even longer. If they had worked perfectly, I would have had no motivation to come up with a new solution. But more important, I would not have had the context to offer a creative solution. Failures feed the imagination. You cannot have the one without the other.”