Alan Kay: How I Would Teach Computer Science 101

“One of the reasons to actually go to university is to go beyond mere vocational training and instead engage with deeper ideas.”


Let's think about this question for a bit. A few years ago, I was invited by Computer Science departments to lecture at a number of universities. Almost by accident, I asked my first audience of undergraduates, graduate students, and professors for their definition of "Computer Science." All of them could offer only an engineering definition. I repeated this at every new place, and everywhere the results were similar.

Another question was: "Who was Douglas Engelbart?" Several people said, "Wasn't he somehow related to the computer mouse?" (This was very disappointing to me, since my research community had put a lot of effort into making such questions easy to answer: two or three mouse clicks would have been enough to find out that Engelbart really did have something to do with the computer mouse.)

Part of the problem was a lack of curiosity, part of it the narrowness of personal goals unrelated to learning, part of it the absence of any idea of what this science actually is, and so on.

I've been working part-time in the Computer Science department at the University of California for several years (I'm essentially a professor, but I don't have to go to department meetings). From time to time I teach classes, sometimes to first-year students. Over the years, the already low level of curiosity about Computer Science has dropped further (while its popularity has grown, since computing is seen as a path to a well-paid job for anyone who can code and holds a certificate from a top-10 school). Tellingly, not a single student has yet complained that the first language at the University of California is C++!

It seems to me that we face a situation in which both "computer" and "science" have been debased by weak popular notions, producing a new term - a kind of designer label on jeans - that sounds good but is largely empty. A related term that has been similarly debased is "software engineering", which likewise did not draw on the best ideas of "programming" and "engineering" but simply glued the words together (this was done deliberately in the sixties, when the term was coined).

One of the reasons to actually go to university is to go beyond mere vocational training and instead engage with deeper ideas. It seems quite reasonable to me for an introduction to the field to try - with examples where possible - to get students working on real problems and beginning to understand what is genuinely interesting, important, and central in this area.

First graders are delighted when shown how one ruler sliding along another becomes an adding machine with which they can out-compute fifth graders at adding fractions. And then they will happily take part in designing improved adding machines. They have touched a real computer - a physical and mental tool that helps us think. They have learned a genuinely efficient way to represent numbers, more efficient than the one taught in schools!

They were able to combine their common-sense idea of "adding" as "accumulating" with something similar that has powerful new properties. They programmed it to solve a variety of problems.

They also extended it. And so on. This is not a digital computer. And it is not a stored-program computer. But it is the essence of a computer. Like the Antikythera mechanism, it captures the essence of computers and computing.


Antikythera mechanism
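The two-ruler adder can even be sketched in code. Here is a minimal sketch (my illustration, not from the article): sliding the top ruler so that its zero sits over the mark `a` on the bottom ruler places every mark `b` on the top ruler directly above the mark `a + b` below it.

```python
# A sketch of the two-ruler adding machine the first graders build:
# slide the top ruler so its 0 sits at position `a` on the bottom
# ruler, then read the bottom ruler under the mark `b` on the top one.
def slide_rule_add(a, b):
    # The top ruler's origin now sits at `a` along the bottom ruler.
    offset = a
    # Every mark `b` on the top ruler lies over mark `offset + b` below.
    return offset + b

# Adding fractions the way the first graders out-compute fifth graders:
print(slide_rule_add(1.5, 2.25))  # reads off 3.75 on the bottom ruler
```

The point of the sketch is that "addition" falls out of a purely physical act of accumulation - the offset - which is exactly the common-sense idea the children already have.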

How far can we go, and how much can we do, before things get out of control and we get lost in abstractions? I've always been partial to the characterization of Alan Perlis - the first Turing Award winner, who may have coined the term "Computer Science" - who said in the 60s: "Computer Science is the science of processes." All processes.

For the sake of Quora, let's not try to push this further or turn it into religious dogma. Let's just happily use Alan Perlis's idea to think better about our field, and especially about how to teach it. Now we need to look at the modern meaning of "science", and Perlis was quite sure it should not be diluted with older meanings (such as "a body of knowledge") or usages (such as "library science" or even "social science"). By "science" he meant understanding phenomena by building models/maps that attempt to show, "track", and predict those phenomena.


I've given several interviews about how the best maps and theories can often fit on a T-shirt, the way Maxwell's equations and others do. The analogy is that there is a "science of bridges" even though most bridges are man-made. Once a bridge is built, it exhibits phenomena that scientists can study; bridges can be modeled in many ways, and comprehensive and useful "bridge theories" can be formed. The fun part is that you can then design and build new bridges (I've mentioned before that there is hardly anything more fun than scientists and engineers working together on big, important problems!)


The Turing Award winner and Nobel laureate Herbert Simon called all of this "the sciences of the artificial" (and wrote an excellent book of the same name).


Let me give an example. In the 50s, companies and universities were building stored-program computers and starting to program them - and there was a special moment when Fortran appeared in 1956. It was not the first high-level language, but it was perhaps the first done well enough to be used in many different areas, including many that had previously been the province of machine language alone.

All this gave rise to "phenomena".


John McCarthy

The history of Lisp is more complex, but John McCarthy became interested in finding a "mathematical theory of computation" and was determined to make it a good one. The eval function that interprets Lisp fits on a T-shirt! Compared to a whole "programming system", it is tiny. More importantly, this "theory of computation" embodied more powerful concepts than Fortran. It was the best "bridge theory" yet!

The compactness of Lisp allows the whole idea of programming to be grasped at a deeper level and thought through in a way that seems impossible when staring at huge artifacts (this is one of the reasons scientists like to keep mathematics compact and powerful). The mathematics used here is new mathematics, in that it admits concepts such as "before" and "after", leading to a logic of changing values that preserves both functional dependence and logical reasoning while also admitting propositions and the passage of time. (This is still not well accommodated in today's harsh world of imperative programming.)
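To make the T-shirt claim concrete, here is a rough sketch of the kernel of such an eval - written in Python rather than Lisp, with names of my own choosing, so treat it as an illustration rather than McCarthy's actual definition. It interprets s-expressions with quote, if, lambda, and function application:

```python
# A tiny interpreter in the spirit of McCarthy's eval: the whole kernel
# of "programming" in a dozen lines (an illustrative sketch, not the
# original Lisp definition).

def evaluate(exp, env):
    if isinstance(exp, str):        # a symbol: look it up in the environment
        return env[exp]
    if not isinstance(exp, list):   # a literal value (e.g. a number)
        return exp
    op = exp[0]
    if op == 'quote':               # (quote x) -> x, unevaluated
        return exp[1]
    if op == 'if':                  # (if test then else)
        return evaluate(exp[2] if evaluate(exp[1], env) else exp[3], env)
    if op == 'lambda':              # (lambda (params) body) -> a closure
        params, body = exp[1], exp[2]
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    fn = evaluate(op, env)          # application: evaluate operator and operands
    args = [evaluate(a, env) for a in exp[1:]]
    return fn(*args)

# A few primitives are enough to compute with:
global_env = {'+': lambda a, b: a + b, '*': lambda a, b: a * b,
              'true': True, 'false': False}

# ((lambda (x) (* x (+ x 1))) 4)
program = [['lambda', ['x'], ['*', 'x', ['+', 'x', 1]]], 4]
print(evaluate(program, global_env))  # prints 20
```

Notice that the interpreter is itself just data operating on data - lists interpreting lists - which is exactly what lets Lisp express its own theory.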

Lisp, as a powerful programming language and a metalanguage capable of expressing its own theory, is an example of real computer science. If you learn it and other things like it, you will be able to think more deeply and take more responsibility for your own destiny than if you had merely learned to program in Fortran or its modern equivalents (...so as to get closer to the programmers!).

You will also learn much more about the special kinds of design needed in computing (it is not generally appreciated, for example, that computations often need to reach outside their computing environment: one of the special characteristics of stored-program computing is that a program is material not just for a program, but for a brand-new computer).

Another reason for choosing Perlis's definition is that, in general, computing is much more about building systems of many kinds than about algorithms, "data structures", or even programming per se. For example, a computer is a system, computing is a system, local networks and the Internet are systems, and most programs ought to be better systems than they are (the old 1950s style of programming has persisted to the point where it can seem that programming must be like this - nothing could be further from the truth).

The Internet is a good example: unlike most programs today, the Internet does not need to be stopped in order to fix or improve something. It is, by our intent, more like a biological system than what most people think of as a computing system. And it is far more scalable and reliable than almost any software system in existence today. That is really something to think about before teaching weaker concepts to novice programmers!

So what we need to do in a first year of Computer Science is take into account what students can actually do at the very beginning, and then try to stay within their "cognitive load" to help them reach what really matters. It is very important to stay grounded and to find approaches that are intellectually honest and suitable for those just starting out. (Please don't teach bad ideas just because they seem a little simpler - many bad ideas really are simpler!)

Students should start by building something that has many of the important features discussed here. It should be a system of several dynamically interacting parts, and so on. A good way to decide which programming language to use is simply to ask whether it makes it easy to build something with thousands of interacting parts. If it doesn't, find one that does. The worst thing we can do is set students on a path of too little expressive power, one that severely limits the big ideas. That simply kills them off - and we want to raise them up, not kill them.



Source: habr.com
