Fibonacci's Daughter: Math Tolerance

Just at that moment, Professor Pietro Baglioni looked forth from the window, and called loudly, in a tone of triumph mixed with horror, to the thunder-stricken man of science: "Rappaccini! Rappaccini! And is this the upshot of your experiment?"
    -"Rappaccini's Daughter", Nathaniel Hawthorne

Computer science fundamentals

Thanks to all who participated in my "computer science fundamentals" curriculum discussion and meme. You've given me a feast for thought (or at least a few months' worth of MREs). I agree with those of you (especially kakarigeiko and zengeneral) who opined that most universities should have colleges of computing with a software engineering or Computational Science and Engineering (CSE) degree program and a Computer Science (CS) certification that consists mostly of mathematical coursework that is required early in the curriculum. Several of you (tmehlinger, prolog, and mapjunkie) seem to agree.

Computational science and engineering versus computer science

The thing about CSE, though, is that while it's more of an engineering program as opposed to a science program, it's still got a lot of [computer science], as opposed to [computational [science]], in it. In CS, the "science" modifies "computer", and so we are talking about the theory of algorithms, computational models, programming languages, operating systems, computer organization and architectures, and logics. In CSE, the "computational" modifies "science", and so we are talking about technical computing (from high-performance Fortran, C, and Java to MATLAB and Mathematica), parallel and randomized algorithms, optimization, and performance measurement and tuning. At Illinois, there is a CSE Ph.D. and a CS Ph.D., and the former requires a lot more numerical analysis and high-performance computing. As a doctoral program, however, it requires a lot of theory.

Discussions with KDD research assistant and other students

zengeneral and I had an hour-long conversation (10:10 - 11:10) this morning that prefigured my discussions with Tim last night and with tmehlinger in the hour after that. He believes that the pervasive sprinkling of proof theory that our math majors get throughout their undergrad education is better than the concentrated chunk of it that computer science undergrads get in courses such as CIS 301 (Logic for Computer Science). Upon reflection, this is sound thinking on the face of it - with one caveat. The analogy I'd use is trying to eat cubes of beef bouillon or taking them like pills versus cooking broth and using it as a soup base. Only someone very hardcore would be able to stomach the former, and nobody who had a choice would prefer it, right? Well, here's the caveat: I think we can do both. Computer science should be a science first, and an engineering discipline second. Honestly, though, I couldn't care less whether we are in Arts and Sciences or Engineering. It helps to be aware of both. Politically and economically, there's a reason we are in engineering: a lot of our funding comes from applications development, and insofar as we are in business, we are not in the business of pure science.1

IMO, however, we've become very complacent about the erosion of basic math principles. There was a time when math anxiety would never have extended as far as students thinking themselves unable to grasp the principles of constructive proof, sequent calculi, or soundness and completeness at the sophomore level. For example:

  • Constructive proof: "Design a finite state machine that recognizes Set-Theoretic-Expression(L1, L2) given regular languages L1 and L2." (We did this in 600.191, Computational Models, at Johns Hopkins, taught by Michael Goodrich. I took it in fall, 1990, my third semester as an undergrad.)

  • Sequent calculi: "Explain how Generalized Modus Ponens is a special case of resolution over first-order definite clauses." (We did this in 600.335, Artificial Intelligence, taught by Simon Kasif in the same semester, although it took me another year or so to fully grok it.)

  • Soundness and completeness: "Learning from only positive examples is sound but not complete for propositional concepts." (OK, bad example, as all of the machine learning courses I've taken or taught are graduate-level. Let's try again.) "Algorithm A/A* is sound and complete, given an admissible heuristic." (Another one from 600.335.)

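To make the constructive-proof exercise concrete: given DFAs for regular languages L1 and L2, the standard product construction yields a machine for their intersection - the proof is the construction. The dictionary-based DFA encoding and the two toy languages below are my own illustrative choices, not anything from the Hopkins course.

```python
def product_dfa(d1, d2):
    """Product construction: states are pairs; accept iff both accept.
    This realizes the constructive proof that regular languages are
    closed under intersection."""
    states = [(q1, q2) for q1 in d1["states"] for q2 in d2["states"]]
    delta = {
        ((q1, q2), a): (d1["delta"][(q1, a)], d2["delta"][(q2, a)])
        for (q1, q2) in states
        for a in d1["alphabet"]
    }
    return {
        "states": states,
        "alphabet": d1["alphabet"],
        "delta": delta,
        "start": (d1["start"], d2["start"]),
        "accept": {(q1, q2) for (q1, q2) in states
                   if q1 in d1["accept"] and q2 in d2["accept"]},
    }

def accepts(dfa, word):
    """Run the DFA on word; accept iff we end in an accepting state."""
    q = dfa["start"]
    for a in word:
        q = dfa["delta"][(q, a)]
    return q in dfa["accept"]

# L1 = strings over {0,1} with an even number of 0s
even0 = {"states": [0, 1], "alphabet": "01",
         "delta": {(0, "0"): 1, (0, "1"): 0, (1, "0"): 0, (1, "1"): 1},
         "start": 0, "accept": {0}}
# L2 = strings ending in 1
ends1 = {"states": ["e", "f"], "alphabet": "01",
         "delta": {("e", "0"): "e", ("e", "1"): "f",
                   ("f", "0"): "e", ("f", "1"): "f"},
         "start": "e", "accept": {"f"}}

both = product_dfa(even0, ends1)  # recognizes L1 ∩ L2
```

The same pair-of-states trick handles union and set difference by changing only the acceptance condition, which is why the exercise asks for a generic Set-Theoretic-Expression(L1, L2).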

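And for the soundness-and-completeness example above, a minimal A* sketch: with an admissible heuristic (one that never overestimates the remaining cost), the first time the goal is popped from the frontier, the path returned is optimal. The toy graph and heuristic values are my own, chosen so the heuristic is admissible.

```python
import heapq

def astar(start, goal, neighbors, h):
    """A* search. Sound and complete on finite graphs with nonnegative
    edge costs; optimal when h is admissible (never overestimates)."""
    # frontier entries: (f = g + h, g = cost so far, node, path)
    frontier = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g  # first goal expansion is optimal under admissible h
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None, float("inf")

# Toy graph with edge costs; h never overestimates true remaining cost.
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)],
         "C": [("D", 1)], "D": []}
h = {"A": 2, "B": 1, "C": 1, "D": 0}
```

With h set to zero everywhere, the same code degrades gracefully to uniform-cost search, which is one way to see that admissibility is a property layered on top of completeness.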
BTW, the same semester I took Comp Models and AI, I was also taking 550.171, Discrete Mathematics, taught by Michael Schneider, 600.333, Computer System Fundamentals (the basic architecture course, according to the JHU undergrad catalog), and doing a 600.492 (Computer Programming Workshop) independent study project for Steve Salzberg on hypertext data models for MRI document databases.

My proposed fix

So, what do I propose as a "fix"? I'd say that the first order of business is to have a version of MATH 510 (our discrete math course, which students tellingly call "combinatorics") that students are supposed to take by the third or fourth semester as CS majors, and can take in their freshman year if they are well-qualified. I think IS majors should take this course as well! From talking to a few IS majors, it seems I'm not the only one who thinks so.

In my opinion as an educator, what we need to do is to condition students to be comfortable with fundamental theory. We need to train them to feel confidence, not aversion and anxiety, at seeing challenging problems and being asked how to approach them from scratch. Above all, we need to indoctrinate students early with the principle that being "clean and correct" is good, that rigor is useful, and that - as with physical fitness - these intellectual skills do not involve pain and struggle if you seek to learn and stay in practice. I said to tmehlinger that CS students need to develop a tolerance, if not initially an interest in and enthusiasm for, foundational concepts of math and CS. "Like poison," quipped masaga, which is how the title of this entry came to pass. I was not joking, though my terminology was tongue-in-cheek. Too often I have seen students "freeze up" and act increasingly discouraged until I just know that their inner monologue is "this is beyond me; I can't do this". kakarigeiko is exactly right when he says that part of being a CS is developing the belief and the comprehension that "CS is not hard. CS is not baffling. CS is full of massively useful ideas where developments happen amazingly quickly and so things are always very exciting."

Back in my day, young whippersnappers...

As a grad at Illinois, I was a TA for courses such as CS125 (Intro to CS, analogous to CIS 200 at K-State), CS325 (the first programming languages course, similar to CIS 505 here), and CS348 (Intro to AI, similar to my course, CIS 730). We cut no corners: look at the most recent offering of CS173 (Discrete Mathematical Structures), taught by my friend and same-year classmate, Cinda Heeren.

Even earlier, when I was an undergrad at Hopkins, Bogart's Discrete Mathematics was our discrete math book, Lewis and Papadimitriou's Elements of the Theory of Computation our comp models book, and our professors thought it was perfectly reasonable to have us take 600.140 using Abelson, Sussman, and Sussman's Structure and Interpretation of Computer Programs2 (SICP, aka the Purple Book) - all in our first two years. JHU and K-State proudly occupied adjacent positions on the Schemers list of universities that used Scheme in the undergrad program. I remember seeing that SICP was 6.001 (the first CS course) at MIT and being quite awed. (gregbo, did you take this? If so, what can you tell us about your experience?) Later I saw that Hindley and Seldin used their combinators textbook in a freshman-sophomore CS course and was highly impressed, but by then I no longer considered it an intimidating thing to learn.

You see, I was inured, like Rappaccini's daughter, who grew up in a garden of poisonous flowers in Hawthorne's cautionary short story. In fact, if you took me out and fed me the "antidote" of software engineering serum, I would rather wither and die on the spot.3 Thus should a good CS eat, sleep, and breathe "correctness": knowing how to "prove your program" is not some kind of ivory-tower idealism. As Dijkstra said: you shouldn't call a programming error a "bug"; it implies that some third party snuck in and messed with your otherwise correct program. No, if your program fails (and here he surely meant "barring hardware problems" such as Admiral Hopper's infamous moth), you made an error. Dijkstra may have been pugnacious with his famous wielding of the term "worthy of the middle ages" (with which he attacked informal methods in programming), but if you think about it, there are no two ways about it. If you aren't with correctness, you're against it!

Now, I'm ecumenical: there are many paths to correctness, and not all of them involve verifiers and model checkers; Cooper and Clancy had a great little section in Oh! Pascal! about antibugging that I think every programmer should read. But to say that we can get by with "mere" testing or by the seat of our pants? No. Well, actually... no.
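Here is the antibugging spirit in a modern idiom: state your assumptions as executable assertions, so that a violated precondition fails loudly and early rather than corrupting results silently. This is my own illustration of the idea, not an excerpt from Cooper and Clancy.

```python
def binary_search(xs, target):
    """Return an index of target in sorted list xs, or -1 if absent."""
    # Antibugging: make the precondition executable. A caller who passes
    # an unsorted list finds out here, not via a silently wrong answer.
    assert all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1)), \
        "precondition violated: xs must be sorted"
    lo, hi = 0, len(xs)
    while lo < hi:
        # Loop invariant: if target is in xs, its index lies in [lo, hi).
        mid = (lo + hi) // 2
        if xs[mid] < target:
            lo = mid + 1
        elif xs[mid] > target:
            hi = mid
        else:
            return mid
    return -1
```

The assertion and the stated invariant are exactly the raw material a correctness proof would use; writing them down costs a few lines and keeps you honest even if you never do the proof formally.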

1 By "pure science", I mean theoretical and empirical inquiry for the sake of producing knowledge about the natural world. We are a science, but we are also engineers, who define and treat knowledge, perforce, in intentional (applied) terms rather than in terms of information. So, in epistemological terms, it seems to me that "science vs. engineering" has always been more of a tension over "how you use knowledge" than "how you get knowledge".
2 SICP, Second Edition is available for free online in its entirety, here.
3 Yes, I realize that the point is that Rappaccini was evil in embracing empiricism to the detriment of humanity in his greed for knowledge. The point of my analogy is that the "rarefied" stuff of theoretical CS isn't poison; rather, the "wide and crooked path" of informality should be corruption to us. Think rather of the flowers that Mrs. Whatsit has Meg, Charles Wallace, and Calvin sniff on Uriel in A Wrinkle in Time, if you prefer. Or (if you've read it) think of Bunyan's Pilgrim's Progress.

On a tangentially related note: I came in at 10:00 today to give masaga his make-up exam in CIS 732 and merrily forgot about the kickoff meeting for our Targeted Excellence grant on sensor networks! Argh, argh, argh. I'll have to ask Mitch and Gurdip to catch me up. "bk. wb. ty." Oi. (Hey, kakarigeiko! I see you do a lot with sensor networks; do you do any work with Carlos Guestrin by any chance?)

--
Banazir

Comments

( 8 comments — Leave a comment )
prolog
Oct. 29th, 2005 11:28 pm (UTC)
a couple of things
I couldn't care less whether we are in Arts and Sciences or Engineering.

I do. Engineering at the U of S required so many math and engineering courses that the students typically got two, maybe three half-courses in subjects other than those required by their degree.

To me, that's positively criminal.

Specialization is for graduate degrees. Undergraduate degrees are supposed to be about exploring many different ideas and disciplines. Outside of CS and math, I took astronomy, French, a number of English courses, economics, and music. I wish I had room for ancient Greek and history.

That's why I've always felt that putting CS in the Arts & Science college (or whatever the equivalent) is a good thing.

In my opinion as an educator, what we need to do is to condition students to be comfortable with fundamental theory.

Absolutely agreed.
(Deleted comment)
gondhir
Oct. 30th, 2005 02:23 am (UTC)
Re: a couple of things
"A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects."
-Robert A. Heinlein
hfx_ben
Oct. 30th, 2005 12:40 am (UTC)
*shouts comment to Witch-King*
HeyYa -

Nice to see someone pushing LJ to the limits ... and I do mean "to the limits".

<==== thinks anything more than 3 screens high is too long ... and knows that TimbBL feels likewise

I conceptualized a variant of LJ that would support this sort of multi-threaded activity. When I find someone with seed capital who's willing to sign a standard non-disclosure agreement I'll spill my guts.

cheers
:-)
banazir
Oct. 30th, 2005 12:57 am (UTC)
Hey, Ben!
Long time, no see!

I have been catching your wibblings on hfx_ben and gnodal, but haven't had time to reply. Good to hear from you!

My posts are fewer than 3 screens long on my 1600x1200 monitor. ;-)

Wot's TimbBL, precious?

Here's hoping you get your LJ variant idea funded! :-)

BTW, care to be friended to tanelos, my Nanowrimo LJ?

--
Banazir
hfx_ben
Oct. 30th, 2005 11:11 pm (UTC)
Re: Hey, Ben!
HeyYa Bill -

huh huh ... trust you to have a good programmer's monitor!
;-)

TimBL? Tim Berners Lee of W3C, u'course. "Back in the day" he was accessible by email. (In '95 he and I exchanged notes concerning a functionality I had been using in '89 ... something like "sticky notes" launched from a hyper-enabled page of text [I had been creating hyper-help using "Peabody".] ... this was pre-frames.)

My LJ variant ... funding ... gawd ... everyone is too busy for a new idea. Winners only listen to winners.

<=== not in "winner" mode at the moment.

I think the kiss of death came last spring when a) my HD died (3rd in 2 years) and b) the Macromedia software I'd acquired to produce a prototype declared that it needed Win2000 min ... and here I am with 2 300MHz Win98 laptops. *sigh* Hence my renewed interest in Java (though I really yearn to produce an ECMAscript interface running PHP/MySQL on Apache).

Sorry chum, I really don't have the concentration for NanoWriMo ATM ... maybe when I find a place to live. *sigh*

BTW: I'm "willowbear" on Skype ... tremben@netscape.net. I just ran the test-call ... seems fine.

gregbo
Oct. 30th, 2005 03:20 am (UTC)
I took 6.001, but it was a long time ago. The course has changed quite a bit since I took it in 1980. Mine was the last time that Unix Lisp and ALGOL were used. However, Abelson did cover the same types of principles that are used today (hierarchy, modularity, etc.).
kakarigeiko
Oct. 31st, 2005 12:01 am (UTC)
Haven't had a chance to work with Carlos yet - he's doing more inference / machine learning stuff, I'm doing more security / general distributed algorithms stuff, so as yet there's no intersect. Maybe in the near future though!
