Today was seminar day. I gave my talk, a brief introduction to
Algorithm Analysis, and covered the following topics: a simplified
computer architecture (with special attention to the roles of the
memory and of the CPU); a simple program for computing squares of
integers, and the concept of iteration; the accounting of the steps
of the program (which results in a linear polynomial in the number
of squares printed); the fact that we can ignore constants when
doing analysis of algorithms, and asymptotic notation (O, \sim);
a statement of the Prime Number Theorem in terms of asymptotic
notation; and comments on the difficulty of factoring integers,
despite tests for primality being "easy".
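The step-counting part of the talk can be sketched in a few lines of Python. This is only a minimal sketch: the function name and the exact choice of which operations count as "steps" are my own assumptions here, not a transcript of what I presented.

```python
def print_squares(n):
    """Print 1^2, 2^2, ..., n^2 and count the elementary steps taken."""
    steps = 1                  # one step: initializing the loop
    for i in range(1, n + 1):
        square = i * i         # one multiplication
        print(square)          # one output operation
        steps += 3             # multiplication + print + loop increment
    return steps               # total: 3n + 1, a linear polynomial in n
```

With this accounting, `print_squares(10)` performs 31 steps and `print_squares(100)` performs 301: doubling n roughly doubles the count, which is exactly why the constants 3 and 1 can be dropped and the running time written simply as O(n).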
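The Prime Number Theorem, in the asymptotic notation above, says that \pi(x) \sim x/\log x, i.e. the ratio of the two sides tends to 1. A small numerical check of this (the sieve-based `prime_count` helper is my own illustration, not something from the talk) might look like:

```python
import math

def prime_count(x):
    """Count the primes <= x with a simple Sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, math.isqrt(x) + 1):
        if sieve[i]:
            # Mark all multiples of i starting at i*i as composite.
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return sum(sieve)

# The ratio pi(x) / (x / log x) should drift (slowly!) toward 1.
for x in (10**4, 10**5, 10**6):
    print(x, prime_count(x), round(prime_count(x) / (x / math.log(x)), 3))
```

The convergence is quite slow, which is a nice talking point in itself; and note that the sieve only *finds* primes cheaply, which says nothing about the (apparently much harder) problem of factoring a given large integer.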
It seems that I didn't scare people when talking about such things, and that they felt they understood what I said, which is good: there are often conflicts between the language that Computer Science people (mainly the "practical" ones) use and the language that Mathematicians use. Since I am mostly in the middle, I think I can talk with both without problems (but this means that I am not a specialist in anything, alas).