From: J. Andrew Rogers (andrew@ceruleansystems.com)
Date: Tue Jan 31 2006 - 19:07:13 MST
On Jan 31, 2006, at 4:17 PM, Robin Lee Powell wrote:
> This seems to break down to:
>
> 1.  People familiar with C will have no real problems with pointers.
>
>     Absolute rubbish.  I programmed C for *years*, and they never
>     stopped causing me trouble.  This is true for 9/10ths of the
>     programmers I've discussed this with.
It is a good thing that the really important code is written by the  
other 10% then, isn't it? ;-)
I never said that programmers do not have problems with pointers  
(clearly many do), but that there is nothing intrinsic about pointers  
in C that means they *will* cause trouble.  I programmed C for many  
years too, and any problems I did have disappeared after a couple of
years simply by learning proper practice.  The primary problem is not
pointers but memory management -- reference-counting languages have
similar issues in the hands of careless programmers.  I was really
reacting to how some people recoil in horror at the thought of  
programming in C, as though all C code is unavoidably saddled with  
memory leaks and the like.  It requires a little extra discipline and
a clue, which is not an insurmountable sacrifice if you need or want  
the benefits of C.  Writing bulletproof C *is* painfully verbose.
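To make that concrete, here is a minimal sketch of the sort of
discipline I mean (illustrative only; the function names are made up
for the example): check every allocation, give every pointer exactly
one owner, and null out freed pointers so a stale reference fails
loudly instead of silently corrupting the heap.

    #include <stdlib.h>
    #include <string.h>

    /* Disciplined allocation: check the result, document who owns it. */
    char *dup_string(const char *src)
    {
        size_t len = strlen(src) + 1;
        char *copy = malloc(len);
        if (copy == NULL)
            return NULL;        /* caller must handle failure */
        memcpy(copy, src, len);
        return copy;            /* ownership passes to the caller */
    }

    /* Null the pointer after freeing so a dangling use crashes
       predictably instead of quietly corrupting memory. */
    void free_string(char **p)
    {
        free(*p);
        *p = NULL;
    }

None of this is hard; it is just verbose, which is exactly the
trade-off I mentioned above.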
There are many reasons to use GC languages or other memory management
schemes, notably lazy, careless, or bad programmers.  I use languages
like Python because I am lazy, not because pointers have ever caused  
me problems, and much of the C I have written is for server processes  
that require years of nonstop operation -- if there were problems,  
they would have become apparent.  Maybe I am special, but I doubt it.
> 2.  Using pointers directly increases code speed.
>
>     Even more incorrect.
You did not comprehend what I was stating.  Direct access to memory  
management (which pretty much requires pointers) *allows* very  
significant performance improvements that are not accessible  
otherwise.  No language is immune to naive code.
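As a hypothetical illustration (a sketch, not production code), here
is the kind of thing I mean: replacing thousands of malloc/free pairs
with a bump-pointer arena.  One allocation up front, near-free
allocation after that, one free at the end -- and none of it is
expressible without raw pointers.

    #include <stdlib.h>
    #include <stddef.h>

    /* A bump-pointer arena.  A real one would guard against
       overflow and support growth; this is the bare idea. */
    typedef struct {
        char   *base;
        size_t  used;
        size_t  size;
    } arena_t;

    int arena_init(arena_t *a, size_t size)
    {
        a->base = malloc(size);
        a->used = 0;
        a->size = size;
        return a->base != NULL;
    }

    void *arena_alloc(arena_t *a, size_t n)
    {
        n = (n + 7) & ~(size_t)7;      /* keep 8-byte alignment */
        if (a->used + n > a->size)
            return NULL;
        void *p = a->base + a->used;
        a->used += n;
        return p;
    }

    /* Everything allocated from the arena dies in one call. */
    void arena_free_all(arena_t *a)
    {
        free(a->base);
        a->base = NULL;
    }

A tight loop over arena_alloc costs a pointer bump per object; the
same loop over malloc goes through the general-purpose allocator
every single time.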
>     In the long run, performance is a factor
>     of the big-O of the code, and I am, in general, *guaranteed* to
>     come up with a worse (in terms of big-O) piece of code than
>     someone who has, say, been studying sorting algorithms for their
>     entire career.  Thus, if I use a language that has such a
>     sorting algorithm built into it, I am guaranteed to be better
>     off.
Not necessarily true; the designer of the generic sorting algorithm  
has no idea what you are going to do with it.  This issue comes up  
pretty frequently in the real world when using "highly optimized" but  
generic libraries for performance-sensitive code.  But what do you do
after you've squeezed out all the big-O you can?  There are usually
integer factors of potential improvement left over in direct memory
management and memory layout, even for natively compiled languages.
In fact, the performance improvement gained by direct memory
management is nothing more than optimizing the big-O of memory
access/management algorithms.  Big-O optimization applies as much to
what is under the hood as to what is on top of it.
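A concrete sketch of what I mean by the big-O of memory access (a toy
example of my own, not from any benchmark): summing a contiguous
array versus chasing a linked list.  Both are O(n) on paper, but the
array streams sequentially through cache lines while the list can
take a cache miss on every node, and on real hardware that difference
is an integer factor.

    #include <stddef.h>

    typedef struct node {
        int          value;
        struct node *next;
    } node_t;

    /* O(n) either way; the constants are what differ. */
    long sum_array(const int *a, size_t n)
    {
        long s = 0;
        for (size_t i = 0; i < n; i++)
            s += a[i];              /* sequential, prefetchable */
        return s;
    }

    long sum_list(const node_t *head)
    {
        long s = 0;
        for (const node_t *p = head; p != NULL; p = p->next)
            s += p->value;          /* pointer chase, cache-hostile */
        return s;
    }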
>     This is not to say that all languages without pointers are
>     faster than all languages with them, but the idea that having
>     direct access to memory pointers in a language *necessarily*
>     makes things faster is preposterous.
No, it necessarily *allows* a programmer to make things faster, at  
least with the programming technologies we have today.  Direct memory  
management in all its guises will buy integer-factor performance
improvements for many types of codes if you know what you are doing,
which is pretty significant.  Compilers and garbage collectors still  
have pretty crude ideas on how to keep the cache lines full.
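One more illustrative sketch of the cache-line point (hypothetical
structures, invented for the example): a programmer with layout
control can split the hot fields from the cold ones so a scan touches
nothing it does not need, which no compiler or collector will do for
you today.

    #include <stddef.h>

    /* Mixed layout: scanning positions drags 'name' through the
       cache along for the ride. */
    struct particle_fat {
        double x, y, z;        /* hot: touched every step    */
        char   name[64];       /* cold: touched almost never */
        double vx, vy, vz;     /* hot                        */
    };

    /* Split layout: hot fields live in their own contiguous
       arrays, so every cache line fetched is all useful data. */
    struct particles {
        double *x, *y, *z;
        double *vx, *vy, *vz;
        char  (*name)[64];     /* cold data kept off to the side */
        size_t  count;
    };

    void step(struct particles *p, double dt)
    {
        for (size_t i = 0; i < p->count; i++) {
            p->x[i] += p->vx[i] * dt;
            p->y[i] += p->vy[i] * dt;
            p->z[i] += p->vz[i] * dt;
        }
    }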
Do what you gotta do.  If speed is important above all else, direct  
memory management will buy you a hell of a lot of it in many cases,  
and that requires pointers.  If speed is not so important in the  
calculus, use anything that appeals and reduces line count.  I use  
Python most of the time myself for screwing around with AI stuff, but  
the highly optimized C versions of the same complex algorithms are  
about 10^2 - 10^3 times faster and use about 10x less memory.   
Languages like Java can be somewhere in the middle when optimized.
I was not saying that C was great for everything, just that C is hard  
to touch for raw performance potential if you bother to spend some  
time with it.  For many things, raw performance does not matter  
enough to be worth the time.  TANSTAAFL.  Pick the right tool for the  
right job and realize that sometimes you cannot have your cake and  
eat it too.
Cheers,
J. Andrew Rogers