'time' utility

I’m running some tests on a CPU-intensive process
that does no disk access. I’m using the ‘time’
utility and finding that the reported user time is
greater than the real (wall-clock) time. How is that
possible?

The output looks like:

230.35s real 378.68s user 4.49s system
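
So user time is roughly 1.6 times the elapsed real time, with
almost no system time. For concreteness, the test is along these
lines (a simplified sketch with a placeholder busy loop, not my
actual program):

    /* burn.c - stand-in CPU-bound workload: pure computation, no I/O */
    #include <stdio.h>

    int main(void)
    {
        volatile double x = 0.0;  /* volatile keeps the loop from
                                     being optimized away */
        long i;

        for (i = 0; i < 2000000000L; i++)
            x += 1.0 / (double)(i + 1);

        printf("%f\n", x);
        return 0;
    }

compiled with cc -O burn.c -o burn and run as time ./burn.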

I’m also finding that this process runs about 60 times
slower on a 133 MHz Pentium than on a 600 MHz Pentium.
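The clock-speed ratio alone is only 600/133 ≈ 4.5, so that is
roughly 13 times slower than clock scaling by itself would predict.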

Thanks for any insight.
Carlos