Tony said:
Hi!
On a computer with two cores, an application that uses only the main
thread will be given one time slice,
but if that same application uses several threads then, I assume, it will
be given more CPU time because of the additional threads.
I mean, for example, if you have three threads then you will be given
three time slices.
It really just depends on what else is going on. Brian's reply assumes
that your hypothetical application (process) is already getting 100% of
the CPU with just one thread. So of course with that assumption it
cannot possibly get more CPU time by adding threads (and in fact, will
get less _useful_ CPU time because there's overhead in switching from
one thread to another). You can't have more than 100% of the total CPU
time.
But of course that's not a very interesting example, and is trivially
designed to make it impossible to increase the CPU time allocated to the
process.
A more interesting example would be to assume _two_ running processes,
each with one thread, and each consuming as much CPU time as it is granted.
In that case, adding threads to one process _does_ increase the amount
of CPU time it gets. Assume we keep one process having just one thread,
and add threads to the other process:
• If the other process has two threads, then it has 2/3rds of the
running threads on the system and so gets 2/3rds of the CPU time
• If it has three threads, then it gets 3/4ths of the CPU time
• If it has four threads, then it gets 4/5ths of the CPU time
• etc.
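The pattern above can be sketched as a quick calculation (a minimal sketch; `cpu_share` is an illustrative helper, not any real scheduler API, and it assumes an ideal fair scheduler where every runnable thread gets an equal slice):

```python
# Fraction of total CPU time a process with `threads` threads receives
# when competing against `competing_threads` other runnable threads,
# assuming every runnable thread gets an equal time slice.
def cpu_share(threads, competing_threads=1):
    return threads / (threads + competing_threads)

for k in range(2, 5):
    print(f"{k} threads: {cpu_share(k):.0%} of the CPU time")
```

With one competing single-threaded process, this reproduces the 2/3, 3/4, 4/5 progression from the list above.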
Every thread that is scheduled to run is given the same time slice
(quantum), and unless it yields the CPU it will use the entire quantum.
So, ignoring other effects (such as priority, I/O, synchronization,
etc., all of which can affect whether a thread gets to run at all, and
for how long), adding threads to one process that is competing with one
or more other processes for the CPU will result in that one process
getting more of the CPU time.
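Ignoring those effects, the equal-quantum behavior can be sketched as a toy round-robin simulation (all names here are illustrative; this models an idealized scheduler handing out identical quanta in turn, not any real OS):

```python
from itertools import cycle

def round_robin_shares(threads_per_process, quanta=120):
    """Hand out `quanta` equal time slices round-robin among all
    runnable threads, then report each process's share of CPU time."""
    # One entry per runnable thread, tagged with its owning process.
    runnable = [(p, t) for p, n in enumerate(threads_per_process)
                for t in range(n)]
    used = [0] * len(threads_per_process)
    scheduler = cycle(runnable)          # simple round-robin order
    for _ in range(quanta):
        proc, _thread = next(scheduler)
        used[proc] += 1                  # the whole quantum is consumed
    return [u / quanta for u in used]

# One single-threaded process vs. one two-threaded process.
print(round_robin_shares([1, 2]))
```

The two-threaded process ends up with two of every three quanta, matching the 2/3rds figure above; adding more threads shifts the split further in its favor.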
Pete