David
Our company is evaluating new technologies for a new product. I am a big fan
of using widely supported, easy-to-use, off-the-shelf technology as much as
possible. The application involves industrial I/O. I, personally, would
prefer to program in C# with an integrated debugger, but some of my
colleagues insist that the .NET Compact Framework is totally unsuited to the
task because it is "not real time". If they win, we will end up developing
in C on some sort of embedded development package, which seems to me a big
step down.
The application involves digital I/O in a machine on a factory floor. The
application will receive a message on the Ethernet port. Within 2
milliseconds, it must set the digital I/O bit, turning off the motor it is
controlling. After that, it is guaranteed that nothing important will happen
for the next 500 milliseconds.
It seemed to me that the .NET Compact Framework is well suited to the task,
because the only non-real-time aspect is garbage collection. We can run the
task, get the message, set the bit, and force a garbage collection after we
are done, during the 500-millisecond idle time. That way, we will never end
up with a GC that interrupts anything important.
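Here is roughly what I have in mind, as a sketch. The `SetDigitalOutput`
method is a hypothetical stand-in for whatever the vendor's digital I/O call
turns out to be (on the Compact Framework that would typically be a P/Invoke
into a driver DLL), and I am assuming `GC.Collect` behaves as documented:

```csharp
using System;

// Sketch of the proposed message handler. SetDigitalOutput() is a
// hypothetical placeholder for the real device's I/O call.
static class MotorGuard
{
    public static bool MotorOn = true;  // stand-in for the real output state

    static void SetDigitalOutput(int bit, bool value)
    {
        // Real code would write the I/O register via P/Invoke here.
        if (bit == 0) MotorOn = value;
    }

    // Called as soon as the stop message arrives on the Ethernet port.
    public static void HandleStopMessage()
    {
        // Time-critical path: no allocations, just set the bit,
        // so nothing here can trigger a collection.
        SetDigitalOutput(0, false);   // motor off within the 2 ms budget

        // We are now inside the guaranteed 500 ms idle window: force a
        // full collection so the GC has nothing left to do during the
        // next time-critical window.
        GC.Collect();
        GC.WaitForPendingFinalizers();
    }
}
```

The key point is that the critical path itself allocates nothing, so the
only GC that ever runs is the one we schedule ourselves in the idle window.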
My colleagues are not sold on this idea. They have read that it is "not
real time", and that is that.
Can anyone offer supporting examples of using the .NET Compact (or even
Micro) Framework on such a task?