A web page cannot be written purely in JavaScript or any language other than
HTML; it must start with HTML, and we use background processors like
ASP and PHP to pre-process it and add deterministic logic to the
HTML. Whatever a page's extension may be, what is ultimately
EXECUTED by the browser is HTML, as can be witnessed by viewing the
source of any web page you navigate to. Another language has had to be
used to deterministically generate the appropriate HTML. It would be
good if this were not the case. The appearance is of a web application,
but the real application is the logic in those other languages working
behind the scenes, not the HTML.
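To make the point above concrete, here is a minimal sketch of that server-side pattern, in Python standing in for ASP or PHP (the render_page helper and its data are illustrative, not from any real site): the application logic lives in the server language, and the browser only ever sees the HTML it emits.

```python
# Server-side logic that deterministically generates the HTML the
# browser ultimately receives. Viewing the page source would show only
# the HTML output, never this code.
from html import escape

def render_page(title, items):
    """Build an HTML document from application data."""
    rows = "\n".join(f"  <li>{escape(item)}</li>" for item in items)
    return (
        "<html>\n"
        f"<head><title>{escape(title)}</title></head>\n"
        "<body>\n"
        f"<ul>\n{rows}\n</ul>\n"
        "</body>\n"
        "</html>"
    )

# The "application" is this logic; the HTML is just its output:
print(render_page("Orders", ["widget", "gadget"]))
```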
I am not the most familiar with ASP.NET either; I am just a C#/VB.NET
developer writing distributed .NET systems, where there is some
overlap with respect to security and the typical problems of accessing
data across global networks.
I don't know of any other languages aimed at sorting out web
language problems apart from Curl.
I agree it would take some backing to get a change made - indeed from
the W3C! In this case, Curl is under development by a team headed up
by Tim Berners-Lee, and we all know what he is known as the father
of! If he sees Internet v2 using a full-blown OO language rather than
the current "pick and mix" approach, then I am with him on that
one, and he carries weight with the W3C.
I think we should be able to use one language targeted at the browser,
whereby the code (not a "script") is delivered to the browser and run
by just-in-time compilation through a runtime. How that looks in the
browser depends on many things and is determined by the logic the
browser processes, not by some powerful application tier sitting
behind a row of web servers in a central location. Client machines
are powerful these days - when browsing, that power is
underutilized, while the problems of concurrency can stress web servers,
which must build pages just in time by generating them through ASP,
PHP and the like. In the Curl approach, the deterministic logic would
reside at the browser end. Current PCs are more than capable of this,
and it would free up processing power in corporations for things
other than running ???,000 users' web applications on some machine in
a data center.
I, for one, will be looking to learn Curl as soon as I can. There are a
few good books on the subject that have been around since as early as
the fall of 2001. I very much like the concept - as you can
probably tell!
All I am really interested in is whether Microsoft are
going to make it part of their .NET strategy and allow it to become part
of the VS.NET IDE; if not, I may end up having to learn some other
company's IDE - in the early stages of Curl this is a must anyway. But
VS.NET is most things to most PC developers, and if Curl takes off, I
think MS will take it on board as part of their standing "embrace and
extend" policy.