Scalability question

  • Thread starter: Mark B

Mark B

I have a website (WebsiteA) ASP.NET (VB) and its webservice
(WsAddNewTransaction) hosted on a shared server at www.serverintellect.com.
SQL 2005 backend database there too.

I have another website (WebsiteB) hosted there as well which I have been
using to test the webservice by consuming WsAddNewTransaction.

I created an aspx page with a loop to consume the webservice 100 times:

'----------------------------------------------------------------
Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load

    Dim strText As String = ""
    Dim dtStart As DateTime = Now

    '----------- fTransactionAdd() returns 'OK' if webservice consumption successful

    For i As Integer = 1 To 100
        strText += fTransactionAdd() + " "
    Next

    Dim dtEnd As DateTime = Now
    Dim tpDifference As TimeSpan = dtEnd - dtStart

    Label1.Text = strText + vbCrLf + "Time Taken: " + tpDifference.ToString()

End Sub
'----------------------------------------------------------------


On average it reports around 14-15 seconds for the 100 calls.
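Worth noting: a sequential loop like the one above measures per-call latency (roughly 140-150 ms per call here), not throughput. To approximate many concurrent callers, the same 100 calls could be fired in parallel. This is a minimal sketch, assuming .NET 4's `Parallel.For` is available and reusing the existing `fTransactionAdd()` helper and `Label1` control from the test page:

```vbnet
' Sketch: concurrent version of the same test. Assumes .NET 4+
' (Imports System.Threading.Tasks) and the existing page members.
Dim dtStart As DateTime = Now

' Fire the 100 webservice calls in parallel instead of one at a time.
Parallel.For(0, 100,
    Sub(i)
        fTransactionAdd()   ' existing helper; returns "OK" on success
    End Sub)

Dim tpDifference As TimeSpan = Now - dtStart
Label1.Text = "Time Taken (parallel): " & tpDifference.ToString()
```

The gap between the sequential and parallel timings gives a rough sense of how much of the 14-15 seconds is per-call round-trip overhead rather than server capacity.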

The webservice itself does these things:

- Runs a stored procedure to INSERT around 30 fields into a transactions
table in the SQL database.
- Creates a new thread to email the transaction owner and the associated person.
- Returns 'OK' if successful.
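On the e-mail step: starting a brand-new thread per call gets expensive at hundreds of calls per second. One common alternative is to queue the work on the thread pool instead. A hedged sketch, where `SendNotificationEmails()` is a hypothetical stand-in for the existing e-mail code:

```vbnet
' Sketch only: queue the notification e-mail on the thread pool rather
' than starting a dedicated thread per webservice call.
Imports System.Threading

Public Class TransactionService

    Public Function AddTransaction() As String
        ' 1. Run the INSERT stored procedure here (omitted).

        ' 2. Hand the e-mail work to the thread pool;
        '    SendNotificationEmails() is a hypothetical stand-in.
        ThreadPool.QueueUserWorkItem(
            Sub(state) SendNotificationEmails())

        Return "OK"
    End Function

End Class
```

Thread-pool threads are reused, so the per-call cost of scheduling the e-mail work drops substantially compared with `New Thread(...)` on every request.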

My question is:

It's possible that the webservice would need to be consumed around 400-600
times per second in total by various external clients. So yes, this is a
relatively high volume against a relatively big database.

If I went to a dedicated server with www.serverintellect.com, would that be
enough to handle that sort of volume? What other things would need to be
done?
 
Are you talking about a dedicated server that is a back-end application
server hosting an ASP.NET Web service, or possibly an ASP.NET WCF Web or
non-Web WCF service? It could host SQL Server as well, but the server
would be dedicated to that functionality.

Of course, you don't want all of the functionality (UI, Web service,
database, and mailing service) hosted on one server. The more you
segregate responsibilities across the infrastructure, the more
scalability you gain.

I think scalability is more about the application's ability to be scaled
than about changes to the network infrastructure alone.

Think about this: you have a Win 2k8 Web UI server and a second Win 2k8
server acting as the application server, with the application server
hosting a WCF TCP/IP service on the intranet infrastructure.

For communications, the ASP.NET client consumes the WCF service over
TCP/IP rather than HTTP, with the TCP/IP client and service using binary
serialization, which is much faster than Web-application-to-Web-service
communication over HTTP.
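As a concrete illustration of that setup, a WCF service can be exposed over `net.tcp` (which uses binary message encoding by default) instead of HTTP. A minimal self-hosting sketch; the contract, class names, and address below are illustrative only:

```vbnet
' Sketch: hosting a WCF service over net.tcp instead of HTTP.
' All names and the net.tcp address below are illustrative.
Imports System.ServiceModel

<ServiceContract()>
Public Interface ITransactionService
    <OperationContract()>
    Function AddTransaction() As String
End Interface

Public Class TransactionService
    Implements ITransactionService

    Public Function AddTransaction() As String _
        Implements ITransactionService.AddTransaction
        Return "OK"
    End Function
End Class

Module Host
    Sub Main()
        ' NetTcpBinding uses binary serialization over TCP by default.
        Dim host As New ServiceHost(GetType(TransactionService))
        host.AddServiceEndpoint(GetType(ITransactionService),
                                New NetTcpBinding(),
                                "net.tcp://appserver:8000/Transactions")
        host.Open()
        Console.ReadLine()   ' keep the host process alive
        host.Close()
    End Sub
End Module
```

The same contract could also be exposed over an HTTP binding for external callers while internal servers talk over net.tcp.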

Also, your test does not account for data traveling over the wire between
an external client and the service, which has to be factored into the
speed and workload considerations.
 
First, you may want to consider caching the web service results if the
results don't vary per caller.
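For an ASMX service, one built-in way to do that is the `CacheDuration` property of the `WebMethod` attribute, which caches the serialized response on the server. A sketch; note this only suits read-style methods whose result doesn't depend on the caller, so it would not fit an INSERT method like `WsAddNewTransaction` itself:

```vbnet
' Sketch: ASMX output caching via WebMethod(CacheDuration). The cached
' response is reused for every caller for 60 seconds, so this only
' suits methods whose result does not vary per caller.
Imports System.Web.Services

Public Class ReportService
    Inherits WebService

    <WebMethod(CacheDuration:=60)>
    Public Function GetDailyTotals() As String
        ' Hypothetical read-only method; the name is illustrative.
        Return "totals computed at " & Now.ToString()
    End Function

End Class
```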

Second, it doesn't matter who hosts the web service: if you want to
distribute 400-600 calls per second, you need a web farm environment for
the web service machines.

Scott
 