Programmatically create aspx page?

  • Thread starter: Mark B

Mark B

Is there a VB.NET command to programmatically create a new aspx page on the
server from a string variable strMyNewHTML?
 
You might have to explain what you're trying to do a bit more. But yes, you
can programmatically create content.

Use the 'Response' stream:

Response.Write(strMyNewHTML)

Calling this in Page_Load will dynamically display whatever the string contains.
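A minimal sketch of that approach in a code-behind file (the class name and the sample markup are just placeholders):

```vbnet
' Code-behind for a hypothetical page, e.g. Default.aspx.vb
Partial Class _Default
    Inherits System.Web.UI.Page

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
        ' Whatever HTML the string holds is written straight into the response
        Dim strMyNewHTML As String = "<h1>Hello from a string</h1>"
        Response.Write(strMyNewHTML)
    End Sub
End Class
```

No new file is created on disk here; the markup only exists in the response sent to the browser.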
 
Yeah, but I need to save the aspx file onto the disk on the server so it can be served later.

Yes, you could just save that information using:

File.WriteAllText(filename, data)

from the System.IO namespace.

It does depend on what you're trying to do, though. You can also handle URL
redirection and the like in your 'web.config', where different URLs can map
to a single page whose output you vary based on which URL the browser
requested.
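A hedged sketch of the save-to-disk idea; the target path and the page markup are made up for illustration:

```vbnet
Imports System.IO

Module WritePage
    Sub Main()
        ' The markup we want to persist as a page on the server
        Dim strMyNewHTML As String = _
            "<%@ Page Language=""VB"" %>" & vbCrLf & _
            "<html><body><h1>truck</h1></body></html>"

        ' Hypothetical target path under the web root
        Dim filename As String = "C:\inetpub\wwwroot\definitions\truck.aspx"

        ' Creates (or overwrites) the file in one call
        File.WriteAllText(filename, strMyNewHTML)
    End Sub
End Module
```

The application pool identity needs write permission on the target folder for this to work from within a web app.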
 
Is that URL redirection via webconfig SEO friendly?

Basically our concept is similar to a dictionary.com one where they have:

dictionary.com/truck
dictionary.com/trunk
dictionary.com/try

etc

and each page is laden with the particular keyword. I thought the only way
they did this was by creating separate pages for each.
 

Mark,

Don't obsess over SEO-friendly URLs. A link like /page.aspx?id=truck
has the same meaning as a link like /truck.

If you definitely want "short" URLs, then you can either use an
httpModule (google for "URL Rewriting") or ASP.NET MVC. In both
cases the idea is not to create a new aspx page, but to return output
as if it were a new page.
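A minimal httpModule sketch of that idea; the class name and the /definitions/ path scheme are assumptions, not anything from a real library:

```vbnet
Imports System.Web

' Maps a friendly URL like /definitions/truck to the single real page
' /definitions/default.aspx?id=truck, without the browser seeing a redirect.
Public Class WordRewriteModule
    Implements IHttpModule

    Public Sub Init(ByVal app As HttpApplication) Implements IHttpModule.Init
        AddHandler app.BeginRequest, AddressOf OnBeginRequest
    End Sub

    Private Sub OnBeginRequest(ByVal sender As Object, ByVal e As EventArgs)
        Dim app As HttpApplication = CType(sender, HttpApplication)
        Dim path As String = app.Request.Path ' e.g. "/definitions/truck"

        If path.StartsWith("/definitions/") AndAlso Not path.EndsWith(".aspx") Then
            Dim word As String = path.Substring("/definitions/".Length)
            ' Serve the one real page transparently; the address bar is unchanged
            app.Context.RewritePath("/definitions/default.aspx?id=" & word)
        End If
    End Sub

    Public Sub Dispose() Implements IHttpModule.Dispose
    End Sub
End Class
```

You would register the module in web.config (under httpModules) so it runs for every request.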

Let me know if you have further questions regarding this

Hope this helps
 
Hello,

Rather than throwing some ideas at us to achieve an unknown goal, could
you start by explaining what you are trying to do?

For now, my understanding is that you would like to have "friendly" URLs,
which is done using what is called "URL rewriting". See for example:
http://msdn.microsoft.com/en-us/library/ms972974.aspx

The idea is that the request for a friendly URL is intercepted, and your
URL-rewriting module transparently directs the request to an actual page,
possibly with some URL parts passed as query-string parameters...

Also, having the big picture could help us make better suggestions. Do you
want to do this only for search engines, or do you also want to actually use
those friendly URLs on your site? What is the benefit you are looking for?
 
I'd like to just show you the website URL (a picture paints a thousand
words) but it's not quite done yet. Hopefully in the next few days ... then
I can post it to this thread.
 
OK, have you checked Google Webmaster Tools? AFAIK they provide you with
quite a bunch of tools, including the ability to see how your site is seen
by the Google indexer, as well as guides about best practices...

The key point here is to understand and measure how the changes you make
impact your site, rather than making random changes with no way to find
out whether they improved (or possibly damaged) your site's ranking...
 
So if in the robots.txt I had:

www.mysite.com/definitions/default.aspx?id=truck
www.mysite.com/definitions/default.aspx?id=trunk
www.mysite.com/definitions/default.aspx?id=try

they'd all be stored separately in Google? It would be nice if they were --
it would save us a lot of work and disk space.

So would I need to programmatically rewrite the robots.txt whenever another
word was added to the database? Or would it suffice if my homepage had all
these links on it (created programmatically)?

The robots.txt file is used to tell search-engine spiders which content to
exclude. You don't need to list every single URL there. To have all pages
indexed, either delete robots.txt or put just the following two lines in it:

User-agent: *
Disallow:

I don't think it would be a problem if you enumerated all the links in that
file, but I'm pretty sure that this will not help to increase any ranking.
 
OK thanks.

 
Mark said:
So if in the robots.txt I had:

www.mysite.com/definitions/default.aspx?id=truck
www.mysite.com/definitions/default.aspx?id=trunk
www.mysite.com/definitions/default.aspx?id=try

they'd all be stored separately in Google? It would be nice if they
did -- save us a lot of work and disk space.

So I would need to programmatically re-write the robots.txt whenever
another word was added to the database? Or would it suffice if my
homepage had all these links on (created programmatically)?

I think you're looking for sitemaps:

"About Sitemaps - Sitemaps are a way to tell Google about pages on your site
we might not otherwise discover..."
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184

(Works for other search engines too.)
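A minimal sitemap.xml along those lines (the domain and words are placeholders from earlier in the thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.mysite.com/definitions/default.aspx?id=truck</loc></url>
  <url><loc>http://www.mysite.com/definitions/default.aspx?id=trunk</loc></url>
  <url><loc>http://www.mysite.com/definitions/default.aspx?id=try</loc></url>
</urlset>
```

Since the word list lives in a database, the file could be regenerated (or served dynamically) whenever a word is added, then submitted once via Webmaster Tools.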
 