sorting a list of urls by frequency

  • Thread starter: orco

This will do the "or".
AM-DeadLink
(Freeware)
OS: Windows 9x/Me/NT4/2000/XP
Description:
http://www.pricelessware.org/2004/PL2004INTERNET.htm#URL:Checker-Validator
 
Thank you for the program, dszady,
but it doesn't work.

AM-DeadLink 2.00 beta-5 removes duplicate URLs perfectly and sorts
them in alphabetical order,

but it doesn't sort URLs by frequency


thx
 
Newsgroups: alt.comp.freeware
Subject: sorting a list of urls by frequency
From: orco <[email protected]>
X-Newsreader: Forte Agent 1.93/32.576 English (American)

I have a list of URLs (with duplicates) like this:

http://google.es
http://biphome.spray.se/mp/
http://biphome.spray.se/mp/
http://yahoo.es
http://yahoo.es
http://yahoo.es
http://altavista.com
[Snip]
If you don't mind working in a DosBox you can do it with the UnxUtils
package; specifically, you are going to use Sort (to sort the file)
and Uniq (to count and remove duplicates).

The UnxUtils homepage is located here:
<http://unxutils.sourceforge.net/>

Once 'installed' you just open a DosBox and type:

Type Test.txt | Sort | Uniq -c | Sort -rn

Using your example the output is:

3 http://yahoo.es
2 http://biphome.spray.se/mp/
1 http://google.es
1 http://altavista.com

'Test.txt' is just a placeholder for the full path of the file you
are going to process; that is, if the real file is 'C:\Docs\Dummy.txt'
then you use:

Type C:\Docs\Dummy.txt | Sort | Uniq -c | Sort -rn

And so on.
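
If you would rather not install UnxUtils, the same count-and-rank step
can be sketched in a few lines of Python (this assumes the file has one
URL per line, and 'Test.txt' is the same placeholder as above):

# count_urls.py - rough equivalent of: Type Test.txt | Sort | Uniq -c | Sort -rn
from collections import Counter

with open("Test.txt") as f:
    # strip whitespace and skip blank lines
    urls = [line.strip() for line in f if line.strip()]

# most_common() yields (url, count) pairs, highest count first
for url, count in Counter(urls).most_common():
    print(count, url)

You would of course need a Python interpreter installed for that.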

HTH
 
rir3760, it works perfectly and accomplishes exactly what I needed to do

THANKS !
:)
 

I saved this page also. You never know :)
 
orco said:
AM-DeadLink 2.00 beta-5 removes duplicate URLs perfectly and sorts
them in alphabetical order, but it doesn't sort URLs by frequency

There are different types of bookmark collections: the URL links in my
Favorites, the bookmark files in Opera, etc.

I have many older collections of bookmarks in these different forms, and I
would like a program that could look through all these collections and sort
the bookmarks by the number of times it finds the same bookmark, or by the
newest ones first.

That way I could collect the most popular ones, and the newest ones, in new
collections or bookmark lists.

I have never found a bookmark handler that could count the number of times
it finds a certain URL and increase the ranking based on that.

I have 10 thousand bookmarks, and I want a list of the 200 I have
bookmarked most often during the last 4 years.
That is the task I am looking for a solution to.
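
For what it's worth, the counting part of that task extends to several
collections at once. Here is a minimal sketch in Python, assuming each
collection has first been exported to a plain-text file with one URL per
line (the 'bookmarks/*.txt' pattern and the figure of 200 are just
placeholders from the description above):

# top_bookmarks.py - rank URLs that turn up in several bookmark exports
import glob
from collections import Counter

counts = Counter()
for path in glob.glob("bookmarks/*.txt"):   # every exported collection
    with open(path) as f:
        counts.update(line.strip() for line in f if line.strip())

# the 200 most frequently bookmarked URLs, most frequent first
for url, count in counts.most_common(200):
    print(count, url)

Getting the collections out of Favorites or Opera and into plain URL
lists is the part a sketch like this does not cover.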
 
I
would like a program that could look through all these collections and sort
the bookmarks by the number of times it finds the same bookmark, or by the
newest ones first.

< snip >

Newest ones first? Firefox does:


Sorted by added

or

Sorted by last modified

or

Sorted by last visited


Regards, John.

--
****************************************************
,-._|\ (A.C.F FAQ) http://clients.net2000.com.au/~johnf/faq.html
/ Oz \ John Fitzsimons - Melbourne, Australia.
\_,--.x/ http://www.vicnet.net.au/~johnf/welcome.htm
v http://clients.net2000.com.au/~johnf/
 