Spammer Tool Features
[ I recently found this ad on the Net while searching for something else. Check out the capabilities of the
software. Reasonably impressive and I'm sure it's nowhere near the best tool out there. You must be vigilant
in protecting your email address. I've removed any reference to the name of the software so that this
page doesn't advertise for them. ]
- Extract from WWW pages as well as Newsgroups
- Utilize "pattern matching" or standard "MAILTO:" hyperlink
extraction.
- Ability to extract from websites utilizing FRAMES.
- User selectable "Extraction Depth" feature.
- Ability to log URLs with the email addresses.
- Avoid Advertisement and Other Undesirable Links and Domains
- Intelligent Spidering
- Ability to save and resume spiders at a later time
- Three methods of WWW Extraction
- Targeted Keyword Searches
- "Opt-Out" Bypassing
- Advanced "Extractor Trap" Protection
- Email History Cache/Automatic Dupe Checking
- Avoid Binary/Script Files
- Full Proxy Support
- Ability to set the HTTP_REFERER variable
- Extract from WWW pages and Newsgroups
- Ability to extract email addresses from web pages based on email address text pattern matching.
- Many email addresses on web pages exist as non-hyperlink text. The program can extract all email
addresses, both hyperlink and non-hyperlink ("hyperlink" meaning "MAILTO:"-style links). The program can also be set
up to retrieve only hyperlink email addresses.
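The "pattern matching" described above is typically just a regular-expression scan of the raw HTML. A minimal sketch (the tool's actual pattern is unknown; this sample page and simplified regex are illustrative) shows why any plain-text address on a page is exposed, and why spelling an address out defeats the scan:

```python
import re

# Simplified email pattern; real harvesters use more permissive variants.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

html = """
<a href="mailto:linked@example.com">mail me</a>
Plain text: visible@example.org
Obfuscated: hidden at example dot net
"""

# Both the MAILTO: hyperlink and the bare text address are caught;
# the spelled-out address is not.
print(EMAIL_RE.findall(html))  # ['linked@example.com', 'visible@example.org']
```

This is why obfuscations like "user at example dot com" keep an address off harvested lists, at the cost of inconveniencing human correspondents.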
- Ability to extract from websites utilizing FRAMES.
- The program is one of the few extraction programs that have the ability to spider websites using frames (and
without the aid of the NOFRAMES tag).
- Intelligent Spidering
- Visited URLs are cached so that web pages are not visited twice. (The user can clear the
URL history cache at any time.) WWW spider operations can be set to: all pages, domain only, or page only.
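The visited-URL cache amounts to keeping a set of seen URLs alongside the crawl queue. A rough sketch of that bookkeeping (the function and the tiny link map are invented for illustration, not taken from the tool):

```python
from collections import deque

def crawl_order(start, links):
    """Breadth-first walk that visits each URL at most once.
    `links` maps a URL to the URLs found on that page."""
    visited = set()
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        if url in visited:
            continue  # the cache: skip pages already spidered
        visited.add(url)
        order.append(url)
        queue.extend(links.get(url, []))
    return order

links = {
    "a": ["b", "c"],
    "b": ["a", "c"],  # back-link to "a" is ignored on the second encounter
    "c": [],
}
print(crawl_order("a", links))  # ['a', 'b', 'c']
```

The "domain only" and "page only" modes would simply be an extra filter on which discovered links are allowed into the queue.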
- Ability to Resume Spiders
- With the program, you can stop a WWW spider extraction operation at any time, save the URLs that are in the
waiting queue, and then resume the same spider later. The "Extraction Depth" setting is retained in the list file and
restored when the list is loaded. Using this method, a spider operation can go on indefinitely.
- Three methods of WWW Extraction
- Perform keyword searches using up to nine different search engines, search a specific user-defined URL, or
search using a URL list.
- Targeted Searches
- Search and extract email addresses from web pages that meet your search criteria. Up to nine search engines
are used to collect relevant URLs based on your search keywords. The URLs are then spidered and scanned for email
addresses. Email addresses are collected from the Newsgroups you specify, based on your interest or topic, resulting
in highly qualified email lists.
- "Opt-Out" Bypassing
- Any email address containing the word "spam" or "remove" will automatically be excluded from the extraction
operation. People who have added these words to their email address do not want to receive unsolicited commercial
email, no matter how good a product or deal you have to offer. People who choose to "opt-out" should be respected in
their choice not to receive UCE. With this feature, your email address lists will be more qualified and will take
less time to clean up.
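The bypass described above is apparently just a substring test on each harvested address, which is exactly why adding "spam" or "remove" to a published address keeps it off such lists. A sketch of what that filter likely looks like (names are illustrative):

```python
OPT_OUT_WORDS = ("spam", "remove")

def keep(address):
    """Drop any address containing an opt-out word (case-insensitive)."""
    lower = address.lower()
    return not any(word in lower for word in OPT_OUT_WORDS)

harvested = [
    "alice@example.com",
    "bob.nospam@example.com",        # contains "spam" -> dropped
    "carol-remove-this@example.org", # contains "remove" -> dropped
]
print([a for a in harvested if keep(a)])  # ['alice@example.com']
```

Note the flip side for address owners: this check is trivial, so a human correspondent must strip the extra word before replying, and not every harvester bothers with the filter at all.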
- Advanced "Extractor Trap" Protection
- An extractor trap is a web page, usually (but not always) generated by a CGI script, that contains a large
number of bogus email addresses and links that point back to itself. An email harvesting program that connects to
such a page will collect the bogus email addresses, follow the circular links, get "hung up" on the site, and fill up
with bad addresses. Some operators of such pages do not want legitimate search engine robots to get caught in the
trap, so they include a meta tag in the web page specifically used for robot exclusion.
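The robot-exclusion marker mentioned above is the standard `<meta name="robots">` element; a well-behaved crawler checks for it before indexing a page or following its links, which is how trap operators spare legitimate search engines. A minimal check using Python's stdlib HTML parser (the sample page markup is invented):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Record the directives of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            for d in a.get("content", "").split(","):
                self.directives.add(d.strip().lower())

page = ('<html><head><meta name="robots" content="noindex, nofollow">'
        '</head><body>trap links and bogus addresses here</body></html>')

parser = RobotsMetaParser()
parser.feed(page)
print("nofollow" in parser.directives)  # True: a polite robot stops here
```

The ad's point, of course, is that this harvester deliberately detects the trap itself rather than honoring the exclusion tag.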
- Full Proxy Support
- Ability to work through a proxy server. The program supports proxy addressing by hostname as well as by IP
address.
- Ability to set the HTTP_REFERER variable
- HTTP_REFERER is a variable that indicates the URL a website visitor came from. Some websites will not allow
access unless the referrer is from their own domain. The referrer can be set to the domain of the current URL, or it
can be set to any URL the user wishes.
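(The odd spelling "HTTP_REFERER" comes from the CGI environment variable for the HTTP Referer request header.) Setting it is a one-line request option in most HTTP clients, which is why referrer-based access checks offer so little protection. A sketch with Python's stdlib, using a placeholder URL and no actual network request:

```python
import urllib.request

url = "http://www.example.com/page.html"  # placeholder target

# Claim we arrived from the site's own domain; servers that gate on
# this header only see what the client chooses to send.
req = urllib.request.Request(
    url,
    headers={"Referer": "http://www.example.com/"},
)
print(req.get_header("Referer"))  # http://www.example.com/
```

A server comparing this header to its own domain would admit the request, illustrating why referrer checks are a speed bump rather than a defense.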
- Save Web page TITLE with email address
- The title of a web page is the text shown in the title bar of the web browser for that page. The title is
set for most HTML pages and identifies the page's content in a global context. This feature allows you to
personalize your email to the recipient by including the website title in your email to them.
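Grabbing the title is another small parsing step over the same fetched HTML. A sketch with the stdlib parser (the sample markup is invented):

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Capture the text inside the <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

p = TitleParser()
p.feed("<html><head><title>Example Widgets Inc.</title></head></html>")
print(p.title)  # Example Widgets Inc.
```

Paired with the logged URL, this is how a harvested address ends up in a spam message that appears to "know" what site it was taken from.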
[ The above ad is an example of the capabilities of spammers. Learn how to protect your email
address. ]
Copyright 2023 by Gray Watson