Friday, November 22, 2013

What Are Botnets? Complete Information



Botnets

A botnet, or robot network, is a group of computers running a computer application that is controlled and manipulated only by the owner or the software source. The term may also refer to a legitimate network of several computers that share program processing amongst them.

Usually, though, when people talk about botnets, they mean a group of computers infected with the malicious kind of robot software, the bots, which present a security threat to the computer owner. Once the robot software (also known as malicious software or malware) has been successfully installed on a computer, that computer becomes a zombie or a drone, unable to resist the commands of the bot commander.

A botnet may be small or large depending on the complexity and sophistication of the bots used. A large botnet may be composed of ten thousand individual zombies; a small botnet, on the other hand, may be composed of only a thousand drones. Usually, the owners of the zombie computers do not know that their computers and their computers' resources are being remotely controlled and exploited by an individual or a group of malware runners through Internet Relay Chat (IRC).

There are various types of malicious bots that have already infected, and are continuing to infect, the internet. Some bots have their own spreaders (the script that lets them infect other computers, which is why some people dub botnets computer viruses), while some smaller types of bots do not have such capabilities.

Different Types of Bots

Here is a list of the most widely used bots on the internet today, along with their features and command sets.

XtremBot, Agobot, Forbot, Phatbot

These are currently the best-known bots, with more than 500 versions on the internet today. They are written in C++, are cross-platform, and their source code is released under the GPL. These bots can range from fairly simple to highly abstract, module-based designs. Because of this modular approach, adding commands or scanners to increase their efficiency in exploiting vulnerabilities is fairly easy. They can use the libpcap packet-sniffing library, NTFS Alternate Data Streams (ADS) and PCRE. Agobot is quite distinct in that it is the only bot that makes use of control protocols other than IRC.

UrXBot, SDBot, UrBot and RBot

Like the previous type of bot, these bots are published under the GPL, but unlike the bots mentioned above, they are less abstract in design and written in rudimentary C. Although their implementation is less varied and their design less sophisticated, these types of bots are well known and widely used on the internet.

GT-Bots and mIRC based bots

These bots have many versions on the internet, mainly because mIRC is one of the most widely used IRC clients for Windows. GT stands for Global Threat and is the common name for bots scripted using mIRC. GT-bots use the mIRC chat client to launch a set of binaries (mainly DLLs) and scripts; their scripts often have the file extension .mrc.

Malicious Uses of Botnets

Types of Botnet Attacks

Denial of Service Attacks

A botnet can be used as a distributed denial of service (DDoS) weapon. A botnet attacks a network or a computer system in order to disrupt service through loss of connectivity, consumption of the victim network's bandwidth, or overloading of the resources of the victim's computer system. Botnet attacks are also used to damage or take down a competitor's website.

Fast flux is a DNS technique used by botnets to hide phishing and malware delivery sites behind an ever-changing network of compromised hosts acting as proxies.

Any internet service can be targeted by botnets, for example by flooding a website with recursive HTTP or bulletin-board search queries. This mode of attack, in which higher-level protocols are used to increase the effects of an attack, is also termed spidering.

Spyware

Spyware is software that sends information about a user's activities to its creators: typically passwords, credit card numbers and other information that can be sold on the black market. Compromised machines located within a corporate network can be worth more to the bot herder, as they can often provide access to confidential information held within that company. There have been several targeted attacks on large corporations with the aim of stealing sensitive information; one such example is the Aurora botnet.

Adware

Adware exists to advertise some commercial entity, actively and without the user's permission or awareness, for example by replacing banner ads on web pages with those of another content provider.

Spamming and Traffic Monitoring

A botnet can also be used to take advantage of an infected computer's SOCKS proxy support (part of the TCP/IP protocol suite) for networking applications. After compromising a computer, the botnet commander can use the infected unit (a zombie), in conjunction with the other zombies in his botnet (robot network), to harvest email addresses or to send massive amounts of spam or phishing mail.

Moreover, a bot can also function as a packet sniffer to find and intercept sensitive data passing through the infected machine. Typical data these bots look for are usernames and passwords, which the botnet commander can use for personal gain. Data about a competitor's botnet installed on the same unit can also be mined, allowing the botnet commander to hijack that other botnet.

Access number replacement is where the botnet operator replaces the access numbers of a group of dial-up bots with a victim's phone number. Given enough bots partaking in this attack, the victim is consistently bombarded with phone calls attempting to connect to the internet. With very little to defend against this attack, most victims are forced into changing their phone numbers (land line, cell phone, etc.).

Keylogging and Mass Identity Theft

Encryption software on a victim's machine can deter most bots from harvesting any real information. Unfortunately, some bots have adapted to this by installing a keylogger program on infected machines. With a keylogger, the bot owner can use a filtering program to gather only the key sequences typed before or after interesting keywords like PayPal or Yahoo mail. This is one of the reasons behind the massive thefts of PayPal accounts over the past several years.

Bots can also be used as agents for mass identity theft. They do this through phishing: pretending to be a legitimate company in order to convince the user to submit personal information and passwords. Links in these phishing mails can also lead to fake PayPal, eBay or other websites that trick the user into typing in a username and password.

Botnet Spread

Botnets can also be used to spread other botnets across the network. They do this by convincing the user to download a program, which is then executed and propagated through FTP, HTTP or email.

Pay-Per-Click Systems Abuse

Botnets can be used for financial gain by automating clicks on a pay-per-click system. Compromised units can be used to click automatically on a site whenever a browser is activated. For this reason, botnets are also used to earn money from Google's AdSense and other affiliate programs, using zombies to artificially inflate the click counter of an advertisement.

How To Make a Proper Website: Complete Guide




A web standards checklist

The term web standards can mean different things to different people. For some, it is 'table-free sites', for others it is 'using valid code'. However, web standards are much broader than that. A site built to web standards should adhere to standards (HTML, XHTML, XML, CSS, XSLT, DOM, MathML, SVG etc) and pursue best practices (valid code, accessible code, semantically correct code, user-friendly URLs etc).

In other words, a site built to web standards should ideally be lean, clean, CSS-based, accessible, usable and search engine friendly.

About the checklist

This is not an uber-checklist. There are probably many items that could be added. More importantly, it should not be seen as a list of items that must be addressed on every site that you develop. It is simply a guide that can be used:

* to show the breadth of web standards
* as a handy tool for developers during the production phase of websites
* as an aid for developers who are interested in moving towards web standards

The checklist

1. Quality of code
1. Does the site use a correct Doctype?
2. Does the site use a Character set?
3. Does the site use Valid (X)HTML?
4. Does the site use Valid CSS?
5. Does the site use any CSS hacks?
6. Does the site use unnecessary classes or ids?
7. Is the code well structured?
8. Does the site have any broken links?
9. How does the site perform in terms of speed/page size?
10. Does the site have JavaScript errors?

2. Degree of separation between content and presentation
1. Does the site use CSS for all presentation aspects (fonts, colour, padding, borders etc)?
2. Are all decorative images in the CSS, or do they appear in the (X)HTML?

3. Accessibility for users
1. Are "alt" attributes used for all descriptive images?
2. Does the site use relative units rather than absolute units for text size?
3. Do any aspects of the layout break if font size is increased?
4. Does the site use visible skip menus?
5. Does the site use accessible forms?
6. Does the site use accessible tables?
7. Is there sufficient colour brightness/contrasts?
8. Is colour alone used for critical information?
9. Is there delayed responsiveness for dropdown menus (for users with reduced motor skills)?
10. Are all links descriptive (for blind users)?

4. Accessibility for devices
1. Does the site work acceptably across modern and older browsers?
2. Is the content accessible with CSS switched off or not supported?
3. Is the content accessible with images switched off or not supported?
4. Does the site work in text browsers such as Lynx?
5. Does the site work well when printed?
6. Does the site work well in Hand Held devices?
7. Does the site include detailed metadata?
8. Does the site work well in a range of browser window sizes?

5. Basic Usability
1. Is there a clear visual hierarchy?
2. Are heading levels easy to distinguish?
3. Does the site have easy to understand navigation?
4. Does the site use consistent navigation?
5. Does the site use consistent and appropriate language?
6. Does the site have a sitemap page and contact page? Are they easy to find?
7. For large sites, is there a search tool?
8. Is there a link to the home page on every page in the site?
9. Are links underlined?
10. Are visited links clearly defined with a unique colour?

6. Site management
1. Does the site have a meaningful and helpful 404 error page that works from any depth in the site?
2. Does the site use friendly URLs?
3. Do your URLs work without "www"?
4. Does the site have a favicon?

1. Quality of code

1.1 Does the site use a correct Doctype?
A doctype (short for 'document type declaration') informs the validator which version of (X)HTML you're using, and must appear at the very top of every web page. Doctypes are a key component of compliant web pages: your markup and CSS won't validate without them.
http://www.alistapart.com/articles/doctype/

More:
http://www.w3.org/QA/2002/04/valid-dtd-list.html
http://css.maxdesign.com.au/listamatic/about-boxmodel.htm
http://gutfeldt.ch/matthias/articles/doctypeswitch.html
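
As a quick illustration, here are two of the valid doctypes from the W3C list above; whichever you choose must be the very first thing in the file.

HTML 4.01 Strict:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">

XHTML 1.0 Strict:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">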

1.2 Does the site use a Character set?
If a user agent (e.g. a browser) is unable to detect the character encoding used in a web document, the user may be presented with unreadable text. This information is particularly important for those maintaining and extending a multilingual site, but declaring the character encoding of the document is important for anyone producing XHTML/HTML or CSS.
http://www.w3.org/International/tutorials/tutorial-char-enc/

More:
http://www.w3.org/International/O-charset.html
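
For example, the character encoding can be declared with a meta element inside the head (UTF-8 shown here as a common choice); ideally the server sends the same encoding in its Content-Type header:

<head>
  <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
  <title>Example page</title>
</head>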

1.3 Does the site use Valid (X)HTML?
Valid code will render faster than code with errors. Valid code will render better than invalid code. Browsers are becoming more standards compliant, and it is becoming increasingly necessary to write valid and standards compliant HTML.
http://www.maxdesign.com.au/presentation/sit2003/06.htm

More:
http://validator.w3.org/

1.4 Does the site use Valid CSS?
You need to make sure that there aren't any errors in either your HTML or your CSS, since mistakes in either place can result in botched document appearance.
http://www.meyerweb.com/eric/articles/webrev/199904.html

More:
http://jigsaw.w3.org/css-validator/

1.5 Does the site use any CSS hacks?
Basically, hacks come down to personal choice, the amount of knowledge you have of workarounds, and the specific design you are trying to achieve.
http://www.mail-archive.com/wsg@webstandardsgroup.org/msg05823.html

More:
http://css-discuss.incutio.com/?page=CssHack
http://css-discuss.incutio.com/?page=ToHackOrNotToHack
http://centricle.com/ref/css/filters/
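
As an illustration of what such a hack looks like, the well-known star-html filter relies on a parsing bug so that only IE6 and older apply the rule (#content is a placeholder id):

/* only IE6 and below match "* html", so this width applies only there */
* html #content { width: 500px; }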

1.6 Does the site use unnecessary classes or ids?
I've noticed that developers learning new skills often end up with good CSS but poor XHTML. Specifically, the HTML code tends to be full of unnecessary divs and ids. This results in fairly meaningless HTML and bloated style sheets.
http://www.clagnut.com/blog/228/
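
A small sketch of the problem described above, with hypothetical class names:

Bloated, meaningless markup:
<div class="heading"><span class="headingText">About us</span></div>

Leaner, semantic markup that needs no class at all:
<h2>About us</h2>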

1.7 Is the code well structured?
Semantically correct markup uses HTML elements for their given purpose. Well structured HTML has semantic meaning for a wide range of user agents (browsers without style sheets, text browsers, PDAs, search engines etc.)
http://www.maxdesign.com.au/presentation/benefits/index04.htm

More:
http://www.w3.org/2003/12/semantic-extractor.html
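
A brief sketch of the same content marked up presentationally and then semantically (class names are hypothetical):

Structure invisible to user agents:
<div class="title">News</div>
<span class="item">Product launched</span>

The same content, well structured:
<h2>News</h2>
<ul>
  <li>Product launched</li>
</ul>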

1.8 Does the site have any broken links?
Broken links can frustrate users and potentially drive customers away. Broken links can also keep search engines from properly indexing your site.

More:
http://validator.w3.org/checklink

1.9 How does the site perform in terms of speed/page size?
Don't make me wait... That's the message users give us in survey after survey. Even broadband users can suffer the slow-loading blues.
http://www.websiteoptimization.com/speed/

1.10 Does the site have JavaScript errors?
Internet Explorer for Windows allows you to turn on a debugger that will pop up a new window and let you know when there are JavaScript errors on your site. This is available under 'Internet Options' on the Advanced tab: uncheck 'Disable script debugging'.

2. Degree of separation between content and presentation

2.1 Does the site use CSS for all presentation aspects (fonts, colour, padding, borders etc)?
Use style sheets to control layout and presentation.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-style-sheets
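
For example, presentational markup such as a font element can be replaced by a rule in the style sheet (the "warning" class name is a hypothetical example):

Presentation mixed into the markup:
<p><font color="red" size="4">Out of stock</font></p>

Presentation moved to CSS:
<p class="warning">Out of stock</p>

/* in the style sheet */
.warning { color: red; font-size: 1.2em; }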

2.2 Are all decorative images in the CSS, or do they appear in the (X)HTML?
The aim for web developers is to remove all presentation from the html code, leaving it clean and semantically correct.
http://www.maxdesign.com.au/presentation/benefits/index07.htm
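
A minimal sketch, assuming a purely decorative image (the file name is hypothetical): instead of an img element in the markup, apply it as a CSS background:

h1 {
  background: url(flourish.gif) no-repeat left center;
  padding-left: 40px; /* make room for the image */
}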

3. Accessibility for users

3.1 Are "alt" attributes used for all descriptive images?
Provide a text equivalent for every non-text element.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-text-equivalent
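
For example (file names and text are hypothetical):

<!-- descriptive image: alt conveys the same information -->
<img src="sales-chart.gif" alt="Sales rose 20% between 2002 and 2003" />

<!-- purely decorative image: empty alt so screen readers skip it -->
<img src="corner.gif" alt="" />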

3.2 Does the site use relative units rather than absolute units for text size?
Use relative rather than absolute units in markup language attribute values and style sheet property values.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-relative-units

More:
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-relative-units
http://www.clagnut.com/blog/348/
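
A short sketch of the difference:

/* absolute: some browsers (notably IE) will not resize px text */
body { font-size: 12px; }

/* relative: scales with the user's preferred size */
body { font-size: 100%; }
h1   { font-size: 1.5em; }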

3.3 Do any aspects of the layout break if font size is increased?
Try this simple test. Look at your website in a browser that supports easy incrementation of font size. Now increase your browser's font size. And again. And again... Look at your site. Does the page layout still hold together? It is dangerous for developers to assume that everyone browses using default font sizes.

3.4 Does the site use visible skip menus?
A method shall be provided that permits users to skip repetitive navigation links.
http://www.section508.gov/index.cfm?FuseAction=Content&ID=12

Group related links, identify the group (for user agents), and, until user agents do so, provide a way to bypass the group.
http://www.w3.org/TR/WCAG10-TECHS/#tech-group-links

...blind visitors are not the only ones inconvenienced by too many links in a navigation area. Recall that a mobility-impaired person with poor adaptive technology might be stuck tabbing through that morass.
http://joeclark.org/book/sashay/serialization/Chapter08.html#h4-2020

More:
http://www.niehs.nih.gov/websmith/508/o.htm
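
A minimal sketch of a visible skip link as the first item in the page (the id and link text are a common convention, not a requirement):

<body>
  <a href="#content">Skip to main content</a>
  <!-- repetitive navigation links here -->
  <div id="content">
    <!-- main content starts here -->
  </div>
</body>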

3.5 Does the site use accessible forms?
Forms aren't the easiest of things to use for people with disabilities. Navigating around a page of written content is one thing; hopping between form fields and inputting information is another.
http://www.htmldog.com/guides/htmladvanced/forms/

More:
http://www.webstandards.org/learn/tutorials/accessible-forms/01-accessible-forms.html
http://www.accessify.com/tools-and-wizards/accessible-form-builder.asp
http://accessify.com/tutorials/better-accessible-forms.asp
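
A brief sketch of two basic techniques, explicit label/for associations and fieldset/legend grouping (field names and the action URL are hypothetical):

<form action="/subscribe" method="post">
  <label for="email">Email address</label>
  <input type="text" id="email" name="email" />

  <fieldset>
    <legend>Preferred format</legend>
    <label><input type="radio" name="format" value="html" /> HTML</label>
    <label><input type="radio" name="format" value="text" /> Plain text</label>
  </fieldset>

  <input type="submit" value="Subscribe" />
</form>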

3.6 Does the site use accessible tables?
For data tables, identify row and column headers... For data tables that have two or more logical levels of row or column headers, use markup to associate data cells and header cells.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-table-headers

More:
http://www.bcc.ctc.edu/webpublishing/ada/resources/tables.asp
http://www.accessify.com/tools-and-wizards/accessible-table-builder_step1.asp
http://www.webaim.org/techniques/tables/
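
A minimal sketch of an accessible data table, using caption, summary and scope (the data is made up):

<table summary="Opening hours for each day of the week">
  <caption>Opening hours</caption>
  <tr>
    <th scope="col">Day</th>
    <th scope="col">Hours</th>
  </tr>
  <tr>
    <th scope="row">Monday</th>
    <td>9am to 5pm</td>
  </tr>
</table>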

3.7 Is there sufficient colour brightness/contrasts?
Ensure that foreground and background colour combinations provide sufficient contrast when viewed by someone having colour deficits.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-colour-contrast

More:
http://www.juicystudio.com/services/colourcontrast.asp

3.8 Is colour alone used for critical information?
Ensure that all information conveyed with colour is also available without colour, for example from context or markup.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-colour-convey

There are basically three types of colour deficiency: Deuteranope (a form of red/green colour deficit), Protanope (another form of red/green colour deficit) and Tritanope (a blue/yellow deficit, which is very rare).

More:
http://colourfilter.wickline.org/
http://www.toledo-bend.com/colourblind/Ishihara.html
http://www.vischeck.com/vischeck/vischeckURL.php

3.9 Is there delayed responsiveness for dropdown menus?
Users with reduced motor skills may find dropdown menus hard to use if responsiveness is set too fast.

3.10 Are all links descriptive?
Link text should be meaningful enough to make sense when read out of context - either on its own or as part of a sequence of links. Link text should also be terse.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-meaningful-links
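
For example (the URL is hypothetical):

Meaningless out of context:
<a href="/standards.html">Click here</a> to read about web standards.

Meaningful on its own:
Read our <a href="/standards.html">introduction to web standards</a>.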

4. Accessibility for devices

4.1 Does the site work acceptably across modern and older browsers?
Before starting to build a CSS-based layout, you should decide which browsers to support and to what level you intend to support them.
http://www.maxdesign.com.au/presentation/process/index_step01.cfm

4.2 Is the content accessible with CSS switched off or not supported?
Some people may visit your site with either a browser that does not support CSS or a browser with CSS switched off. If content is structured well, this will not be an issue.

4.3 Is the content accessible with images switched off or not supported?
Some people browse websites with images switched off - especially people on very slow connections. Content should still be accessible for these people.

4.4 Does the site work in text browsers such as Lynx?
This is like a combination of images and CSS switched off. A text-based browser will rely on well structured content to provide meaning.

More:
http://www.delorie.com/web/lynxview

4.5 Does the site work well when printed?
You can take any (X)HTML document and simply style it for print, without having to touch the markup.
http://www.alistapart.com/articles/goingtoprint/

More:
http://www.d.umn.edu/itss/support/Training/Online/webdesign/css.html#print
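
A minimal sketch: serve a separate style sheet for print via the media attribute, and hide screen-only elements there (file names and the nav id are hypothetical):

<link rel="stylesheet" type="text/css" media="screen" href="screen.css" />
<link rel="stylesheet" type="text/css" media="print" href="print.css" />

/* print.css */
#nav { display: none; }                  /* no point printing navigation */
body { background: #fff; color: #000; }  /* printer-friendly colours */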

4.6 Does the site work well in Hand Held devices?
This is a hard one to deal with until hand-held devices consistently support their correct media type. However, some layouts work better on current hand-held devices than others. The importance of supporting hand-held devices will depend on your target audience.

4.7 Does the site include detailed metadata?
Metadata is machine-understandable information for the web.
http://www.w3.org/Metadata/

Metadata is structured information that is created specifically to describe another resource. In other words, metadata is 'data about data'.

4.8 Does the site work well in a range of browser window sizes?
It is a common assumption amongst developers that average screen sizes are increasing. Some developers assume that the average screen size is now 1024px wide. But what about users with smaller screens and users with hand held devices? Are they part of your target audience and are they being disadvantaged?

5. Basic Usability

5.1 Is there a clear visual hierarchy?
Organise and prioritise the contents of a page by using size, prominence and content relationships.
http://www.great-web-design-tips.com/web-site-design/165.html

5.2 Are heading levels easy to distinguish?
Use header elements to convey document structure and use them according to specification.
http://www.w3.org/TR/WCAG10/wai-pageauth.html#tech-logical-headings

5.3 Is the site's navigation easy to understand?
Your navigation system should give your visitor a clue as to what page of the site they are currently on and where they can go next.
http://www.1stsitefree.com/design_nav.htm

5.4 Is the site's navigation consistent?
If each page on your site has a consistent style of presentation, visitors will find it easier to navigate between pages and find information.
http://www.juicystudio.com/tutorial/accessibility/navigation.asp

5.5 Does the site use consistent and appropriate language?
The use of clear and simple language promotes effective communication. Trying too hard to come across as articulate can make text as difficult to read as poorly written grammar, especially if the language used isn't the visitor's primary language.
http://www.juicystudio.com/tutorial/accessibility/clear.asp

5.6 Does the site have a sitemap page and contact page? Are they easy to find?
Most site maps fail to convey multiple levels of the site's information architecture. In usability tests, users often overlook site maps or can't find them. Complexity is also a problem: a map should be a map, not a navigational challenge of its own.
http://www.useit.com/alertbox/20020106.html

5.7 For large sites, is there a search tool?
While search tools are not needed on smaller sites, and some people will not ever use them, site-specific search tools allow users a choice of navigation options.

5.8 Is there a link to the home page on every page in the site?
Some users like to go back to a site's home page after navigating to content within a site. The home page becomes a base camp for these users, allowing them to regroup before exploring new content.

5.9 Are links underlined?
To maximise the perceived affordance of clickability, colour and underline the link text. Users shouldn't have to guess or scrub the page to find out where they can click.
http://www.useit.com/alertbox/20040510.html

5.10 Are visited links clearly defined?
Most important, knowing which pages they've already visited frees users from unintentionally revisiting the same pages over and over again.
http://www.useit.com/alertbox/20040503.html

6. Site management

6.1 Does the site have a meaningful and helpful 404 error page that works from any depth in the site?
You've requested a page, either by typing a URL directly into the address bar or by clicking an out-of-date link, and you've found yourself in the middle of nowhere in cyberspace. A user-friendly website will give you a helping hand, while many others will simply do nothing, relying on the browser's built-in ability to explain what the problem is.
http://www.alistapart.com/articles/perfect404/
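
As one common approach, assuming an Apache server, a single directive in .htaccess serves the custom page; the leading slash makes the path root-relative so it works from any depth in the site (the file name is an example):

# .htaccess (Apache): serve a custom page for any missing URL
ErrorDocument 404 /404.html

For the same reason, links and images inside the 404 page itself should be root-relative, so the page renders correctly no matter where the broken URL was.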

6.2 Does the site use friendly URLs?
Most search engines (with a few exceptions, namely Google) will not index any pages that have a question mark or other character (like an ampersand or equals sign) in the URL... what good is a site if no one can find it?
http://www.sitepoint.com/article/search-engine-friendly-urls

One of the worst elements of the web from a user interface standpoint is the URL. However, if they're short, logical, and self-correcting, URLs can be acceptably usable.
http://www.merges.net/theory/20010305.html

More:
http://www.sitepoint.com/article/search-engine-friendly-urls
http://www.websitegoodies.com/article/32
http://www.merges.net/theory/20010305.html
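
For instance, compare a query-string URL with a friendlier equivalent (example.com is a placeholder):

Unfriendly: http://www.example.com/article.php?id=72&cat=3
Friendly:   http://www.example.com/articles/web-standards/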

6.3 Does the site's URL work without "www"?
While this is not critical, and in some cases is not even possible, it is always good to give people the choice of both options. If a user types your domain name without the www and gets no site, this could disadvantage both the user and you.
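
One common way to provide both options, assuming Apache with mod_rewrite enabled, is to redirect the bare domain to the www version (substitute your own domain for example.com):

# .htaccess (Apache): redirect example.com to www.example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]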

6.4 Does the site have a favicon?
A favicon is a multi-resolution image included on nearly all professionally developed sites. The favicon allows the webmaster to further promote their site, and to create a more customized appearance within a visitor's browser.
http://www.favicon.com/

Favicons are definitely not critical. However, if they are not present, they can cause 404 errors in your logs (site statistics). Browsers like IE will request them from the server when a site is bookmarked. If a favicon isn't available, a 404 error may be generated. Therefore, having a favicon could cut down on favicon specific 404 errors. The same is true of a 'robots.txt' file.
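
A minimal sketch: place favicon.ico in the site root, where browsers look for it by default, and optionally point to it explicitly in the head:

<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />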