Web Traffic

Web traffic is the amount of data sent and received by visitors to a web site, and does not necessarily include traffic generated by bots. Since the mid-1990s, web traffic has been the largest portion of Internet traffic.[1] It is determined by the number of visitors and the number of pages they visit. Sites monitor incoming and outgoing traffic to see which parts or pages of the site are popular and whether there are any apparent trends, such as one specific page being viewed mostly by people in a particular country. There are many ways to monitor this traffic, and the gathered data is used to help structure sites, highlight security problems or indicate a potential lack of bandwidth.

Not all web traffic is welcome. Some companies offer advertising schemes that, in return for increased web traffic (visitors), pay for screen space on the site. There is also "fake traffic": bot traffic generated by a third party. This type of traffic can damage a website's reputation, its visibility on Google, and its overall domain authority.

Sites also often aim to increase their web traffic through inclusion on search engines and through search engine optimization.

Analysis

Web analytics is the measurement of the behavior of visitors to a website. In a commercial context, it especially refers to the measurement of which aspects of the website work towards the business objectives of Internet marketing initiatives; for example, which landing pages encourage people to make a purchase. Notable vendors of web analytics software and services include Google Analytics, IBM Digital Analytics (formerly Coremetrics) and Adobe Omniture.

Measurement

Example graph of web traffic at Wikipedia in December 2004

Web traffic is measured to see the popularity of web sites and individual pages or sections within a site. This can be done by viewing the traffic statistics found in the web server log file, an automatically generated list of all the pages served. A hit is generated when any file is served. The page itself is considered a file, but images are also files, thus a page with 5 images could generate 6 hits (the 5 images and the page itself). A page view is generated when a visitor requests any page within the web site - a visitor will always generate at least one page view (the main page) but could generate many more. Tracking applications external to the web site can record traffic by inserting a small piece of HTML code in every page of the web site.[2]
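The distinction above between hits and page views can be illustrated with a short script that tallies both from server log entries. This is a minimal sketch assuming a simplified common-log-style line format; the sample entries and the list of "page" extensions are hypothetical:

```python
import re
from collections import Counter

# Hypothetical access-log lines in a simplified common-log-style format.
LOG_LINES = [
    '1.2.3.4 - - [10/Dec/2004:10:00:00] "GET /index.html HTTP/1.1" 200 512',
    '1.2.3.4 - - [10/Dec/2004:10:00:01] "GET /logo.png HTTP/1.1" 200 1024',
    '5.6.7.8 - - [10/Dec/2004:10:05:00] "GET /about.html HTTP/1.1" 200 256',
]

# What counts as a "page" rather than an embedded asset (assumption).
PAGE_EXTENSIONS = ('.html', '.htm', '/')

def count_traffic(lines):
    hits = 0
    page_views = Counter()
    for line in lines:
        match = re.search(r'"GET (\S+) HTTP', line)
        if not match:
            continue
        hits += 1                      # every served file is a hit
        path = match.group(1)
        if path.endswith(PAGE_EXTENSIONS):
            page_views[path] += 1      # only pages count as page views
    return hits, page_views

hits, views = count_traffic(LOG_LINES)
print(hits)                  # 3 hits: two pages plus one image
print(sum(views.values()))   # 2 page views
```

This reproduces the article's example in miniature: a page plus its image produces two hits but a single page view.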

Web traffic is also sometimes measured by packet sniffing, which yields random samples of traffic data from which information about web traffic as a whole, across total Internet usage, can be extrapolated.

The following types of information are often collated when monitoring web traffic: [3]

  • The number of visitors.
  • The average number of page views per visitor - a high number indicates that average visitors go deep inside the site, possibly because they like it or find it useful.
  • Average visit duration - the total length of a user's visit. As a rule, the more time visitors spend on a site, the more interested they are in the company and the more likely they are to make contact.
  • Average page duration - how long a page is viewed for. The longer a page is viewed, the more engaged the visitor is likely to be with its content.
  • Domain classes - all levels of the IP addressing information required to deliver web pages and content.
  • Busy times - the most popular viewing times of the site show when the best time would be to run promotional campaigns and when the most ideal time would be to perform maintenance.
  • Most requested pages - the most popular pages
  • Most requested entry pages - the entry page is the first page viewed by a visitor and shows which are the pages most attracting visitors
  • Most requested exit pages - the most requested exit pages could help find bad pages, broken links or the exit pages may have a popular external link
  • Top paths - a path is the sequence of pages viewed by visitors from entry to exit, with the top paths identifying the way most customers go through the site
  • Referrers - the host can track the (apparent) sources of the links and determine which sites are generating the most traffic for a particular page.
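Several of the metrics above (visitor count, average page views per visitor, most requested entry and exit pages) can be derived directly from per-visitor page sequences. A minimal sketch with made-up session data:

```python
from collections import Counter

# Hypothetical sessions: visitor id -> ordered list of pages viewed.
sessions = {
    "visitor_a": ["/home", "/products", "/contact"],
    "visitor_b": ["/home", "/products"],
    "visitor_c": ["/blog"],
}

visitors = len(sessions)
avg_page_views = sum(len(pages) for pages in sessions.values()) / visitors

# Entry page is the first page of each visit; exit page is the last.
entry_pages = Counter(pages[0] for pages in sessions.values())
exit_pages = Counter(pages[-1] for pages in sessions.values())

print(visitors)                    # 3 visitors
print(avg_page_views)              # 2.0 page views per visitor
print(entry_pages.most_common(1))  # [('/home', 2)] -> top entry page
print(exit_pages.most_common(1))
```

Top paths could be computed the same way, by counting each full page sequence as a tuple.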

Web sites produce traffic rankings and statistics based on those people who access the sites while using their toolbars and other means of online measurement. The difficulty with this is that it does not look at the complete traffic picture for a site. Large sites usually hire the services of companies such as Nielsen NetRatings or Quantcast, but their reports are available only by subscription.

Control

The amount of traffic seen by a web site is a measure of its popularity. By analysing the statistics of visitors it is possible to see shortcomings of the site and look to improve those areas. It is also possible to increase the popularity of a site and the number of people that visit it.

Limiting access

It is sometimes important to protect some parts of a site by password, allowing only authorized people to visit particular sections or pages.

Some site administrators have chosen to block their page to specific traffic, such as by geographic location. The re-election campaign site for U.S. President George W. Bush (GeorgeWBush.com) was blocked to all internet users outside of the U.S. on 25 October 2004 after a reported attack on the site.[4]

It is also possible to limit access to a web server both based on the number of connections and by the bandwidth expended by each connection. On Apache HTTP servers, this is accomplished by the limitipconn module and others.
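As a sketch, the limitipconn module mentioned above is typically configured with a per-location cap on simultaneous connections. The fragment below uses the mod_limitipconn directives as commonly documented; directive names and values should be checked against the installed module version before use:

```apache
# Hypothetical Apache httpd configuration fragment (mod_limitipconn);
# verify directive names against your module's documentation before use.
ExtendedStatus On              # the module relies on full status information
<IfModule mod_limitipconn.c>
    <Location /downloads>
        MaxConnPerIP 3         # at most 3 simultaneous connections per client IP
        NoIPLimit image/*      # exempt small assets such as images
    </Location>
</IfModule>
```

Bandwidth caps per connection would be handled by a separate module, such as a bandwidth-throttling module, rather than by limitipconn itself.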

From search engines

The majority of website traffic is driven by search engines. Millions of people use search engines every day to research various topics, buy products, and go about their daily surfing activities. Search engines use keywords to help users find relevant information, and each of the major search engines has developed a unique algorithm to determine where websites are placed within the search results. When a user clicks on one of the listings in the search results, they are directed to the corresponding website and data is transferred from the website's server, thus counting the visit towards the overall flow of traffic to that website.

Search engine optimization (SEO) is the ongoing practice of optimizing a website to help improve its rankings in the search engines. Several internal and external factors are involved which can help improve a site's listing within the search engines. The higher a site ranks within the search engines for a particular keyword, the more traffic it will receive.

Increasing traffic

Web traffic can be increased by placement of a site in search engines and purchase of advertising, including bulk e-mail, pop-up ads, and in-page advertisements. Web traffic can also be increased by purchasing it from web traffic providers or through non-internet-based advertising.

Web traffic can be increased not only by attracting more visitors to a site, but also by encouraging individual visitors to "linger" on the site, viewing many pages in a visit (see Outbrain for an example of this practice).

If a web page is not listed in the first pages of any search, the odds of someone finding it diminish greatly (especially if there is other competition on the first page). Very few people go past the first page, and the percentage that go to subsequent pages is substantially lower. Consequently, getting proper placement on search engines, a practice known as SEO, is as important as the web site itself.

Traffic overload

Too much web traffic can dramatically slow down or prevent all access to a web site. This is caused by more file requests going to the server than it can handle and may be an intentional attack on the site or simply caused by over-popularity. Large scale web sites with numerous servers can often cope with the traffic required, and it is more likely that smaller services are affected by traffic overload. A sudden traffic load may also hang the server or result in a shutdown of its services.

Denial of service attacks

Denial-of-service attacks (DoS attacks) have forced web sites to close after a malicious attack, flooding the site with more requests than it could cope with. Viruses have also been used to co-ordinate large scale distributed denial-of-service attacks.[5]

Sudden popularity

A sudden burst of publicity may accidentally cause a web traffic overload. A news item in the media, a quickly propagating email, or a link from a popular site may cause such a boost in visitors (sometimes called a flash crowd or the Slashdot effect).

Overall worldwide

According to Mozilla, since January 2017 more than half of Web traffic has been encrypted with HTTPS.[6][7]

According to estimates cited by the Interactive Advertising Bureau in 2014, around one third of Web traffic is generated by Internet bots and malware.[8][9]

References

  1. ^ Jeffay, Kevin. "Tracking the Evolution of Web Traffic: 1995-2003*" (PDF). UNC DiRT Group's Publications. University of North Carolina at Chapel Hill. 
  2. ^ Malacinski, Andrei; Dominick, Scott; Hartrick, Tom (1 March 2001). "Measuring Web Traffic". IBM. Retrieved 2011. 
  3. ^ "Web Analytics Definitions" (PDF). Web Analytics Association. 22 September 2008. Retrieved 2015. 
  4. ^ Miller, Rich (2004-10-26). "Bush Campaign Web Site Rejects Non-US Visitors". 
  5. ^ "Denial of Service". Cert.org. Retrieved 2012. 
  6. ^ "We're Halfway to Encrypting the Entire Web". Electronic Frontier Foundation. 21 February 2017. Retrieved 2017. 
  7. ^ Finley, Klint. "Half the Web Is Now Encrypted. That Makes Everyone Safer". WIRED. Retrieved 2017. 
  8. ^ Vranica, Suzanne (23 March 2014). "A 'Crisis' in Online Ads: One-Third of Traffic Is Bogus". Wall Street Journal. Retrieved 2017. 
  9. ^ "36% Of All Web Traffic Is Fake". Business Insider. Retrieved 2017. 

  This article uses material from the Wikipedia page available here. It is released under the Creative Commons Attribution-Share-Alike License 3.0.

