Why Net Neutrality Should Be Law

Net neutrality has recently emerged as a contentious Internet and communications policy issue, the point of contention being whether openness should be dictated entirely by legislation and regulation or left to the Internet service providers (ISPs). Laws and regulations enforcing net neutrality are feared to weaken the competitive vibrancy of the content, application, and device components of the Internet, but, on the other hand, they would have the effect of advantaging non-latency-sensitive Internet innovations over latency-sensitive ones such as voice and video. Logically, network neutrality regulation would not confine itself to just physical broadband networks, but rather extend interoperability, access, and “openness” mandates to all types of applications – VoIP services, IM services, social networking, search engines, and online commerce. The logical progression of net neutrality regulation would be an all-encompassing Internet-regulation regime, extending to both price and content.

Moreover, network and Internet service providers regard neutrality as a technical principle rather than a rule, and argue that Internet companies should not get a free ride. On this view, operators must have the right to charge so as to earn a fair return on network investments, neutrality legislation would strangle the Internet, net traffic is already segmented, and there is no risk that operators will block or degrade content.


The notion of preserving “openness” on the Internet remains the rationale on which imposing net neutrality mandates rests. Net neutrality proponents often frame their arguments in terms of “open” versus “closed” networks, and they warn of the dangers of broadband service providers using their transmission facilities to control the applications or services that run over their “pipes.” They insist that operators should not be allowed to discriminate among or prioritize traffic according to its source or owner, since prioritization means that traffic from some sites will be degraded. They also argue that operator charges could shut out new Internet companies, and they view competition as not yet sufficient to allow market forces to decide.

Both qualitative and quantitative research techniques, employing questionnaires and interview schedules, were used to collect data. Interview questionnaires helped collect quantitative data from key players in Internet service provider firms, while case studies of selected relevant countries provided qualitative data. These included a case study of the United States debates on net neutrality; legislative ventures by the United Kingdom, the European Union, and the United States provided information about possible avenues for government involvement in legislating net neutrality regulations. Triangulation approaches to data analysis helped capture all aspects of the data, resulting in critical findings relevant to the development of net neutrality. Individual questionnaires were administered based on the following variables: respondents conversant with policy issues in the communications industry; respondents with knowledge of Internet basics and network neutrality; respondents working in diverse institutions (computer departments, ISPs, and other categories); respondents from institutions affiliated with an Internet service provider; respondents using the Internet in their respective work environments; and respondents supporting key legislative issues regarding net neutrality. These variables were used to establish relationships between the different emerging issues associated with net neutrality across different backgrounds.

Introduction

“Network Neutrality is the principle that Internet users should be able to access any web content they choose and use any applications they choose, without restrictions or limitations imposed by their Internet service provider” (Bocache, Mikheyev, & Paque, 2007). Among the various interpretations, the most extreme understanding entails that a neutral network transports data packets without prioritizing any of them, not even on the basis of the type of application to which they belong (Eldefer, 2008).

From social and political filtering of content, through geo-location software that tailors content to particular legal jurisdictions, to alternative domain-name systems and non-Roman-alphabet domain names, the Internet often looks quite different to different people. While it is difficult to evaluate the long-term consequences of these trends, some might have a negative effect on users’ experience and the future development of the Internet (Farber & Katz, 2007). This study focuses on one such potentially troublesome area, namely, the Net Neutrality controversy.

In its most general meaning, Net Neutrality questions the right of Internet service providers (ISPs) and other network operators to deliver certain data packets faster than others based on the type of application, the source and nature of the content, and other criteria (Bocache, Mikheyev, & Paque, 2007). An absolutely neutral network is not technically feasible, especially in a system as complicated as the existing Internet. The task of ensuring data exchange on a global scale, and especially attempts to improve quality of service, has forced Internet engineers to introduce intelligence into the network in the form of “smart” routers and gateways (Bocache, Mikheyev, & Paque, 2007). However, as the Internet begins to play an ever-increasing role in the economy, and as the technical capabilities for identifying certain packets and treating them differently grow, new criteria for discriminating against certain types of traffic are suggested.
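To make the underlying mechanism concrete, the following is a minimal sketch, not drawn from the cited sources, of how a non-neutral router might classify and prioritize packets by application type. The port-to-priority table and priority classes are illustrative assumptions; a strictly neutral network would forward packets first-in, first-out regardless of application.

```python
import heapq

# Illustrative mapping from destination port to a priority class
# (lower number = forwarded first). A strictly neutral router would
# ignore this table and forward packets in arrival order.
PRIORITY_BY_PORT = {
    5060: 0,   # VoIP signalling (latency-sensitive)
    443: 1,    # web traffic
    25: 2,     # email (latency-tolerant)
}

def classify(packet):
    """Return a priority class for a packet based on its destination port."""
    return PRIORITY_BY_PORT.get(packet["dst_port"], 3)  # default: best effort

def forward_order(packets):
    """Return packets in the order a priority-queueing router would send them."""
    queue = [(classify(p), i, p) for i, p in enumerate(packets)]
    heapq.heapify(queue)
    return [p for _, _, p in (heapq.heappop(queue) for _ in range(len(queue)))]

if __name__ == "__main__":
    arrivals = [
        {"dst_port": 25,   "payload": "email"},
        {"dst_port": 5060, "payload": "voip"},
        {"dst_port": 443,  "payload": "web"},
    ]
    print([p["payload"] for p in forward_order(arrivals)])  # ['voip', 'web', 'email']
```

The point of the toy example is simply that once such a classifier exists inside the network, the choice of which traffic to favor becomes a policy decision rather than a purely technical one.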

In the long run, some of these new developments may have a negative effect on the evolution of the Internet and on the end-user experience, especially in developing countries. Debates on Net Neutrality have recently come to the forefront of Internet governance discussions all over the world, and especially in the United States (Bocache, Mikheyev, & Paque, 2007). As with any Internet-related discussion in the US, this one may have considerable global consequences. As will be discussed further, decisions with regard to Net Neutrality made today can have long-term effects on the evolution and use of the Internet at large. Hence, these decisions clearly fall within the scope of Internet governance as framed by the question “should net neutrality be law?”, which entails “the development and application by Governments, the private sector and civil society, in their respective roles, of shared principles, norms, rules, decision-making procedures, and programs that shape the evolution and use of the Internet” (Bocache, Mikheyev, & Paque, 2007).

 

Further, the issue of Net Neutrality is of direct relevance to the mandate of the Internet Governance Forum/World Summit on the Information Society (IGF/WSIS) and the content of other debates held at the international level, as the issue is relevant to the broader area of “Openness/Access to Information” (Bocache, Mikheyev, & Paque, 2007).

The IGF also has a clear mandate to attend to issues that affect the sustainability, development, and equitable distribution of Internet resources as described in WSIS documents (Eldefer, 2008). Among other things, this involves the “development of strategies for increasing affordable global connectivity, thereby facilitating improved and equitable access for all,” including pricing schemes for interconnection costs “oriented towards objective, transparent and non-discriminatory parameters” (Bocache, Mikheyev, & Paque, 2007). In addition, the outcome of the Net Neutrality debate can affect the pattern of innovation, competition, and investment in the provision of Internet access. Promoting an enabling environment in these areas is clearly indicated as one of the international community’s goals in the field of Internet governance (Bocache, Mikheyev, & Paque, 2007). The issues of Net Neutrality are thus clearly within the mandate of the leading international institutions involved in the Internet governance process.

The Net Neutrality debate is particularly important for developing countries, for several reasons.

•  Rising or prohibitive costs of Internet access, caused by a multi-tiered Internet, could slow the implementation of universal access.

•  Deliberate manipulation of technical factors such as speed can affect information flow, prioritizing some information over other information in hidden, almost subliminal layers (Eldefer, 2008).

•  Voice over Internet Protocol (VoIP) and other innovative applications offering new technologies at accessible costs have immediate applications for improving services that will directly help bridge the digital divide as they improve communications (Eldefer, 2008).

•  The importance of the issues of access, transparency, and innovation for developing countries makes addressing the issue of Net Neutrality a primary concern (Eldefer, 2008).

In summation, there are numerous questions regarding how Net Neutrality concerns should be addressed. Instituting protection for Net Neutrality is essentially a question of how the desired protections are to be accomplished. Should a political or legal solution be enacted at the national or international level? Can we trust an informal free-market solution that may develop on its own, or should legal and political means be used to enunciate this principle? Will market forces ensure the best outcome, whatever this may be? These are some of the questions to be addressed.


I hope that this report will help further the discussion of these important issues and, ultimately, decision-making with regard to them. I will analyze the issues of Net Neutrality with a particular focus on making it a legislative provision at the international level, and propose further steps to protect the developing countries’ interests. In order to achieve this goal, I address a number of pertinent issues in an attempt to further explore the topic.

Literature review

Net neutrality origins

According to the SavetheInternet.com Coalition, net neutrality’s origin can be traced back to the inception of the Internet. The coalition has commented: “Pioneers like Vinton Cerf and Sir Tim Berners-Lee, the inventor of the World Wide Web, always intended the Internet to be a neutral network,” and it points to non-discriminatory provisions, such as the Net Neutrality policies that have governed the country’s communications networks since the 1930s. The coalition narrates that the campaign for net neutrality started after a 2005 decision by the Federal Communications Commission, stating that “...as a consequence of a 2005 decision by the FCC, Net Neutrality, the foundation of the free and open Internet, was put in jeopardy.”

China is among the governments outside the United States that restrict access to Internet content deemed inappropriate or not in agreement with government policy. Restrictions usually range from limiting access to particular Internet content all the way up to restricting access to the Internet itself. Since Internet usage can be monitored more easily in public cyber cafés, private access points to the Internet may be discouraged. Techniques may be utilized that limit access to domains, web pages, or search words and phrases. There are also instances of ISPs being required to sign a pledge to apply certain filters or restrict usage of the Internet. The effect of restricting and filtering Internet portals is difficult to measure, but it is well established that it does occur (Castells, 2009).

As a result of concerns from consumer advocates that Internet providers might ban or degrade services that compete with their own offerings, such as television shows delivered over the Web, Mr. Genachowski, the chairman of the Federal Communications Commission, outlined a proposal to add a fifth principle that would prevent Internet operators from discriminating against certain services or applications (Eldefer, 2008). He also proposed that the rules explicitly govern any Internet service, even if delivered over wireless networks.


In an attempt to keep President Obama’s campaign promises, Julius Genachowski proposed that the agency expand and formalize rules to keep Internet providers from discriminating against certain content flowing over their networks. In 2005, the commission adopted four broad principles relating to the idea of network neutrality as part of a move to deregulate the Internet services provided by telephone companies (Palmer, Hjelmvik, Ranum, & Berghel, 2009). Those principles declared that consumers had the right to use the content, applications, services, and devices of their choice on the Internet. They also promoted competition between Internet providers. Most significantly, he proposed that the net neutrality principles be formally adopted as commission rules (Hansell, 2009).

Net Neutrality as a Political Issue

In the US, the current paradigm for Internet regulation seemed to reinforce a consolidation of power and influence by companies that owned both Internet service providers and media outlets (Powell, 2009). Although network neutrality has been the standard practice of network providers, up to 2007 there was no law requiring it. In 2006, it came up as an issue in the United States when a grassroots response, led by the SaveTheInternet rights-based group, resulted in opposition to telecom legislation over how it addressed network neutrality. There was also a bill in 2007 to enshrine neutrality in law, and on July 22nd, Sen. Dick Durbin announced a participatory project to suggest ideas for, and revisions to, a national broadband policy bill. The FCC also opened an inquiry into the need to enforce network neutrality (Powell, 2009).

In June 2007, the Federal Trade Commission released a report christened “Broadband Connectivity Competition Policy,” in which it dismissed the importance of immediate net neutrality restrictions (Palmer, Hjelmvik, Ranum, & Berghel, 2009). Even though the report has limited legal implications, it indicates that the FTC may not be a supporter of net neutrality in the future (Powell, 2009). The FTC commented that it had yet to see any abuses by the telecom companies, and that it was seeing more, not less, competition in broadband markets. Although the FTC said it would continue watching the issue, its considerations included: the demand expected from content and application providers for data prioritization; the feasibility of effective data prioritization throughout the many networks comprising the Internet; whether allowing broadband providers to practice data prioritization could result in the degradation of non-prioritized data delivery; whether the capacity limitations of the networks comprising the Internet would result in unmanageable or unacceptable levels of congestion; and the efficient response thereto – data prioritization, capacity increases, a combination of these, or some as yet unknown technological innovation (Eldefer, 2008).


Network forensics

Network Forensics was formed in 1990 and has been providing forensics support to law enforcement, solicitors, and corporate clients ever since (Eldefer, 2008). Control Risks recently acquired Network Forensics, which continues to provide the very best service in a range of forensic disciplines (Palmer, Hjelmvik, Ranum, & Berghel, 2009). Network forensics is defined as the use of scientifically proven techniques to collect, fuse, identify, examine, correlate, analyze, and document digital evidence from multiple, actively processing and transmitting digital sources for the purpose of uncovering facts related to the planned intent, or measured success, of unauthorized activities meant to disrupt, corrupt, or compromise system components, as well as providing information to assist in response to or recovery from these activities. Basically, network forensics is about monitoring network traffic, determining whether there is an anomaly in the traffic, and determining whether that anomaly constitutes an attack. If it is an attack, the nature of the attack is also determined. Some of the critical steps in forensics include traffic capture, preservation, visualization, and analysis of the results (Palmer, Hjelmvik, Ranum, & Berghel, 2009).
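As a rough illustration of the capture, preservation, and analysis steps described above, the sketch below uses the third-party scapy library (an assumption; it is not mentioned in the sources and requires packet-capture privileges). The per-source packet-count threshold and the notion of an "anomaly" are purely illustrative, not the methods of the firms cited.

```python
from collections import Counter

from scapy.all import IP, sniff, wrpcap  # requires scapy and capture privileges

PACKETS_PER_SOURCE_THRESHOLD = 1000  # illustrative threshold, not an industry value
packet_counts = Counter()
captured = []

def inspect(pkt):
    """Count packets per source address and keep a copy for later preservation."""
    captured.append(pkt)
    if IP in pkt:
        packet_counts[pkt[IP].src] += 1

# Capture a bounded number of packets, then preserve them and report anomalies.
sniff(prn=inspect, count=5000, store=False)
wrpcap("evidence.pcap", captured)  # preservation step: write the evidence to disk

for src, n in packet_counts.most_common(5):
    flag = "POSSIBLE ANOMALY" if n > PACKETS_PER_SOURCE_THRESHOLD else "ok"
    print(f"{src}: {n} packets ({flag})")
```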

The aims of network forensics differ somewhat depending on whether it is performed by law enforcement or by security operations. In either case, analysis of captured network traffic can include tasks such as reassembling transferred files, searching for keywords, and parsing human communication such as emails or chat sessions. Network forensics has long been a developing area of science, which has led to a lack of clarity regarding many emerging issues. Network forensics systems can be of two kinds (a minimal sketch of the second kind follows the list):

•  “Catch-it-as-you-can” systems, in which all packets passing through a certain traffic point are captured and written to storage, with analysis being done subsequently in batch mode (Ranum, 2002).

•  “Stop, look and listen” systems, in which each packet is analyzed in a rudimentary way in memory and only certain information is saved for future analysis (Ranum, 2002).
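The following toy sketch, with packets modelled as plain dictionaries (an illustrative assumption), shows the trade-off: a “stop, look and listen” system keeps only a small per-flow summary in memory, whereas a “catch-it-as-you-can” system would write every packet to storage for later batch analysis.

```python
from collections import defaultdict

# "Stop, look and listen": examine each packet in memory and retain only a
# small per-flow summary, instead of storing every packet for later batch
# analysis as a "catch-it-as-you-can" system would.
flow_summaries = defaultdict(lambda: {"packets": 0, "bytes": 0})

def listen(packet):
    key = (packet["src"], packet["dst"], packet["dst_port"])
    summary = flow_summaries[key]
    summary["packets"] += 1
    summary["bytes"] += packet["length"]
    # The packet itself is discarded; only the summary is retained.

if __name__ == "__main__":
    for pkt in [
        {"src": "10.0.0.1", "dst": "10.0.0.9", "dst_port": 80, "length": 1500},
        {"src": "10.0.0.1", "dst": "10.0.0.9", "dst_port": 80, "length": 400},
    ]:
        listen(pkt)
    print(dict(flow_summaries))
```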

Advanced forensic technology and skilled investigators are required to analyze electronic evidence and interpret the mass of data produced. Network Forensics has the systems and expertise required to get to the truth.


Areas of specialism include: Computer, E-discovery, Audio forensics, Transcription services, Video forensics, Court presentation, Document capture, Crime scene reconstruction, PDA analysis, Mobile phone analysis, Cell site analysis, and Audio/video playback (Eldefer, 2008).

Traceback Techniques

Denial-of-service attacks consume the resources of a remote host or network, thereby blocking or degrading service to legitimate users. They have so far remained among the hardest security problems to address because they are simple to implement, difficult to prevent, and very difficult to trace. In recent times, denial-of-service incidents have escalated in frequency, sophistication, and severity. Howard reports that between 1989 and 1995 the number of such attacks reported to the Computer Emergency Response Team (CERT) increased by 50% per year. More recently, a 1999 CSI/FBI survey reported that 32% of respondents had detected denial-of-service attacks directed against their sites (Savage, Wetherall, Karlin, & Anderson, 2001). Even more worrying, recent reports indicate that attackers have developed tools to coordinate distributed attacks from many separate sites.

Unfortunately, measures to deal with denial-of-service attacks have not advanced at the same pace. Most work in this area has focused on tolerating attacks by mitigating their effects on the victim (Savage, Wetherall, Karlin, & Anderson, 2001). This approach can provide an effective stopgap measure, but it neither eliminates the problem nor discourages attackers. The other option is to stop an attacker at the source by tracing attacks back toward their origin. A lasting solution is complicated by the potential use of indirection to “launder” the true causal origin of an attack (Savage, Wetherall, Karlin, & Anderson, 2001). Several programs are in place to try to solve such problems. Efforts to reduce the anonymity afforded by IP spoofing are briefly discussed as follows.

Ingress filtering

This is the elimination of the ability to forge source addresses. The approach is frequently called ingress filtering, and it involves configuring routers to block packets that arrive with illegitimate source addresses. It requires a router with sufficient power to examine the source address of every packet and sufficient knowledge to distinguish between legitimate and illegitimate addresses (Savage, Wetherall, Karlin, & Anderson, 2001). It is clear that wider use of ingress filtering would dramatically improve the Internet’s robustness to denial-of-service attacks. At the same time, it is prudent to assume that such a system will never be foolproof, and therefore traceback technologies continue to be important.
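As a minimal sketch of the idea (not taken from the cited paper), the check below accepts a packet only if its source address falls inside the prefix assigned to the access link; the prefix and addresses are hypothetical documentation values.

```python
import ipaddress

# Hypothetical example: an access router serving the customer prefix below
# should only ever see packets whose source address lies inside that prefix;
# anything else is presumed spoofed and dropped at the edge.
CUSTOMER_PREFIX = ipaddress.ip_network("203.0.113.0/24")  # documentation range

def ingress_permit(source_address: str) -> bool:
    """Return True if the packet's source address is legitimate for this link."""
    return ipaddress.ip_address(source_address) in CUSTOMER_PREFIX

for src in ["203.0.113.42", "198.51.100.7"]:
    action = "forward" if ingress_permit(src) else "drop (possible spoof)"
    print(f"{src}: {action}")
```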


In most existing traceback techniques, the mechanism is to start from the router closest to the victim and iteratively test its upstream links until it is determined which one is used to carry the attacker’s traffic (Bruce, 2005).

Ordinarily, the process is repeated on each upstream router in turn until the source has been reached. This technique assumes that an attack remains active until the completion of a trace and is therefore inappropriate for attacks that are detected after the fact, attacks that occur intermittently, or attacks that modulate their behavior in response to a traceback (Savage, Wetherall, Karlin, & Anderson, 2001). Two types of link-testing schemes are used: input debugging and controlled flooding.

Logging

Data mining techniques are used to determine the path that packets traverse, based on packets logged at key routers (Savage, Wetherall, Karlin, & Anderson, 2001). This approach is valuable in that it can trace an attack long after the attack has been completed.
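The following toy sketch, with invented router names and packet contents, illustrates the general idea: each router stores digests of the packets it forwards, and after the fact the victim queries each router's log to reconstruct the path. It is a simplified, assumption-laden model, not the logging scheme discussed in the cited paper.

```python
import hashlib

def digest(packet_bytes: bytes) -> str:
    """Compact fingerprint of a packet, stored instead of the full packet."""
    return hashlib.sha256(packet_bytes).hexdigest()

# Each router keeps a log of the digests of packets it has forwarded.
router_logs = {
    "edge-router": set(),
    "core-router": set(),
    "victim-router": set(),
}

def forward(router: str, packet_bytes: bytes) -> None:
    router_logs[router].add(digest(packet_bytes))

# The attack packet traverses all three routers on its way to the victim.
attack_packet = b"spoofed attack payload"
for hop in ["edge-router", "core-router", "victim-router"]:
    forward(hop, attack_packet)

# After the fact, query every router's log for the packet's digest.
path = [r for r, log in router_logs.items() if digest(attack_packet) in log]
print("Routers that saw the packet:", path)
```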

ICMP traceback

This is based on the use of explicit, router-generated ICMP traceback messages. The principal idea is for the router to sample, with low probability, one of the packets it is forwarding and copy its contents into a special ICMP Traceback message that includes information about the adjacent routers along the path to the destination (Savage, Wetherall, Karlin, & Anderson, 2001).
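A minimal sketch of the sampling step is shown below; the sampling probability, field names, and the decision to return the message rather than transmit it as actual ICMP are illustrative assumptions.

```python
import random

SAMPLING_PROBABILITY = 1 / 20000  # illustrative value; the proposal only requires a low probability

def maybe_emit_traceback(router_id: str, prev_hop: str, next_hop: str, packet: bytes):
    """With low probability, build a traceback record for the forwarded packet.

    A real implementation would send this as an ICMP message toward the
    packet's destination; here it is simply returned (or None) for illustration.
    """
    if random.random() < SAMPLING_PROBABILITY:
        return {
            "router": router_id,
            "previous_hop": prev_hop,
            "next_hop": next_hop,
            "packet_excerpt": packet[:28],  # enough to match against the attack traffic
        }
    return None
```

With enough attack packets flowing, the destination eventually collects traceback records from most routers on the path, which is why even a very low sampling probability suffices.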

Several disadvantages in the current design complicate its use: ICMP traffic is becoming increasingly differentiated and may itself be filtered in a network under attack; the ICMP Traceback message relies on an input-debugging capability that is not available in some router architectures; if only some of the routers participate, it seems difficult to positively “connect” traceback messages from participating routers separated by a non-participating router; and, finally, it requires a key distribution infrastructure to deal with the problem of attackers sending false ICMP Traceback messages (Savage, Wetherall, Karlin, & Anderson, 2001).

Non-conformity, service failures, and loss of critical income are some of the effects of poor quality. Several models have been put forward for estimating the running cost of poor quality in operating and business management settings. The cost of poor quality has largely been pursued through the application of various operational models relevant to the business environment (Barbara, De Souza, & Catunda, 2008). Quality has developed into a prerequisite parameter and is therefore no longer considered a significant differentiator in today’s intensely competitive business climate. However, the cost of poor quality is still considered a substantial measure for providing critical performance indicators.


The concept of quality as a parameter has been widely applied in the modern business environment. Common practice in the accounting fraternity proposes the use of profit and loss accounts presented alongside the final balance sheet (Barbara, De Souza, & Catunda, 2008). Quality largely affects the nature of the products and services on offer and their subsequent production, and this has an overarching effect on the provision of quality.

According to QIMPRO, “The sheer size of internal failure costs, external failure costs and appraisal costs indicate that cost of poor quality or chronic waste does not exist as a homogenous mass” (QIMPRO STANDARDS ORGANIZATION, 2005). The cost of poor quality usually occurs in definite segments attributable to particular causes. Ordinarily, these causes contribute unequally, with a few of them accounting for the bulk of the cost. It is, however, a common and recurring phenomenon that these costs are not reflected in the final accounting reports.

This is a misplaced assumption, since the majority of these costs are greater than the figure suggested by a face-value assessment. The majority of companies and institutions have a cost of poor quality in the range of 20% to 30% of total sales, or alternatively 25% to 40% of the total operating expense of the business (QIMPRO STANDARDS ORGANIZATION, 2005).
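As a simple worked illustration of those ranges, using entirely hypothetical sales and expense figures:

```python
# Hypothetical figures purely to illustrate the ranges quoted above.
total_sales = 10_000_000            # assumed annual sales
total_operating_expense = 8_000_000  # assumed annual operating expense

copq_from_sales = (0.20 * total_sales, 0.30 * total_sales)
copq_from_opex = (0.25 * total_operating_expense, 0.40 * total_operating_expense)

print(f"Cost of poor quality, 20-30% of sales: {copq_from_sales}")
print(f"Cost of poor quality, 25-40% of operating expense: {copq_from_opex}")
```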

Among the many factors leading to poor quality is a missing link between functional expertise and the existing organizational structure, which results in the omission of significant quality parameters that are critical in modern business settings. In normal settings, the functional structure is usually so detached from the overall organizational structure that there is a persistent gap in the manner in which the majority of activities are conducted. This creates flaws in the systems put in place to ensure and guarantee approved quality parameters, and it thereby creates avenues through which compliance rules are broken.

There is also an aspect related to quality planning, quality control, and quality improvement. Surveys carried out suggest that there are numerous flaws in the follow-up duties left to top management personnel, including operations management. Self-assessment tests indicate an increased tendency for top managers to institute stronger principles when it comes to planning concerns. However, when it comes to quality control, there is significant laxity that goes unnoticed over long periods. This creates fundamental flaws that become increasingly difficult to follow up on when the time comes to institute significant controls.


There is also a significant gap in the establishment of important policies that can be used to implement quality control procedures. Policy formulation is hence a critical step in ensuring the top quality of products and service deliverables, because policies provide provisions that can be instituted in an enterprise’s business environment to control how practices are carried out. The institution of incompetent policies formulated by a non-professional body casts significant doubt on the quality and level of service provision seen in the majority of institutional arrangements, because such policies promote non-professional practices in the services and products that are produced and that eventually end up in the distribution stream.

There is also an effect resulting from the practice of total quality management strategies, which are instituted through organizational practices, quality principles, employee fulfillment, and customer satisfaction (Barbara, De Souza, & Catunda, 2008). These components are very critical in the contemporary business environment, in which the practice of total quality management serves to elevate quality standards. There is still a huge gap in the promotion of effective total quality management practices, especially in small business environments.

   
